author:    Bryan Newbold <bnewbold@robocracy.org>  2020-08-19 23:00:34 -0700
committer: Bryan Newbold <bnewbold@robocracy.org>  2020-08-19 23:00:36 -0700
commit:    c6b33542398c933a6272586e6280f7026b63a124
tree:      f95b5eb848ea3eceff910cd44b172e86f0332c1b /python/fatcat_web/static/robots.deny_all.txt
parent:    5f282a6267182214080ca36bcec4da1755589b46
update robots.txt and sitemap.xml
- show minimal robots/sitemap if not in prod environment
- default to allow all in robots.txt; link to sitemap index files
- basic sitemap.xml without entity-level links
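The prod-vs-QA switch described above can be sketched as a small helper that picks which static robots.txt variant to serve. This is an illustrative assumption, not fatcat's actual code: the `robots.allow_all.txt` filename and the `env` parameter are hypothetical names.

```python
def robots_txt_file(env: str) -> str:
    """Pick which static robots.txt variant to serve for an environment.

    Sketch only (names are assumptions): in production, serve a permissive
    file that links to the sitemap index files; on any QA/dev instance,
    serve the deny-all file added by this commit.
    """
    if env == "prod":
        return "robots.allow_all.txt"
    return "robots.deny_all.txt"
```

In a Flask app such as fatcat_web, a helper like this would typically back a `/robots.txt` route that hands the chosen filename to `send_from_directory("static", ...)`.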
Diffstat (limited to 'python/fatcat_web/static/robots.deny_all.txt')
 python/fatcat_web/static/robots.deny_all.txt | 7 +++++++
 1 file changed, 7 insertions(+), 0 deletions(-)
diff --git a/python/fatcat_web/static/robots.deny_all.txt b/python/fatcat_web/static/robots.deny_all.txt
new file mode 100644
index 00000000..b88274b1
--- /dev/null
+++ b/python/fatcat_web/static/robots.deny_all.txt
@@ -0,0 +1,7 @@
+# Hello friends!
+
+# You have found a QA/development instance of the Fatcat catalog. The canonical
+# location is https://fatcat.wiki, please crawl and index that location instead.
+
+User-agent: *
+Disallow: /
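The effect of the two directives in the new file can be checked with Python's standard-library robots.txt parser: a bare `User-agent: *` plus `Disallow: /` blocks every path for every crawler, which is exactly what a QA/development instance wants. The example URL below is illustrative.

```python
from urllib.robotparser import RobotFileParser

# The two directives added by this commit (comment lines stripped).
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Every agent is denied every path on the QA host.
print(rp.can_fetch("Googlebot", "https://example.org/release/abc"))  # False
print(rp.can_fetch("SomeOtherBot", "https://example.org/"))          # False
```

This is why the canonical-location note in the file's comments matters: polite crawlers that honor these rules will skip the QA instance entirely and should index https://fatcat.wiki instead.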