Commit 8b7114e6 authored by Alexander Pace's avatar Alexander Pace

Modified apache-config and robots.txt

The AWS containers were misconfigured so that every robot
imaginable was able to scrape and download whatever it wanted.
Pumping the brakes on that and will see what happens.
parent e378aa06
Pipeline #238795 passed with stages
in 17 minutes and 40 seconds
@@ -82,7 +82,7 @@ ServerName ${DJANGO_PRIMARY_FQDN}
Require all granted
-    Alias /robots.txt /home/gracedb/gracedb_project/static_root/robots.txt
+    Alias /robots.txt /app/gracedb_project/static_root/robots.txt
<Location /Shibboleth.sso>
SetHandler shib
# Custom robots.txt file. Modified by AEP 2021/06/03.
# Block everything from everyone:
#User-agent: *
#Disallow: /
# I would like the public events page to be public and searchable
# by search engines. That way, at least the SID's will show up.
# Everything else should be hidden. Documentation should be
# visible too.
User-agent: *
Disallow: /
Disallow: /api/
Disallow: /superevents/
Disallow: /events/
Disallow: /search/
Disallow: /alerts/
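The rules above can be sanity-checked with Python's stdlib `urllib.robotparser`. Note that for compliant crawlers the blanket `Disallow: /` already covers every path, so the more specific `Disallow` lines below it are effectively redundant. A minimal sketch (the bot name is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The rules from the committed robots.txt, fed to the parser directly
# instead of being fetched over HTTP.
rules = [
    "User-agent: *",
    "Disallow: /",
    "Disallow: /api/",
    "Disallow: /superevents/",
    "Disallow: /events/",
    "Disallow: /search/",
    "Disallow: /alerts/",
]

rp = RobotFileParser()
rp.parse(rules)

# With "Disallow: /" in place, every path is blocked for every agent.
print(rp.can_fetch("ExampleBot", "/api/"))          # False
print(rp.can_fetch("ExampleBot", "/superevents/"))  # False
print(rp.can_fetch("ExampleBot", "/"))              # False
```

To actually leave the public events page and documentation crawlable, as the comment intends, the blanket `Disallow: /` would need to be removed (or narrowed) so that only the listed prefixes are blocked.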