Block search engines via robots.txt

Prevents instances from being rate-limited due to senseless crawling
by search engines. Since there is no reason to index Nitter
instances, simply block all robots. Notably, this does *not* affect link
previews (e.g. in various chat software).
minus 2022-06-04 12:03:07 +02:00
parent 778c6c64cb
commit afbb700d2b
1 changed file with 2 additions and 0 deletions

public/robots.txt Normal file

@@ -0,0 +1,2 @@
User-agent: *
Disallow: /
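
As a quick sanity check (not part of the commit), Python's standard `urllib.robotparser` can confirm that these two lines disallow every path for every crawler; the sample path below is a hypothetical Nitter-style URL chosen for illustration:

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt added by this commit.
rules = ["User-agent: *", "Disallow: /"]

parser = RobotFileParser()
parser.parse(rules)

# Every user agent is denied access to every path on the instance.
print(parser.can_fetch("Googlebot", "/jack/status/20"))  # False
print(parser.can_fetch("*", "/"))                        # False
```

Well-behaved crawlers honor these rules before fetching any page, while clients generating link previews typically fetch the target URL directly and are unaffected.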