diff --git a/docs/admin/scraper_deterrence.md b/docs/admin/scraper_deterrence.md
index 51203379b..c1cdbbff8 100644
--- a/docs/admin/scraper_deterrence.md
+++ b/docs/admin/scraper_deterrence.md
@@ -7,4 +7,8 @@ Once a solution to this challenge has been provided, by refreshing the page with
 
 The outcomes of this, (when enabled), is that it should make scraping of your instance's profile / status pages economically unviable for automated data gathering (e.g. by AI companies, search engines). The only negative, is that it places a requirement on JavaScript being enabled for people to access your profile / status web views.
 
-This was heavily inspired by the great project that is [anubis](https://github.com/TecharoHQ/anubis), but ultimately we determined we could implement it ourselves with only the features we require, minimal code, and more granularity with our existing authorization / authentication procedures.
\ No newline at end of file
+This was heavily inspired by the great project that is [anubis], but ultimately we determined we could implement it ourselves with only the features we require, minimal code, and more granularity with our existing authorization / authentication procedures.
+
+The GoToSocial implementation of this scraper deterrence is still incredibly minimal, so if you're looking for more features or fine-grained control over your deterrence measures then by all means keep ours disabled and stand-up a service like [anubis] in front of your instance!
+
+[anubis]: https://github.com/TecharoHQ/anubis
\ No newline at end of file
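
The diff only documents the behaviour and does not show the challenge mechanics themselves. Since the feature is modelled on [anubis], the general shape is a browser-side proof-of-work flow: the server hands out a challenge, the visitor's JavaScript spends CPU finding a solution, and the page is then refreshed with that solution attached. The Go sketch below is purely illustrative of that cost asymmetry; the SHA-256 nonce search, the `difficulty` constant, and the function names are assumptions made for the example, not GoToSocial's actual implementation.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"strings"
)

// difficulty is the number of leading zero hex characters required in the
// solution hash. Each extra character multiplies the average client-side
// work by 16, while server-side verification stays at a single hash.
// (Illustrative value, not a GoToSocial setting.)
const difficulty = 4

// solve brute-forces a nonce such that sha256(seed + nonce) begins with
// `difficulty` zero characters. In the real feature this loop would run in
// the visitor's browser in JavaScript; doing it in Go here just shows the
// work a client has to pay per page view.
func solve(seed string) string {
	target := strings.Repeat("0", difficulty)
	for i := 0; ; i++ {
		nonce := strconv.Itoa(i)
		sum := sha256.Sum256([]byte(seed + nonce))
		if strings.HasPrefix(hex.EncodeToString(sum[:]), target) {
			return nonce
		}
	}
}

// verify is the cheap server-side check: one hash per submitted solution.
func verify(seed, nonce string) bool {
	sum := sha256.Sum256([]byte(seed + nonce))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	// A per-request challenge seed would normally be generated and tracked
	// by the server; a fixed string keeps this sketch self-contained.
	seed := "example-challenge-seed"
	nonce := solve(seed)
	fmt.Printf("solved challenge with nonce %s (valid: %v)\n", nonce, verify(seed, nonce))
}
```

Raising the difficulty makes every scraped page proportionally more expensive for the client while verification on the server remains a single hash, which is the "economically unviable" property the added documentation describes.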