mirror of https://github.com/superseriousbusiness/gotosocial.git, synced 2025-10-30 00:22:25 -05:00
add advanced-scraper-deterrence to example config
This commit is contained in:
parent
b5b889b2c1
commit
04c6016cf1
1 changed file with 19 additions and 0 deletions
@@ -1261,3 +1261,22 @@ advanced-csp-extra-uris: []
 # Options: ["block", "allow", ""]
 # Default: ""
 advanced-header-filter-mode: ""
+
+# Bool. Enables a proof-of-work based deterrence against scrapers
+# on profile and status web pages. This will generate a unique but
+# deterministic challenge for each HTTP client to complete before
+# accessing the above mentioned endpoints, on success being given
+# a cookie that permits challenge-less access within a 1hr window.
+#
+# The outcome of this is that it should make scraping of these
+# endpoints economically unfeasible, while having a negligible
+# performance impact on your own instance.
+#
+# The downside is that it requires javascript to be enabled.
+#
+# For more details please check the documentation at:
+# https://docs.gotosocial.org/en/latest/admin/scraper_deterrence
+#
+# Options: [true, false]
+# Default: true
+advanced-scraper-deterrence: false
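For context on what a proof-of-work deterrence like this does, below is a minimal hashcash-style sketch in Go. It is an illustration of the general technique only: the seed derivation, difficulty constant, and function names are assumptions, not GoToSocial's actual implementation; see the documentation linked in the config comment for the real details.

// Minimal hashcash-style proof-of-work sketch. Illustrative only:
// the seed, difficulty, and cookie behaviour here are assumptions,
// not GoToSocial's actual scraper-deterrence implementation.
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"math/bits"
	"strconv"
)

// difficulty is the number of leading zero bits required in the
// hash; higher values make the challenge more expensive to solve.
const difficulty = 16

// leadingZeroBits counts the leading zero bits of a 32-byte digest.
func leadingZeroBits(sum [32]byte) int {
	n := 0
	for i := 0; i < len(sum); i += 8 {
		v := binary.BigEndian.Uint64(sum[i : i+8])
		if v == 0 {
			n += 64
			continue
		}
		return n + bits.LeadingZeros64(v)
	}
	return n
}

// solve brute-forces a nonce such that SHA-256(seed || nonce) has
// at least `difficulty` leading zero bits. In the real deterrence
// flow this work happens in the browser, which is why the config
// comment notes that javascript must be enabled.
func solve(seed string) uint64 {
	for nonce := uint64(0); ; nonce++ {
		sum := sha256.Sum256([]byte(seed + strconv.FormatUint(nonce, 10)))
		if leadingZeroBits(sum) >= difficulty {
			return nonce
		}
	}
}

// verify is the cheap server-side check: a single hash per request.
// On success the server would set a cookie valid for roughly an
// hour, so the client is not re-challenged within that window.
func verify(seed string, nonce uint64) bool {
	sum := sha256.Sum256([]byte(seed + strconv.FormatUint(nonce, 10)))
	return leadingZeroBits(sum) >= difficulty
}

func main() {
	// Hypothetical seed: in practice it would be derived
	// deterministically per client, so each HTTP client gets a
	// unique but stable challenge, as the config comment describes.
	seed := "example-client-fingerprint"
	nonce := solve(seed)
	fmt.Printf("nonce=%d verified=%v\n", nonce, verify(seed, nonce))
}

The asymmetry is the point: the client must try on the order of 2^difficulty hashes to find a valid nonce, while the server spends one hash to verify it, which is what makes bulk scraping economically unfeasible at negligible cost to the instance itself.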