[chore] remove nollamas middleware for now (after discussions with a security advisor) (#4433)
I'll keep this on a separate branch while I experiment with possible alternatives, but at the moment neither our own hacky implementation (especially) nor more popular ones like Anubis are looking too great on the deterrence front: https://github.com/eternal-flame-AD/pow-buster

Co-authored-by: tobi <tobi.smethurst@protonmail.com>
Reviewed-on: https://codeberg.org/superseriousbusiness/gotosocial/pulls/4433
Co-authored-by: kim <grufwub@gmail.com>
Co-committed-by: kim <grufwub@gmail.com>
parent 247733aef4
commit 6801ce299a

28 changed files with 207 additions and 1395 deletions
@@ -1338,40 +1338,3 @@ advanced-csp-extra-uris: []
 # Options: ["block", "allow", ""]
 # Default: ""
 advanced-header-filter-mode: ""
-
-# Bool. Enables a proof-of-work based deterrence against scrapers
-# on profile and status web pages. This will generate a unique but
-# deterministic challenge for each HTTP client to complete before
-# accessing the above mentioned endpoints, on success being given
-# a cookie that permits challenge-less access within a 1hr window.
-#
-# The outcome of this is that it should make scraping of these
-# endpoints economically unfeasible, while having a negligible
-# performance impact on your own instance.
-#
-# The downside is that it requires javascript to be enabled.
-#
-# For more details please check the documentation at:
-# https://docs.gotosocial.org/en/latest/advanced/scraper_deterrence
-#
-# Options: [true, false]
-# Default: true
-advanced-scraper-deterrence-enabled: false
-
-# Uint. Allows tweaking the difficulty of the proof-of-work algorithm
-# used in the scraper deterrence. This determines roughly how many hash
-# encode rounds we require the client to complete to find a solution.
-# Higher values will take longer to find solutions for, and vice-versa.
-#
-# The downside is that if your deterrence takes too long to solve,
-# it may deter some users from viewing your web status / profile page.
-# And conversely, the longer it takes for a solution to be found, the
-# more you'll be incurring increased CPU usage for scrapers, and possibly
-# even cause their operation to time out before completion.
-#
-# For more details please check the documentation at:
-# https://docs.gotosocial.org/en/latest/advanced/scraper_deterrence
-#
-# Examples: [50000, 100000, 500000]
-# Default: 100000
-advanced-scraper-deterrence-difficulty: 100000
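For context on what the removed options did, here is a minimal Go sketch of the general scheme the config comments describe: a challenge seed that is unique per client but deterministic within the hour, and a cookie that grants challenge-less access for that one-hour window. Every name here (challengeSeed, grantAccess, instanceSecret, the cookie name) is a hypothetical illustration, not the removed nollamas middleware's actual code.

package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"net/http"
	"time"
)

// instanceSecret is a stand-in for per-instance server-side state (hypothetical).
var instanceSecret = []byte("replace-with-per-instance-secret")

// challengeSeed derives a challenge that is unique per client but deterministic:
// the same fingerprint (e.g. IP + User-Agent) within the same hour maps to the
// same seed, so refreshing the page does not mint fresh work.
func challengeSeed(clientFingerprint string) string {
	hourBucket := time.Now().UTC().Truncate(time.Hour).Unix()
	mac := hmac.New(sha256.New, instanceSecret)
	fmt.Fprintf(mac, "%s:%d", clientFingerprint, hourBucket)
	return hex.EncodeToString(mac.Sum(nil))
}

// grantAccess issues the cookie that permits challenge-less access for the
// one-hour window mentioned in the removed config comment.
func grantAccess(w http.ResponseWriter, seed string) {
	mac := hmac.New(sha256.New, instanceSecret)
	mac.Write([]byte("ok:" + seed))
	http.SetCookie(w, &http.Cookie{
		Name:     "scraper-deterrence", // hypothetical cookie name
		Value:    hex.EncodeToString(mac.Sum(nil)),
		MaxAge:   int(time.Hour / time.Second),
		HttpOnly: true,
	})
}

func main() {
	fmt.Println("seed:", challengeSeed("203.0.113.7|Mozilla/5.0"))
}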
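And a hedged sketch of the client-side solve loop, to illustrate what the removed difficulty option roughly meant: its comment describes the value as the number of hash-encode rounds a client must complete, on average, to find a solution. The threshold scheme below is one common way to get that behaviour and is an assumption, not necessarily how the removed middleware measured work.

package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"math"
	"strconv"
	"time"
)

// solve brute-forces a nonce. With target = MaxUint64/difficulty, each hash
// lands below the target with probability roughly 1/difficulty, so the loop
// performs about `difficulty` SHA-256 rounds on average before returning.
func solve(seed string, difficulty uint64) uint64 {
	target := uint64(math.MaxUint64) / difficulty
	for nonce := uint64(0); ; nonce++ {
		sum := sha256.Sum256([]byte(seed + strconv.FormatUint(nonce, 10)))
		if binary.BigEndian.Uint64(sum[:8]) < target {
			return nonce
		}
	}
}

func main() {
	start := time.Now()
	nonce := solve("example-seed", 100000) // the removed option's default difficulty
	fmt.Printf("found nonce %d in %s\n", nonce, time.Since(start))
}

At the default difficulty of 100000, any solver that can do millions of SHA-256 hashes per second clears work like this in a small fraction of a second, which is roughly the deterrence-economics concern behind the linked pow-buster work and this removal.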