Replies: 3 comments 1 reply
-
I don't think it's technically possible to do this without cookies. Cookies are how the web shares state.
-
How about temporarily allowing all requests from an IP address that submitted a successful proof-of-work to bypass Anubis? For example: allow computers using a specific IP address to bypass Anubis for 48 hours, 1000 requests, or until 100 MB of data has been transferred, and then require another proof-of-work for the next request. Maybe further limited to a specific set of resources, so that, after being allowed to bypass Anubis for …
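The idea above could be sketched as an in-memory cache keyed by IP address, where each entry carries three budgets (time, request count, bytes transferred) and a fresh proof-of-work is required once any budget is exhausted. This is a hypothetical illustration, not how Anubis actually works; all names and limits are made up for the example:

```python
import time

class BypassCache:
    """Hypothetical IP-keyed bypass list for the proposal above.

    An entry is created after a successful proof-of-work and expires
    when any budget (TTL, request quota, or byte quota) runs out.
    """

    def __init__(self, ttl_seconds=48 * 3600, max_requests=1000,
                 max_bytes=100 * 1024 * 1024):
        self.ttl = ttl_seconds
        self.max_requests = max_requests
        self.max_bytes = max_bytes
        # ip -> [expires_at, requests_left, bytes_left]
        self._entries = {}

    def grant(self, ip, now=None):
        """Record a successful proof-of-work for this IP."""
        now = time.time() if now is None else now
        self._entries[ip] = [now + self.ttl, self.max_requests, self.max_bytes]

    def allow(self, ip, response_bytes, now=None):
        """Return True if this request may bypass the challenge."""
        now = time.time() if now is None else now
        entry = self._entries.get(ip)
        if entry is None:
            return False
        expires_at, requests_left, bytes_left = entry
        if now >= expires_at or requests_left <= 0 or bytes_left <= 0:
            # Budget exhausted: drop the entry so the next request
            # is challenged again.
            del self._entries[ip]
            return False
        entry[1] -= 1
        entry[2] -= response_bytes
        return True
```

One known weakness of this approach, which the maintainer's reply below touches on, is that many clients can share one IP (CGNAT), so a single scraper solving one challenge would unlock the bypass for everyone behind that address.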
-
I don't think that making this work without cookies is in scope. This is intentional: it makes it more difficult for current industrial scraping engines to handle Anubis. Based on insider knowledge, they either share cookies between all the sessions they are scraping with or don't save cookies at all. This means that Anubis will correctly challenge them at every point.
-
Perhaps paired with the no-JS implementation, a no-cookie option for passing the challenge would be nice. I hadn't whitelisted cookies with uMatrix on xeiaso.net and got stuck in a refresh loop.