Replies: 2 comments
-
Hey, I understand your frustration, as I have had to deal with a lot of security software rejecting me too. I'm sorry this has happened, but I want to take a moment to clarify what Anubis is, why it exists, and why you may be encountering a false rejection.

You are seeing a message related to Anubis because the administrator of the website you're visiting has chosen to install it. Anubis is a tool designed to help website owners protect their services from being overwhelmed by aggressive, automated traffic, such as the kind used by some AI companies to scrape data. This type of high-volume traffic can cause servers to crash, which makes the website inaccessible for everyone. If the website is down because of aggressive scraping traffic, nobody can learn from it. Anubis is a compromise.

This project is not meant to punish users. It is meant to protect websites from the massive hordes of AI scrapers that have been going around and destroying open source communities, libraries, and other tools we all rely on. This is a defensive measure, and it's currently quite paranoid while I am working on refining the logic. This project is under active development, and I am continuously working on fingerprinting, behaviour heuristics, and other methods to reduce false positives.

That being said, no software is perfect. Software has bugs. Complicated software has complicated bugs. If you think Anubis is blocking you when you shouldn't be blocked, please contact the administrators of the service that set up Anubis for help. Anubis has some built-in heuristics, but the administrator manages the configuration and deployment of the service. They have access to their server logs and can determine why you were blocked (or issued a challenge) and adjust their configuration accordingly.

This project is a labour of love and I want it to be better than it is. That is why I have been working on making it better. I hope this helps give context into what Anubis is and why it exists.

Be well, Xe
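(For anyone curious what being "issued a challenge" means in practice: tools in this space typically ask the browser to solve a small proof-of-work puzzle, finding a nonce whose hash meets a difficulty target, which costs a human visitor a fraction of a second but adds up quickly for a scraper fetching thousands of pages. The Go sketch below only illustrates that general idea and is not Anubis's actual code; the challenge string, difficulty value, and function names are invented for the example.)

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"math/bits"
	"strconv"
)

// leadingZeroBits counts the leading zero bits of a SHA-256 digest.
func leadingZeroBits(sum [32]byte) int {
	count := 0
	for _, b := range sum {
		if b == 0 {
			count += 8
			continue
		}
		count += bits.LeadingZeros8(b)
		break
	}
	return count
}

// verify checks that SHA-256(challenge || nonce) has at least `difficulty`
// leading zero bits, i.e. that the client spent some CPU time finding it.
func verify(challenge string, nonce uint64, difficulty int) bool {
	sum := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
	return leadingZeroBits(sum) >= difficulty
}

// solve brute-forces a nonce; this is the work a visitor's browser would do.
func solve(challenge string, difficulty int) uint64 {
	for nonce := uint64(0); ; nonce++ {
		if verify(challenge, nonce, difficulty) {
			return nonce
		}
	}
}

func main() {
	const challenge = "example-challenge-token" // illustrative value, not a real token
	const difficulty = 16                       // cheap for one visit, costly at scraper scale

	nonce := solve(challenge, difficulty)
	sum := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
	fmt.Printf("nonce=%d hash=%s ok=%v\n", nonce, hex.EncodeToString(sum[:]), verify(challenge, nonce, difficulty))
}
```

Running it prints a valid nonce and its hash; raising the difficulty makes the search exponentially more expensive, which is the kind of knob an administrator tunes when a site is under heavy scraping.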
-
I think it's mostly the logic of selfish independence. If all these AI robots could find a way to pool their exploration of the Internet, so that a single crawl of a site could feed as many robots as possible at once, it seems to me that the problem would be considerably lessened. In any case, what I see is that most of these robots not only don't respect the robots.txt file, but also don't identify themselves via the "User-Agent" string. On the contrary, they try as hard as possible to pass themselves off as regular visitors, so that we can't even block them based on the "User-Agent" string. On this score alone, these robots (who seem to outnumber the ones that do identify themselves) deserve no consideration, and I will have no qualms about preventing them from visiting my websites, nor about helping others do the same.
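To make that point concrete, here is roughly what User-Agent filtering looks like when a crawler does announce itself. This is a minimal Go sketch, not production code; the blocklist is illustrative rather than exhaustive, and a bot that spoofs a browser User-Agent slips straight past it, which is exactly the behaviour described above.

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// Substrings seen in the User-Agent of crawlers that do identify themselves.
// Bots that pretend to be a regular browser are never caught by this check.
var blockedAgents = []string{"GPTBot", "CCBot", "ClaudeBot", "Bytespider"}

// blockKnownBots is simple User-Agent filtering middleware.
func blockKnownBots(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ua := r.UserAgent()
		for _, bot := range blockedAgents {
			if strings.Contains(ua, bot) {
				http.Error(w, "automated crawling is not permitted", http.StatusForbidden)
				return
			}
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello, human")
	})
	http.ListenAndServe(":8080", blockKnownBots(mux))
}
```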
-
Help me understand this: you are punishing the users because of the AI? How is this logic supposed to work? I can't access websites because of your project. Why do I need to pay the price for your stupidity and moronic project? Explain how this does any human any good. How does punishing humans do any service to other humans?