Cloudflare Web Analytics
Posted: Mon May 05, 2025 1:15 am
What do the people of this forum think about Cloudflare Web Analytics?
Mæstro wrote: Sat Jun 21, 2025 3:55 pm
As an alternative, I think the best examples of captchas for ensuring all and only natural persons may access the site are those which appear on DuckDuckGo and Xcancel when JavaScript and cookies are disabled. The first requires selecting three pictures of ducks from a grid of nine. (I will update this post with an example when the site next demands it.) The other resembles a traditional site captcha in reading numbers from an image, where the numbers are a one-time password generated by means unknown to me. The only fault in them which I could imagine is that I do not know how they would treat the blind. Many sites have also lately implemented Anubis. While it still requires scripts and cookies, it is, at least, far less intrusive than Cloudflare.

Anubis isn't really a captcha system; it only makes it harder, and therefore more expensive, to visit your website. Harder in terms of resources, not in actual effort on the user's side. It's more there to prevent continuous scrapes of your pages, by making the computer that is scraping you do some hard math.
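Roughly, that hard math is a hashcash-style puzzle: cheap for the server to check, costly for the client to solve. A minimal sketch in Python (not Anubis's actual code; SHA-256 and the difficulty parameter here are just assumptions for illustration):

```python
import hashlib
import itertools

def solve(challenge: str, difficulty_bits: int = 20) -> int:
    """Search for a nonce whose SHA-256 hash has `difficulty_bits`
    leading zero bits. Costs the client real CPU time: negligible
    for one visitor, expensive for a scraper hitting every page."""
    target = 1 << (256 - difficulty_bits)  # hashes below this pass
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: str, nonce: int, difficulty_bits: int = 20) -> bool:
    """The server re-hashes exactly once, so checking is nearly free."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# e.g. verify("site-issued-string", solve("site-issued-string")) -> True
```

The asymmetry is the point: solving takes on the order of a million hash attempts at this difficulty, while verifying takes one.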
Mæstro wrote: Sun Jun 22, 2025 2:14 pm
Because I disable cookies by default, I can attest that Anubis denies rejected users access and is therefore a proper captcha, even if it requires no user input. It does not merely punish scrapers the way that an email server punishes spammers by tolling the client’s processor whenever a message is sent.

Cookies don't determine whether something is a captcha. Anubis blocks connections that couldn't execute the proof of work or that don't have cookie capabilities, regardless of whether they come from a bot or a human. But that doesn't make it a captcha; that is only a feature requirement. A captcha requires a form of human interaction, like a text you have to type in, or something as simple as clicking a check mark. The goal of a captcha is to determine whether something is a bot and, if set to do so, block it.
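In code, that feature requirement is just a gate (a hypothetical sketch, not Anubis's real logic; the token scheme and names are made up for illustration):

```python
import hashlib
import hmac

SECRET = b"server-side secret"  # hypothetical; a real server would rotate this

def issue_token(client_id: str) -> str:
    """Handed out in a cookie once the proof of work checks out."""
    return hmac.new(SECRET, client_id.encode(), hashlib.sha256).hexdigest()

def gate(client_id: str, cookie_token: str | None) -> str:
    """No valid cookie, no page -- bot or human alike."""
    if cookie_token is None:
        # A cookieless client can never present a token, so it is
        # sent back to the challenge page on every request.
        return "serve challenge page"
    if not hmac.compare_digest(cookie_token, issue_token(client_id)):
        return "serve challenge page"  # forged or stale token
    return "serve content"
```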
Verb sap. The full argument is as follows: Anubis blocks traffic from those whom it rejects rather than merely tolling them, its behaviour towards cookieless browsers serving as evidence of this.
Mæstro wrote: Mon Jun 23, 2025 10:21 am
‘a program that protects websites against bots by generating and grading tests that humans can pass but current computer programs cannot.’

This is also not the case with the proof-of-work test that Anubis presents, as computers are the only ones that can perform the calculations in a timely manner.
Crazyroostereye wrote: Mon Jun 23, 2025 11:23 am
Anubis doesn't [determine if a connection is from a human or a bot], nor does it try to determine if a connection is human or bot.

I think this is false, for Anubis itself and its official documentation state otherwise. Anubis’ official instance introduces itself to users as ‘making sure you’re not a bot’, as shown in the attached screenshot. The readme describes Anubis as a firewall and an alternative to Cloudflare, an unambiguous captcha (where ticking a box is the challenge), which is why I had brought it up in the first place. The developer’s explanation states that Anubis is intended to block scrapers, although it does so by testing features.
Crazyroostereye wrote: Mon Jun 23, 2025 11:23 am
[…] Computers are the only ones that can perform the calculations in a timely manner.

This is literally true, and when I spoke of vagueness in my post, this is what I had in mind. When cookies are blocked, Anubis says on the official instance that it ‘requires cookies [for] making sure you are a valid client’. ‘Client’ can refer just as well to the end user himself as to the computer which he uses to access the server. The user, not the computer, is reading the message, but Anubis tests the computer itself. None of the definitions I have found are pedantic enough to bother splitting this hair, so neither do I.
Crazyroostereye wrote: Mon Jun 23, 2025 11:23 am
The way Anubis works is by making hard-to-compute calculations that the computer has to perform, with the intent of increasing the computing bill of bot networks that access the website frequently.

It is true, from the developer’s comments, that Anubis uses the same technology as Hashcash, which is used (among other things) for punishing spambots, but, per the documentation above, Anubis infers from the results whether to deny bot networks access. The readme also mentions that legitimate web crawlers will fail to index sites wielding Anubis, which would not happen if it only tolled the user’s processor like some antispam measures.
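The distinction may be made concrete thus (an illustrative sketch only; both handlers are hypothetical):

```python
def serve(page: str) -> str:
    return f"200 OK: {page}"

def deny(page: str) -> str:
    return "403 Forbidden"

def toll_only(page: str, proof_ok: bool) -> str:
    """The antispam model: the client burned CPU time getting here,
    but the page is served whether or not the proof checks out."""
    return serve(page)

def gate(page: str, proof_ok: bool) -> str:
    """The Anubis model as described above: the result of the test
    decides admission, which is why a crawler that cannot solve it
    never sees the page at all."""
    return serve(page) if proof_ok else deny(page)
```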