• 1 Post
  • 9 Comments
Joined 2 years ago
Cake day: July 20th, 2023



  • Fair point. I do agree with the “click to execute the challenge” approach.

    For the terminal browser, the issue is more that it doesn’t respect web standards than that Anubis doesn’t work on it.

    As for old hardware, I agree that adding a delay could be a good idea if it weren’t so easy to circumvent. Bots would simply wait in the background and resume once the timer expires, which would vastly decrease Anubis’s effectiveness, since idling costs them almost nothing. There isn’t much that can be done here.

    As for the CUDA solution, that depends on the hash algorithm used. Some of them (like RandomX, the one used by Monero) are designed to be vastly less efficient on a GPU than on a CPU. Moreover, GPU servers are far more expensive to run than CPU ones, so the result would be the same: crawling becomes more expensive.
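    To make the cost asymmetry concrete, here is a minimal sketch of the proof-of-work idea behind tools like Anubis. This is not Anubis’s actual algorithm or parameters — the seed, the difficulty, and the use of SHA-256 here are purely illustrative. The point is that the client burns CPU searching for a nonce, while the server verifies it with a single hash.

```python
import hashlib
import itertools


def solve_challenge(seed: str, difficulty: int) -> int:
    """The costly part the client does: brute-force a nonce so that
    sha256(seed + nonce) starts with `difficulty` zero hex digits.
    Expected work grows by a factor of 16 per extra digit."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce


def verify(seed: str, difficulty: int, nonce: int) -> bool:
    """The cheap part the server does: a single hash computation."""
    digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)


nonce = solve_challenge("example-challenge", 3)
assert verify("example-challenge", 3, nonce)
```

    A single visitor barely notices the delay, but a crawler hitting millions of pages pays it millions of times.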

    In any case, by far the best solution would be to make respecting robots.txt a legal requirement, but for now legislators prefer to look the other way.
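    For reference, robots.txt is purely advisory today: nothing forces a crawler to consult it. A compliant crawler can do so with Python’s standard library — the user-agent names and rules below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
rules = """
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler checks before fetching; a hostile one just ignores this.
print(parser.can_fetch("ExampleAIBot", "/page"))       # False: blocked entirely
print(parser.can_fetch("SomeOtherBot", "/page"))       # True
print(parser.can_fetch("SomeOtherBot", "/private/x"))  # False
```

    Which is exactly the problem: the honor system only constrains the honorable.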


  • Whether they solve it or not doesn’t change the fact that they have to spend more resources on crawling, which is the objective here. Meanwhile, the website sees far less load than before it used Anubis. Either way, I see it as a win.

    But despite that, it has its detractors, like any solution that becomes popular.

    But let’s be honest, what are the arguments against it?
    It takes a bit longer to access the site the first time? Sure, but it’s not like you have to click or type anything.
    It executes foreign code on your machine? Literally 90% of the web does these days. Just disable JavaScript and see how many websites are still functional. I’d be surprised if even a handful are.

    The only ones who gain anything from the absence of Anubis are web crawlers, be they AI bots, indexing bots, or script kiddies scanning for a vulnerable target.






  • The same goes the other way: just because it doesn’t work for you doesn’t mean it should go away.

    That technology has its uses, and Cloudflare is probably aware that there are still some false positives and is likely working on them as we speak.

    The decision is the website owner’s to make, weighing the advantage of filtering out the majority of bots against the disadvantage of losing some legitimate traffic to false positives. If you’re getting a Cloudflare challenge, chances are the owner decided that the former vastly outweighs the latter.

    Now, there are self-hosted alternatives, like Anubis, but business clients prefer a SaaS like Cloudflare to maintaining their own software. Once again, that is their choice and their right.