This is already going off-topic, but...
Hu wrote:
I think the least expensive Anubis I've seen was 8 seconds, which is 7.9 seconds too long,
Some sites that don't have Anubis take longer than that to load for me. Must be nice if every site you visit loads in 0.1 sec in 2026.
Hu wrote:
As regards flexibeast's question, I don't have an answer, but I will note that most of the JS-based paths seem to be premised on the idea that if the JavaScript is "sufficiently complex", then scrapers won't be able to emulate it, and will just fail out the same way that NoScript users fail. I think that's a short-sighted design, since it's readily defeated by the scrapers using a more capable browsing client to more faithfully emulate a full browser.

For years, there have been setups for automated testing of websites (by the legitimate test organization), and an automated scraper browser is essentially the same concept. Start a virtual display, connect a browser to it, and have the browser pre-loaded with an automation extension (such as Selenium) and use that to drive it around the web, automatically completing any automated access test that requires a "real browser."

It's an arms race that mainly hurts the people who insist on having any modicum of control over what runs locally, and merely inconveniences the scrapers for a short time.
And that's what the above-linked iocaine is trying to do.
https://iocaine.madhouse-project.org/ (the site works without JS, for the curious)
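Just to illustrate how low the bar is that Hu describes, here is a minimal sketch of such a scraper browser in Python with Selenium. The target URL and the fixed sleep are placeholders of my own, not taken from any real scraper:

Code:
import time
from selenium import webdriver
from selenium.webdriver.firefox.options import Options

# Headless Firefox stands in for the "virtual display + browser" setup;
# with a display server (e.g. Xvfb) you would simply drop the flag.
opts = Options()
opts.add_argument("--headless")
driver = webdriver.Firefox(options=opts)

try:
    # Loading the page runs the challenge JS (proof-of-work, redirect, etc.)
    # exactly as it would for a human visitor, because this is a full browser.
    driver.get("https://example.org/")  # placeholder target
    time.sleep(15)                      # crude: let the challenge finish and redirect
    html = driver.page_source           # the real content, post-challenge
finally:
    driver.quit()

A smarter scraper would wait on a DOM condition instead of sleeping, but the point stands: the challenge completes because the client is a real browser, so "sufficiently complex" JS mostly filters out humans who restrict scripts.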
I understand your frustration at essentially being blocked from some sites because of your ethics and configuration, but at the same time, Codeberg has been very vocal about AI scrapers and other bots hammering their site. So I get their view as well: something needs to be done to block most of those bots, and this is the technology we currently have.
Personally, I wouldn't mind moving to the Gemini side of the web, since the modern WWW experience is kind of awful.