AI web-crawling bots are increasingly targeting free and open source software (FOSS) projects, disrupting the websites that host them. Developers report that these bots ignore the Robots Exclusion Protocol, a voluntary standard meant to keep crawlers away from designated pages.
Niccolò Venerandi, a developer of the KDE Plasma desktop and owner of the blog LibreNews, notes that FOSS developers are “disproportionately” impacted by these attacks. The core issue is that many AI bots do not honor robots.txt files, which tell crawlers what not to crawl.
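For context, robots.txt is a plain text file served from a site's root. A minimal example is shown below; GPTBot is OpenAI's documented crawler user agent, and the rules are purely advisory, which is why ill-behaved bots can simply ignore them:

```text
# https://example.org/robots.txt -- illustrative only
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```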
To combat this, developer Xe Iaso created Anubis, a reverse proxy that imposes a proof-of-work check which must be passed before a request reaches a Git server. Automated bots get blocked, while humans with ordinary browsers pass through. The name “Anubis” is inspired by the Egyptian god who weighs souls.
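The article does not detail Anubis's internals (its real check runs as JavaScript in the visitor's browser), but the general proof-of-work idea can be sketched with a hash puzzle: the client must find a counter whose hash has a given number of leading zeros, which is expensive to solve but cheap for the server to verify. All names here are illustrative, not Anubis's actual API:

```python
import hashlib
import itertools

def solve(challenge: str, difficulty: int = 4) -> int:
    """Client side: brute-force a counter until sha256(challenge + counter)
    starts with `difficulty` hex zeros. This is the costly part that makes
    mass crawling expensive."""
    for counter in itertools.count():
        digest = hashlib.sha256(f"{challenge}{counter}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return counter

def verify(challenge: str, counter: int, difficulty: int = 4) -> bool:
    """Server side: a single hash suffices to check the submitted answer."""
    digest = hashlib.sha256(f"{challenge}{counter}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# In a real proxy the challenge would be a random, per-request nonce.
challenge = "example-challenge"
answer = solve(challenge, difficulty=3)
assert verify(challenge, answer, difficulty=3)
```

The asymmetry is the point: a browser solves the puzzle once per visit, while a crawler hammering thousands of pages pays the cost on every request.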
Anubis's instant popularity shows how widespread the pain is among FOSS developers. Several high-profile developers report similar experiences with hyper-aggressive LLM crawlers and DDoS-style outages, and some have taken drastic measures, including blocking entire countries from access.
As one developer suggested, using content that provides negative utility value to bots could be an effective way to deter them. Another tool, Nepenthes, traps crawlers in an endless maze of fake content. Cloudflare has also released a similar tool called AI Labyrinth.
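The article does not describe how Nepenthes or AI Labyrinth are built; as a hedged sketch of the general tarpit idea, one can serve an infinite maze of pages statelessly by deriving every page's outgoing links from its own path, so a crawler that follows links never runs out of “content” and no page database is needed. Names and paths here are invented for illustration:

```python
import hashlib

def fake_page(path: str, links_per_page: int = 5) -> str:
    """Deterministically generate an HTML page of nonsense links.
    Each link points to another generated page, derived by hashing
    the current path, so the maze is infinite but requires no storage."""
    links = []
    for i in range(links_per_page):
        child = hashlib.sha256(f"{path}/{i}".encode()).hexdigest()[:12]
        links.append(f'<a href="/maze/{child}">{child}</a>')
    return "<html><body>" + "\n".join(links) + "</body></html>"

page = fake_page("/maze/start")
```

Because the output is a pure function of the path, the same URL always yields the same page, which makes the maze look like a consistent (if useless) site to the crawler.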
Developer Drew DeVault urges peers to stop legitimizing AI technologies and to support a more direct solution. Until then, FOSS developers will keep using cleverness and humor to fight back against these “cockroaches of the internet.”
Source: https://techcrunch.com/2025/03/27/open-source-devs-are-fighting-ai-crawlers-with-cleverness-and-vengeance