Stopping AI scrapers from taking down my server
In my earlier post about Forgejo and scraper bots, I mentioned that adding a robots.txt file helped reduce scraping. Well, it turns out the bots had only disappeared because I’d set the repository to private, so they were just getting a stream of 404 errors. They weren’t actually respecting robots.txt (although Google, at least, was).
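For reference, the post doesn’t reproduce the exact file, but a robots.txt that asks crawlers to stay away typically looks something like this (the specific user-agent names are illustrative):

    # Ask every crawler to stay out of the whole site
    User-agent: *
    Disallow: /

    # Or single out a particular bot by its user-agent token
    User-agent: GPTBot
    Disallow: /

The catch, as seen above, is that compliance is entirely voluntary: a well-behaved crawler like Google’s will honour these rules, but a scraper that ignores robots.txt keeps hammering the server regardless.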