I realize my options are limited, but are there any robots.txt-style steps I can take? Thanks for any suggestions.
A robots.txt listing a bunch of scraper user agents is a start, but there's no guarantee they honor it. A firewall fed by dynamic lists of known scanners and scrapers is a somewhat more forceful method; something like the snippet below for the robots.txt side.
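As a rough illustration, a robots.txt group for this might look like the following. The user agents shown are commonly cited AI crawler tokens, but the list is non-exhaustive and changes over time, and it only affects bots that choose to comply.

```
# Block some commonly cited AI crawler user agents (non-exhaustive;
# only compliant bots respect this -- many scrapers ignore the file)
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
User-agent: Bytespider
User-agent: PerplexityBot
Disallow: /
```

The firewall lists cover the rest: bots that ignore robots.txt get dropped at the network edge instead of being asked politely.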
Build, or have someone build for you, an AI tarpit (see the sketch below). https://arstechnica.com/tech-policy/2025/01/ai-haters-build-tarpits-to-trap-and-trick-ai-scrapers-that-ignore-robots-txt/
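The linked article covers purpose-built tools like Nepenthes; the core idea is to serve an endless maze of slow, procedurally generated pages that link only back into themselves, so a crawler that ignores robots.txt burns time there instead of on your real content. A minimal sketch of that idea, assuming Flask and a hypothetical /maze/ path (which you'd disallow in robots.txt so well-behaved bots never enter it):

```python
# Minimal tarpit sketch: every request under /maze/ returns a slow page of
# deterministic gibberish links pointing deeper into the maze.
import hashlib
import time

from flask import Flask

app = Flask(__name__)


def fake_words(seed: str, n: int = 8) -> list[str]:
    """Deterministic gibberish derived from the request path."""
    digest = hashlib.sha256(seed.encode()).hexdigest()
    return [digest[i:i + 6] for i in range(0, n * 6, 6)]


@app.route("/maze/", defaults={"subpath": ""})
@app.route("/maze/<path:subpath>")
def maze(subpath: str):
    time.sleep(2)  # drip-feed the response to waste the crawler's time
    words = fake_words(subpath or "root")
    base = f"/maze/{subpath}".rstrip("/")
    links = "".join(f'<li><a href="{base}/{w}">{w}</a></li>' for w in words)
    return f"<html><body><p>{' '.join(words)}</p><ul>{links}</ul></body></html>"


if __name__ == "__main__":
    app.run(port=8080)
```

Real tarpits add rate limiting and keep the garbage cheap to generate so the cost lands on the scraper, not on you, but the structure is basically this.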
