If a person or company is determined enough, there is probably no way to prevent scraping of public data. But it's unlikely that any AI company is so determined that it would target just my specific content. After all, I'm a small, rather anonymous content creator. There's a decent chance that an AI bot visiting my site would check robots.txt and stop there. However, when the data is stored on a public relay, it's a different story. I have no control over it, and it simply ends up in a big, freely available dataset.