It's not easy to prevent people from scraping your site.
IPs can be changed, proxies can be used, and headers can be spoofed, so trying to detect scrapers by user-agent is largely useless.
One thing that can at least deter scrapers is adding a JavaScript check that visitors must pass before they can view your site; CloudFlare does this to mitigate certain DDoS attacks. Even then, someone could simply open your site in a normal browser and save each file to their desktop by hand.
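As a rough illustration of the idea, here's a minimal sketch of such a gate in Node.js with no external packages. The cookie name `js_check` and the fixed challenge numbers are invented for this example; this shows the general concept, not CloudFlare's actual mechanism:

```javascript
// Minimal sketch of a JavaScript gate: clients must execute a tiny inline
// script to earn a cookie before the real content is served.
const http = require("http");

const A = 6, B = 7;               // a real check would randomize these per request
const EXPECTED = String(A * B);   // answer the browser's JavaScript must compute

// Page served to clients that haven't passed the check yet. The inline
// script computes the answer, stores it in a cookie, and reloads. Clients
// that never execute JavaScript (curl, wget, naive scrapers) stay stuck here.
const challengePage = `<!doctype html>
<script>
  document.cookie = "js_check=" + (${A} * ${B}) + "; path=/";
  location.reload();
</script>
<noscript>Please enable JavaScript to view this site.</noscript>`;

http.createServer((req, res) => {
  const cookies = req.headers.cookie || "";
  if (cookies.includes("js_check=" + EXPECTED)) {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end("<h1>The actual content</h1>");
  } else {
    res.writeHead(403, { "Content-Type": "text/html" });
    res.end(challengePage);
  }
}).listen(8080);
```

Note that a determined scraper can still parse the answer out of the challenge page or drive a headless browser that executes the script, which is exactly why this is a deterrent rather than a fix.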