In the ban spiders mod you need to shorten the useragent string to catch more. If you are using my mod, it looks for the entire string you entered in the list, so if you shorten an entry to, say, amazonaws, it will ban every bot with that fragment anywhere in its useragent string.
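If you are not running the mod you can do the same substring match straight in .htaccess, something like this (a rough sketch, assuming Apache 2.2 with mod_setenvif; "amazonaws" and bad_bot are just example names):

# Tag any visitor whose useragent contains the fragment (case-insensitive)
SetEnvIfNoCase User-Agent "amazonaws" bad_bot
# Let everyone in except tagged visitors
Order allow,deny
Allow from all
Deny from env=bad_bot

Add one SetEnvIfNoCase line per fragment you want to catch.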
If you want to ban them by IP instead, you can use this in your .htaccess (it blocks the AWS ranges at the server but still lets every bot fetch robots.txt):
Quote:
Originally Posted by Simon Lloyd
# Let everyone fetch robots.txt, even IPs banned below
SetEnvIf Request_URI ^/robots\.txt$ allowall
# deny,allow: the Deny list is checked first, then Allow can override it
Order deny,allow
deny from 23.20.0.0/14 46.51.128.0/17 46.137.0.0/16 50.16.0.0/14 50.112.0.0/16 52.0.0.0/11 54.64.0.0/15 54.66.0.0/16 54.72.0.0/13 54.80.0.0/12 54.144.0.0/12 54.160.0.0/11 54.192.0.0/10 67.202.0.0/18 72.21.192.0/19 72.44.32.0/19 75.101.128.0/17 79.125.0.0/18 87.238.80.0/21 87.238.84.0/23 103.4.8.0/21 107.20.0.0/14 122.248.192.0/18 156.154.64.0/22 156.154.68.0/23 174.129.0.0/16 175.41.128.0/18 175.41.192.0/18 175.41.224.0/19 176.32.64.0/19 176.34.0.0/16 178.236.0.0/20 184.72.0.0/15 184.169.128.0/17 185.48.120.0/22 204.236.128.0/17 216.182.224.0/20
allow from env=allowall
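Note the original had Order allow,deny, which would have locked everyone out of everything except robots.txt; with deny,allow the Allow line properly overrides the bans for robots.txt requests only. Also keep in mind robots.txt itself can only ask a bot to stay away, so it is no use against the rogue ones, but for crawlers that honor it an entry like this keeps them off the whole site (ExampleBot is a placeholder, swap in the bot's real useragent token):

User-agent: ExampleBot
Disallow: /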