Quote:
Originally posted by Erwin
You can use robots.txt to block Google or any other spider from crawling specific sections of your site, or certain file types like .gif.
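For anyone who hasn't used it before, here is a minimal sketch of the kind of robots.txt Erwin describes. The paths and the spider name are just illustrative examples, not taken from any thread here, and note that well-behaved crawlers follow these rules voluntarily:

```
# Keep all crawlers out of a specific section of the site
User-agent: *
Disallow: /private/

# Classic robots.txt only does prefix matching, so to block
# .gif files you disallow the directories that hold them
# (example path; adjust to where your images actually live)
Disallow: /images/

# Block one specific spider from the whole site
User-agent: ExampleBot
Disallow: /
```

The file has to sit at the root of the site (e.g. /robots.txt) or spiders won't find it.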
Funny this was mentioned, because while looking into Filburt's tutorial for creating friendly URLs, I also came across a thread by MarkB asking how to stop webcrawlers from consuming so much bandwidth.
Take a look at the following thread for more solutions, including the use of robots.txt:
http://www.vbulletin.com/forum/showt...threadid=44966
I also installed Filburt's hack today, and it was incredibly easy to set up compared to the trouble I had with Fastforward's hack. If anyone is interested in Filburt's, here is the thread:
http://www.vbulletin.com/forum/showt...threadid=56783