#1
Hi all,
I've been using a small file, robots.txt, to prevent spiders from crawling my webpage (it costs too much bandwidth). It has been working quite well, but recently I noticed a spider called "e-collector" was still crawling my site. How do I stop it from doing that? This is the robots.txt file I have used up till now:
===================
User-agent: *
Disallow: /
#2
robots.txt will only work if the spider in question actually bothers to read it, and not all of them do.
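As a sketch of the usual server-side alternative, assuming the site runs on Apache with .htaccess overrides enabled: you can refuse the request based on the User-Agent header, which works even for bots that ignore robots.txt. The "e-collector" pattern below is an assumption about how the bot identifies itself; check the access log for the exact User-Agent string first.

# .htaccess -- deny any request whose User-Agent contains "e-collector"
# (the pattern is an assumption; adjust it to what the access log shows)
SetEnvIfNoCase User-Agent "e-collector" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot

This needs mod_setenvif and mod_authz_host (Apache 2.2 syntax); the bot then receives a 403 Forbidden instead of the page content.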
#3
Any idea how I could block it without a robots.txt file?