blocking spiders
Hi all,
I've been using a small file, robots.txt, to prevent spiders from crawling my webpage (it costs too much bandwidth). It has been working quite well, but recently I saw a spider called "e-collector" still spidering my webpage. How do I stop it from doing that? This is the robots.txt file I have used up till now:

===================
User-agent: *
Disallow: /
robots.txt will only work if the spider in question actually bothers to read it, and not all of them do.
Any idea how I could block it without a robots.txt file?
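If the spider ignores robots.txt, the only reliable option is to refuse its requests at the server itself, for example by matching its User-Agent header. Below is a minimal sketch assuming a Python WSGI setup; the "e-collector" match string and the function names are illustrative assumptions, since the exact header the bot sends isn't shown in this thread.

===================
# Minimal sketch: reject requests from blacklisted user agents
# before they reach the site. Assumes a Python WSGI stack; the
# blocked substrings below are illustrative, not confirmed values.

BLOCKED_AGENTS = ("e-collector",)  # case-insensitive substrings to block

def block_spiders(app):
    """Wrap a WSGI app and return 403 for blacklisted user agents."""
    def middleware(environ, start_response):
        agent = environ.get("HTTP_USER_AGENT", "").lower()
        if any(bad in agent for bad in BLOCKED_AGENTS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware

# Example usage with a trivial app:
def hello(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello"]

application = block_spiders(hello)

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("", 8000, application).serve_forever()
===================

The same idea can be applied in whatever server you actually run (for example an Apache or nginx access rule keyed on the User-Agent); the point is that the check happens on your side instead of relying on the spider's cooperation.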