I agree that spiders are a problem. I'm having trouble with my host over using too many resources because of too many persistent DB connections, and I always have lots of spiders.
I don't want to disallow them in robots.txt, since they're good for the site. But I have to cut back on DB connections, and I can't have 30 Yahoo spiders on at once while members can't get in.
Is there anything that will limit Slurp to, say, ten at a time?
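(For what it's worth, Slurp is said to honor a non-standard Crawl-delay directive in robots.txt; a minimal sketch, assuming it applies site-wide:

    # Slow down Yahoo's crawler: wait N seconds between requests
    User-agent: Slurp
    Crawl-delay: 10

Though as I understand it, that throttles how often it requests pages rather than capping concurrent connections, so I'm not sure it solves the "30 at once" problem.)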
Thanks
Steve