Yahoo! Slurp Spider, Sucking Us Dry
I know that spiders are generally good, and I know about robots.txt, but I don't want to stop spiders, just have fewer of them. There is never a time when we have fewer than 15 Yahoo! Slurp spiders on our site. Is there somewhere I can report this? I can understand having 1-3 on at once every now and then, but these guys have been here for more than 2 years!!!
Code:
Yahoo! Slurp Spider Viewing Thread Please Help Retell :pirate: |
Why don't you want them?
Most people would kill to have so much search engine attention... |
Yes spiders are a good thing, I would let them "slurp" away. :)
|
Yahoo hasn't stopped spidering my main site since I reopened it a year ago.
|
Well, about once a month we somehow end up with 95 Slurps... I think that is definitely pushing it... Hehe, sorry for stealing all the Slurps from you guys :D
|
At the moment:
1x AdSense
1x MSN
1x Gigablast (? Don't know that)
8x Google
23x Yahoo!
...less than average. |
Pardon the old bump, but I'm getting slurped out myself. I had 81 of them on my site this morning.
|
Yahoo! Slurp (and a few others) generally make a mass invasion around once a month, so don't take that to be the norm unless it really is the norm. Otherwise, it's a common complaint that Yahoo! Slurp drains more resources than most spiders, and there's really nothing you can do aside from blocking it entirely.
|
Check our WOL - atm we have around 750 Yahoo spiders online!
|
I think Yahoo! Slurp is a stupid crawler, because it doesn't follow the rules in robots.txt. For example, I restricted access for all crawlers to a few files:
Code:
Disallow: /attachment.php
Also, a few days ago my board had ~125 Yahoo! Slurp crawlers online (a new one arriving every 2 seconds) - I think it was a temporary bug with the crawler :) |
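Worth noting: robots.txt crawlers are only supposed to obey the single most specific User-agent section that matches them, so if a robots.txt also has a separate User-agent: Slurp section (for a Crawl-delay, say), Slurp will skip anything listed only under User-agent: *. A sketch with the Disallow repeated in both places (the second path is just an example):
Code:
User-agent: Slurp
Disallow: /attachment.php
Disallow: /printthread.php

User-agent: *
Disallow: /attachment.php
Disallow: /printthread.php
|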
I'm trying to get some Slurps, come to papa!! Added my forums to a bunch of search engines... just wondering how long it will take. :D
|
Quote:
On my own forum (a different forum), which is about 5 months old, I've noticed that each time I add new content (cellphone wallpapers) I get hit with anywhere from 2-5 of them within a few minutes of the new upload, almost without fail, and they go directly to the section of the forum where I added the new content. If I get some spare Slurps I'll send 'em your way :). |
Send some of those guys my way :cool:
|
Gaminggutter.com has at least 40-50 spiders on at once. :P
|
You think 15 spiders online is bad? Slurp has had about 500 spiders on my site since yesterday; they're still there... around double the normal amount!
|
Code:
User-agent: Slurp
Crawl-delay: 60
This has solved all of my woes. :) I'm not sure what's been up in the last week, but with any other crawl-delay I've had hundreds of them online. I've no idea WHY that works, or if it will work for someone else, but it's working for me. (knock on wood)
Edit: It does still ignore my disallows though. :\
On the upside, I had 13,947 indexed pages in Yahoo two weeks ago, and currently I have 48,000, albeit after a LOT of SEO work, but it does seem like they're updating their content a bit more rigorously. |
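For anyone wondering what the number actually does: Crawl-delay is generally documented as the minimum number of seconds the crawler should wait between successive fetches, so a higher value means a slower crawl per agent; whether it also means fewer agents connected at once seems to vary, going by the reports in this thread. A commented version of the snippet above:
Code:
User-agent: Slurp
# wait at least 60 seconds between fetches,
# i.e. roughly one page per minute per crawler
Crawl-delay: 60
|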
I agree that spiders are a problem. I am having trouble with my host for using too many resources because of too many persistent DB connections. I always have lots of spiders.
I don't want to disallow them in robots.txt, as they are good. But I must cut back on DB connections, and I don't want to have 30 Yahoo spiders all at once while members can't get on. Is there ANYTHING that will limit Slurp to ten at a time? Thanks, Steve |
Quote:
|
Quote:
I'll try that, thanks. A higher number should equal fewer spiders? Steve |
I once had about 800 Yahoo Slurp spiders at one time. I was panicking; I thought it was a DDoS.
|
Another way of shutting out Slurp is by using the noindex meta tag. Yahoo! Slurp obeys this directive in the document's head; the code inserted between the head tags of your document is:
Code:
<META NAME=robots CONTENT=noindex>
This snippet will ensure that Yahoo! Slurp does not index the document in the search engine database. Another useful directive is the nofollow meta tag. The code inserted is:
Code:
<META NAME=robots CONTENT=nofollow>
This snippet ensures that the links on the page are not followed.
I found this on an SEO site. |
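If you want both behaviours at once, the two values can be combined in a single tag; a minimal head sketch (the title is just a placeholder):
Code:
<head>
  <title>Example page</title>
  <meta name="robots" content="noindex, nofollow">
</head>
Keep in mind that Slurp still has to fetch the page to read the tag, so this stops indexing and link-following but doesn't reduce the crawl load the way robots.txt directives can.
|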
Quote:
|
Quote:
|
Quote:
|
I have 2500 Slurp spiders online at any given time. It got to be way more than it used to be after I increased the crawl delay to 10; this may be a coincidence.
Yahoo does not use more bandwidth than Google, though; Yahoo just needs more spiders/IPs. Very annoying if half of your online users are bots. |
I think they are really good for our website's rank; I don't understand why you would want fewer of them :s
|
From my forum:
Quote:
|
Could you tell us how you got so many spiders on your site? :p
|