Quote:
Another way of shutting out SLURP is to use the noindex meta tag. Yahoo SLURP obeys this directive in the document's head; the code inserted between the head tags of your document is <meta name="robots" content="noindex">. This snippet ensures that Yahoo SLURP does not add the document to the search engine's index. Another useful directive is the nofollow meta tag, <meta name="robots" content="nofollow">, which ensures that the links on the page are not followed.

I found this on an SEO site.
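For anyone copying those tags, here is a minimal sketch of how they sit in a page; the title and body text are placeholders, and the two values can also be combined into a single tag:

<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- Keep this page out of the index and do not follow its links.
       The directives could also be split into two separate meta tags. -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  <p>Page content.</p>
</body>
</html>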
I have 2500 Slurp spiders online at any given time. The number went up considerably after I increased the crawl delay to 10, though that may be a coincidence. Yahoo does not use more bandwidth than Google, though; Yahoo just needs more spiders/IPs. It is very annoying when half of your online users are bots.
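The crawl delay mentioned above is set in robots.txt. A minimal sketch targeting Yahoo's crawler specifically (the value is the number of seconds to wait between requests; Slurp honours Crawl-delay, whereas Googlebot ignores the directive):

# Ask Yahoo's crawler to wait 10 seconds between requests
User-agent: Slurp
Crawl-delay: 10

# All other crawlers are unrestricted
User-agent: *
Disallow: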
I think they are really good for our website's rank; I don't understand why you would want fewer of them :s
From my forum:
Quote:
Could you tell us how you got so many spiders on your site? :p