The Archive of Official vBulletin Modifications Site. It is not a VB3 engine, just a parsed copy!
#21
Another way of shutting out Slurp is the noindex meta tag, which Yahoo Slurp obeys. The code, inserted between the head tags of your document, is
<META NAME="robots" CONTENT="noindex">. This snippet ensures that Yahoo Slurp does not index the document in the search engine database. Another useful directive is the nofollow meta tag: <META NAME="robots" CONTENT="nofollow">. This snippet ensures that the links on the page are not followed. I found this on an SEO site.
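To show where those tags sit, here is a minimal illustrative page head (the title and body are placeholders; both directives can also be combined in one tag):

```html
<html>
<head>
  <title>Example page</title>
  <!-- Tell compliant crawlers (including Yahoo Slurp) not to index this
       page and not to follow the links on it -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>...</body>
</html>
```

Note this only keeps the page out of the index; the crawler still has to fetch the page to read the tag, so it does not save bandwidth the way robots.txt does.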
#22
Quote:
#23
No, you put it in your robots.txt file. You might want to search the site for more info on this.
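As an illustration, a robots.txt like the one below (placed at the site root) can slow Slurp down or shut it out; the `/forum/` path is just an example, and the Crawl-delay value is in seconds:

```
# robots.txt -- illustrative example

# Slow Yahoo's crawler down (Slurp honors the Crawl-delay directive)
User-agent: Slurp
Crawl-delay: 10

# Or block it from a section of the site entirely
# User-agent: Slurp
# Disallow: /forum/
```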
#24
Ohh ok, thanks!
#25
I have 2,500 Slurp spiders online at any given time. The number went way up after I increased the crawl delay to 10, though that may be coincidence.
Yahoo does not use more bandwidth than Google, though; Yahoo just needs more spiders/IPs. Very annoying when half of your online users are bots.
#26
I think they are really good for our website's rank; I don't understand why you would want fewer of them :s
#27
From my forum:
Quote:
#28
Could you tell us how you got so many spiders on your site?