#1
no index by google
I'm looking for ways to keep my forum from being indexed (Google, Bing, Yahoo...).
I have hidden all subforums. Any ideas?
#2
Spiders are going to archive everything. Even with an extensive robots.txt file. Even if it's just a login page.
#3
Of course, I suppose you could install Simon's "Ban Spiders by User Agent" mod and put all known spiders on the list... A very large spider definition list is available at OzzModz and is updated frequently.
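As the name suggests, the mod matches a visitor's user agent against the spider list from inside vBulletin and turns matches away. Purely as an illustration of the same technique at the web-server level (this is not the mod's code), here is a rough sketch assuming an Apache server with mod_setenvif and the older 2.2-style access directives, with a few well-known bot names standing in for a real, regularly updated list:
Code:
# Illustration only - not the mod's code. Flag a few example crawlers
# by their user agent and refuse to serve them (Apache 2.2-style syntax).
SetEnvIfNoCase User-Agent "Baiduspider" bad_bot
SetEnvIfNoCase User-Agent "MJ12bot" bad_bot
SetEnvIfNoCase User-Agent "AhrefsBot" bad_bot

Order Allow,Deny
Allow from all
Deny from env=bad_bot
The difference from robots.txt is that this refuses the request outright instead of asking the crawler to stay away; on Apache 2.4 the same effect would be expressed with Require directives instead.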
#4
Add this to your robots.txt:
Code:
User-agent: *
Disallow: /
Thanks from: Max Taxable
#5
Right, but... that's not a block or even a rule; it's a polite request that a good many spiders just ignore. Kind of like gun laws - the criminals don't pay any attention to them.
#6
And what about NoIndex, NoFollow in robots.txt?
Is Ban Spiders by User Agent not a good option, ozzy?
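A note on the first question: NoIndex and NoFollow are not really robots.txt directives, and the major engines do not honor them there (Google dropped its unofficial support for noindex lines in robots.txt in 2019). They normally live in a meta robots tag in the page head or in an X-Robots-Tag response header. As a sketch only, assuming an Apache server with mod_headers enabled, the header route could look like:
Code:
# Assumes Apache with mod_headers. Asks compliant crawlers not to index
# or follow anything served from this location; rogue bots still ignore it.
Header set X-Robots-Tag "noindex, nofollow"
Like robots.txt, this is still only a request - it just travels with every response rather than relying on the crawler fetching robots.txt first.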
#7
If you want to block them, use the mod and the list I suggested. If you want to ask them nicely and politely to please not index your site, use the robots.txt. But there's no harm at all in doing both.
#8
Ban Spiders by User Agent is fine, with the list Max linked you to, and also do the robots.txt.
#9
Baidu is one of the worst offenders for ignoring robots.txt - just one major example among the hundreds of crawlers that ignore it.
If your goal is to NOT have your site indexed anywhere at all, you want to ban them and keep the list updated.
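To put that in concrete terms: a per-bot rule like the sketch below is perfectly valid robots.txt syntax, but it is advisory only, so for crawlers such as Baiduspider that ignore it, a server-side ban (the mod, or something along the lines of the .htaccess sketch after post #3) is the only thing that actually keeps them out.
Code:
# Valid robots.txt, but advisory only - Baiduspider is known to ignore it.
User-agent: Baiduspider
Disallow: /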
#10
Aha, OK, but if I don't install it and only hide all subforums from unregistered users... will Google still show threads etc.? :erm: