The Archive of Official vBulletin Modifications Site. It is not a VB3 engine, just a parsed copy!
#1
using robots.txt to prevent duplicate content
I want to use robots.txt to keep search engine spiders out of my whole forum...except for the archive.
Is this doable, and has anyone done it? I don't even need to allow the spiders onto the main index.php forum homepage; I just want links into the archive from my main site. I want to do this to prevent some of the duplicate content issues that vB currently has, with multiple URLs leading to the same pages (a well-documented problem). So the question is: is this a quick and easy fix, are there any potential problems (I would only allow Mediapartners in, for AdSense reasons), and what would I need to put in my robots.txt? Many thanks in advance.
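A minimal sketch of what such a robots.txt could look like, assuming the forum is installed at the site root and the archive lives under /archive/ (adjust both paths to your own installation; the file itself must sit at the root of the domain):

    # Let the AdSense crawler see everything (an empty Disallow means "allow all")
    User-agent: Mediapartners-Google
    Disallow:

    # All other spiders: permit only the archive, block the rest of the forum
    User-agent: *
    Allow: /archive/
    Disallow: /

Note that the Allow directive is an extension honored by Googlebot and most major crawlers rather than part of the original robots.txt standard; Google resolves the conflict by taking the most specific matching rule, so /archive/ URLs stay crawlable while everything else is blocked. A spider that ignores Allow will simply stay out of the whole forum, archive included, which is the safe failure mode here.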