View Full Version : using robots.txt to prevent duplicate content


esllou
03-03-2005, 11:25 AM
I want to use robots.txt to keep search engine spiders out of my whole forum...except for the archive.

Is this doable...and has anyone done it? I don't even need to allow the spiders onto the main index.php forum homepage; I'd just have links into the archive from my main site.

I want to do this to prevent some of the duplicate content issues that vB currently has, with multiple URLs leading to the same page (a well-documented problem).

So, the question is...is this a quick and easy fix, are there any potential problems (I would only let Mediapartners-Google in, for AdSense reasons), and what would I need to put in my robots.txt?
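Something along these lines is what I had in mind (just a sketch -- it assumes the forum is installed under /forum/ with the standard vB archive at /forum/archive/, so the paths would need adjusting to the actual install; and since the Allow directive is a Google extension that not every crawler honours, it lists the individual vBulletin scripts to disallow instead):

User-agent: Mediapartners-Google
# empty Disallow = let the AdSense crawler see everything
Disallow:

User-agent: *
# block the regular forum scripts so only the archive gets crawled
Disallow: /forum/index.php
Disallow: /forum/forumdisplay.php
Disallow: /forum/showthread.php
Disallow: /forum/member.php
Disallow: /forum/search.php
Disallow: /forum/printthread.php
# /forum/archive/ is left uncrawlable by nothing above, i.e. still open to spiders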

Many thanks in advance.....