How can I block crawlers without blocking users?
I need to block Google & co. from indexing particular forums on my site, whilst still allowing access to guests & members.
I know this can be done via the robots.txt file - I'm just wondering what to put there.
Let's say the particular forum I want to block is: www.domain.com/forums/audi-versus-bmw/ (Note: I need to disallow all threads/posts in that particular forum).
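For reference, thread URLs in that forum sit directly underneath it, e.g. (the slugs here are made-up examples):

www.domain.com/forums/audi-versus-bmw/which-handles-better/
www.domain.com/forums/audi-versus-bmw/2010-models-thread/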
So, is the following correct?
User-agent: *
Disallow: /forums/audi-versus-bmw/
Would the above Disallow also block the individual threads & posts in that forum, or would it only block the single URL www.domain.com/forums/audi-versus-bmw?
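In case it's useful, this is how I was planning to sanity-check the rule locally with Python's standard urllib.robotparser (just a sketch; the domain and thread slug are placeholders from my example above). My understanding is the first two checks should print False (blocked) and the last True (crawlable), but I'd like confirmation:

# Check which URLs the proposed robots.txt rules would block.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /forums/audi-versus-bmw/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The forum index itself:
print(rp.can_fetch("*", "http://www.domain.com/forums/audi-versus-bmw/"))
# A thread inside that forum (made-up slug):
print(rp.can_fetch("*", "http://www.domain.com/forums/audi-versus-bmw/which-handles-better/"))
# An unrelated forum, which should stay crawlable:
print(rp.can_fetch("*", "http://www.domain.com/forums/other-forum/"))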
Thanks.