How to block crawl bots without blocking users?
I need to block Google & co from crawling and indexing particular forums on my site, whilst still allowing access to guests & members.
I know this can be done via the robots.txt file - I'm just wondering what to put there. Let's say the particular forum I want to block is www.domain.com/forums/audi-versus-bmw/ (note: I need to disallow all threads/posts in that particular forum). So, is the following correct?

User-agent: *
Disallow: /forums/audi-versus-bmw/

Would the above disallow also block the actual threads & posts in that forum, or would it only block the single URL www.domain.com/forums/audi-versus-bmw/? Thanks.
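For what it's worth, robots.txt Disallow rules are prefix matches: a compliant crawler skips every URL that begins with the disallowed path, not just that one page. A sketch of how the file would read, with a made-up thread URL in the comment for illustration:

Code:
User-agent: *
# A compliant bot will skip /forums/audi-versus-bmw/ and everything
# under it, e.g. /forums/audi-versus-bmw/12345-some-thread.html
Disallow: /forums/audi-versus-bmw/

Note this assumes the threads really are served under that path; if the forum software serves threads from a different prefix (e.g. /threads/), those URLs would need their own Disallow line.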
Quote:
If you want human guests to see something while not logged in, then Google and every other spider can see it as well.
Quote:
Then is it possible to do this via .htaccess?
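A minimal .htaccess sketch of that idea, assuming Apache with mod_rewrite enabled and the file placed in the document root; the user-agent list here is an example, not exhaustive:

Code:
RewriteEngine On
# Match common crawler user-agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot|Slurp|DuckDuckBot) [NC]
# Serve a 403 Forbidden for the blocked forum and everything under it
RewriteRule ^forums/audi-versus-bmw/ - [F]

This only works for bots that identify themselves honestly in the User-Agent header, which the major search engines do; a crawler that spoofs a browser user-agent would get through, echoing the point above that anything visible to guests is reachable by a spider.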