Google - robots.txt
So I've finally gotten around to setting up my robots.txt. My question is: what files should I restrict from the Google crawl?
Restrict those that you don't want spidered. I think there are a few threads here where users have posted their robots.txt files. Just do a search and you should find them.
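For reference, robots.txt lives in the root of your site and simply lists the paths you don't want crawlers to touch. A minimal sketch for a vBulletin board might look like the lines below; the directories are just typical examples of admin and script areas people keep out of the index, not a prescription for your particular setup:

    User-agent: *
    Disallow: /admincp/
    Disallow: /modcp/
    Disallow: /clientscript/
    Disallow: /cpstyles/
    Disallow: /search.php

Anything not matched by a Disallow line stays crawlable. Googlebot follows the "User-agent: *" group unless you add a more specific "User-agent: Googlebot" section for it.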