Can this thread be moved to the Full Releases forum?
I have a slight problem with googlebots: they storm my forum in huge numbers. Right now, for example, I have 7 googlebots crawling my forum. That seems excessive to me, and I would like to somehow limit the number of googlebots to maybe 2.
What is the robots.txt directive to do this? Or maybe there is some other method. Thanks ;)
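For reference: robots.txt has no directive that caps the *number* of simultaneous crawlers. The closest standard option is `Crawl-delay`, a sketch of which is below — but note that Googlebot in particular ignores `Crawl-delay`; Google's crawl rate has to be adjusted from Google's own webmaster console instead. The delay value here is just an illustrative guess:

```
# Sketch only: ask compliant bots to wait between requests.
# Honored by some crawlers (e.g. Slurp, msnbot), NOT by Googlebot.
User-agent: *
Crawl-delay: 10
```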
Drat.. :ermm:
Wish it were possible somehow, oh well. My bandwidth is being consumed quickly by these googlebots, so I guess I'll simply have to restrict them from the threads.
We have two different domains but only one MySQL database. Is it possible to place robots.php on both domains (and thus use the same tables)?
- djr
Already found it. Just rename the robots_log table to robots_log_domain1, create another one with the _domain2 suffix, and update the table names in robots.php accordingly.
- djr
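The per-domain split described above might look like this in SQL — a sketch only, assuming MySQL 4.1+ (for `CREATE TABLE ... LIKE`) and that robots.php references the table by a single name you can edit:

```sql
-- Keep the existing log data for the first domain:
RENAME TABLE robots_log TO robots_log_domain1;
-- Create an empty table with the same structure for the second domain:
CREATE TABLE robots_log_domain2 LIKE robots_log_domain1;
```

After that, each copy of robots.php just needs its queries pointed at its own table name.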
Installed, works great!
Glad that you like it. :cool:
Any suggestions? :)
The only suggestion I can think of is being able to import your current robots.txt.
I had disallowed "turnitin" and would like to still be able to block them.
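Until an import feature exists, the turnitin block can be carried over by hand. A sketch, assuming the crawler still identifies itself as TurnitinBot (its usual user-agent string):

```
# Sketch: block Turnitin's crawler from the whole forum.
User-agent: TurnitinBot
Disallow: /
```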
Powered by vBulletin® Version 3.8.12 by vBS
Copyright ©2000 - 2025, vBulletin Solutions Inc.