The Archive of Official vBulletin Modifications Site. It is not a VB3 engine, just a parsed copy!
#1
I am running my vBulletin forum on a subdomain. I added a robots.txt to both my root and my subdomain folder, but I can't seem to stop Slurp (Inktomi) from crawling all the wrong pages.

In my subdomain (http://forum.mydomain.com/robots.txt) I wasn't sure whether the paths need the preceding /, so I have: Quote:

In my root (http://www.mydomain.com/robots.txt), just in case, I also have: Quote:
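To answer the leading-slash question: in robots.txt, each `Disallow` value is a URL path and must start with `/` (a bare `Disallow:` with no value means "allow everything"). A minimal sketch of checking such rules before deploying them, using Python's standard `urllib.robotparser`; the forum URL and the goal of blocking Slurp site-wide are taken from the post above:

```python
# Sanity-check robots.txt rules with Python's stdlib parser before
# uploading the file. Rules below block Yahoo!'s Slurp everywhere.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Slurp
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Slurp is denied everywhere; agents with no matching entry stay allowed.
print(parser.can_fetch("Slurp", "http://forum.mydomain.com/showthread.php"))   # False
print(parser.can_fetch("Googlebot", "http://forum.mydomain.com/index.php"))    # True
```

Note that the file must live at the root of the exact host being crawled, so http://forum.mydomain.com/robots.txt is the one Slurp will read for the forum; the copy at www.mydomain.com does not apply to the subdomain.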
#2
I could sure use some help with this. My ISP is going to force me to change service contracts over this issue. I have around 80 spider users showing at any given time, and 60 of them are from *.inktomisearch.com.
#3
#4
If I'm right, it can take up to a month before your robots.txt file takes effect in stopping them (I read that somewhere).

I think the best option is to use an .htaccess file to deny them access to the server.
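A sketch of such an .htaccess file for Apache (2.2-era syntax, contemporary with vB3), assuming mod_setenvif is enabled. Matching the crawler's User-Agent string "Slurp" is usually more practical than matching *.inktomisearch.com hostnames, since host matching requires reverse DNS lookups on every request:

```apache
# Deny requests whose User-Agent identifies Yahoo!'s Slurp crawler.
# Place in the forum directory's .htaccess; requires mod_setenvif.
SetEnvIfNoCase User-Agent "Slurp" bad_bot

<Limit GET POST>
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Limit>
```

Unlike robots.txt, this takes effect immediately, because the server refuses the requests rather than asking the crawler to stop.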
#5
Quote:
#6
Quote:
#7
Can anyone help me generate a good robots.txt that stops known bad robots and image bots? We are getting Googlebot, Yahoo! Slurp, and MSNBot mostly, but I really don't know a lot about robots, which are bad or which are good. Can anyone help me?
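Truly bad robots simply ignore robots.txt, so it can only politely steer the well-behaved ones; blocking abusive bots needs .htaccess rules as discussed above. A common whitelist pattern is to allow the major crawlers explicitly and disallow everyone else. A minimal sketch (an empty `Disallow:` means "allow everything" for that agent):

```text
# Allow the major, well-behaved crawlers.
User-agent: Googlebot
Disallow:

User-agent: Slurp
Disallow:

User-agent: msnbot
Disallow:

# Everyone else: keep out.
User-agent: *
Disallow: /
```

Crawlers match the most specific `User-agent` block that applies to them, so the named bots use their own (empty) rules and ignore the catch-all `*` block.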
vBulletin 3.8.12 by vBS