Quote:
Originally Posted by BadgerDog
Do you have a link?
I can't seem to find the right page....
Thanks ..
Regards,
Doug
|
Yes, I ensured that the following was in my robots.txt file:
User-agent: Baiduspider
Disallow: /
Then I sent an email to:
spiderhelp@baidu.com
Here is the message and reply I received:
Quote:
Dear,
Thank you for your email.
We have updated our DNS record to make our spider behave the way requested in your robots file.
Should you need further assistance, please do not hesitate to contact us.
Best Regards,
Stephy Wu
Baidu Spider Team
________________________________________
re: Continuous Crawling of my site
To whom it may concern:
I have been trying for a month now to halt all crawling of my site by Baidu. I have added the following code to my robots.txt file:
User-agent: Baiduspider
Disallow: /
This was done 3 weeks ago. However, my site is still being crawled daily.
Baidu is eating up a ton of server resources every day and causing slow load times. I have also employed a spider-ban modification and have banned more than 28,000 Baidu spider entries in 3 weeks.
This is ridiculous. I am asking you to immediately halt all crawling of my site by Baidu.
|
I have not seen hide nor hair of Baidu since this was done, nearly a month ago.
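For anyone whose crawler keeps ignoring robots.txt even after a request like this, a server-side block is sometimes used as a backup. Here is a minimal sketch for Apache's .htaccess, assuming mod_rewrite is enabled on your host (adapt the pattern to whichever bot you're blocking):

```apache
# Hypothetical backup rule: return 403 Forbidden to any request
# whose User-Agent header contains "Baiduspider" (case-insensitive).
# Requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule .* - [F,L]
```

Unlike robots.txt, which a crawler can simply ignore, this refuses the request at the server, though the bot's connections still consume some resources.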
To find the email address, I went to their website, translated the page into English, and then searched for "Baidu Spider." That took me to a search results page, which led me to this page:
http://www.baidu.com/search/spider.html
I simply translated it to English and found the info I was looking for.
Baidu was the ONLY spider causing major issues; I can now use this add-on for other spiders as well, but Baidu was using massive amounts of resources.
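If you want to measure how much of your traffic each spider is responsible for before deciding which ones to ban, a quick sketch like the one below can tally User-Agent hits in an access log. The file name, log format, and bot list here are assumptions; adjust them to your server:

```python
# Sketch: count hits per crawler in a combined-format access log.
# The bot names and sample lines below are illustrative assumptions.
from collections import Counter

BOTS = ("Baiduspider", "Googlebot", "bingbot")

def count_bot_hits(lines):
    """Tally log lines whose User-Agent field mentions a known bot."""
    hits = Counter()
    for line in lines:
        for bot in BOTS:
            if bot in line:
                hits[bot] += 1
    return hits

sample = [
    '1.2.3.4 - - [10/Oct/2011:13:55:36] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0)"',
    '5.6.7.8 - - [10/Oct/2011:13:55:40] "GET /a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [10/Oct/2011:13:56:01] "GET /b HTTP/1.1" 200 128 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0)"',
]
print(count_bot_hits(sample))  # → Counter({'Baiduspider': 2, 'Googlebot': 1})
```

In practice you would read the lines from your real log file (e.g. `open("/var/log/apache2/access.log")`) instead of the sample list.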
Hope this helps.