
09-07-2007, 04:19 PM
Quote:
Originally Posted by iojam
Thanks, but please help me. He said:
Quote:
I had modified my .htaccess file to solve canonicalization problems by redirecting:
redundant URLs to new URLs
non-www URLs to www URLs
index.php to root
I theorize that since these redirections involved hundreds of URLs, it's possible that when I deployed the changes to my .htaccess file in mid-July, they triggered the "increase" in concurrent connections as the bots were redirected to the correct pages. In other words, Googlebot attempted to make two connections for every page: once to the old/non-www URL and then to the new/www URL. As the concurrent connections increased, this triggered the automated mechanism that blocked Googlebot's IP address, which in turn caused more time-out errors. The spikes in the Googlebot Download Time chart (above) indicate long download times which eventually ended in timeouts. Unfortunately, this affected one of the most important files, robots.txt, which every bot needs to read before it accesses a site's pages. These time-outs also made my sitemap inaccessible, so since Googlebot could not access these 2 important pages, it could not confirm my site still existed!
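For reference, here is a minimal sketch of what such canonicalization rules typically look like in an Apache .htaccess file using mod_rewrite. This is only an assumption about his setup, not his actual configuration; example.com, old-page.html and new-page.html are placeholders.

RewriteEngine On

# Send non-www requests to the www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Send explicit requests for /index.php to the site root
RewriteCond %{THE_REQUEST} \s/index\.php[?\s] [NC]
RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]

# Map an old/redundant URL to its new location (one rule per old URL)
RewriteRule ^old-page\.html$ http://www.example.com/new-page.html [R=301,L]

Each rule answers the old URL with a 301 and forces the crawler to make a second request for the new one, which is consistent with the doubling of connections he describes.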
What should I do?