Quote:
Originally Posted by ATVTorture
Blocking robots, or even delaying them, is a piss-poor workaround for what shouldn't be a problem. You're only going to hurt your results in the search engines by doing this, but if you aren't concerned with that, then it's a non-issue.
Code:
# Allow the Internet Archive crawler everywhere
User-agent: ia_archiver
Allow: /

# Throttle Yahoo Slurp and Googlebot to one request per 60 seconds.
# Note: Google ignores Crawl-delay; set the crawl rate in Search
# Console instead. Also note that crawlers with their own group
# above skip the catch-all "*" group entirely.
User-agent: Slurp
Crawl-delay: 60

User-agent: Googlebot
Crawl-delay: 60

# Everyone else: block dynamic pages and private directories.
# Extension rules use Google-style wildcards: paths must start
# with "/" and "$" anchors the match to the end of the URL.
User-agent: *
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.jsp$
Disallow: /*.cfm$
Disallow: /*.asp$
Disallow: /*.html$
Disallow: /*.htm$
Disallow: /*.aspx$
Disallow: /*.cgi$
Disallow: /forum/includes/
Disallow: /forum/install/
Disallow: /forum/customavatars/
Disallow: /forum/archive/
Disallow: /forum/sitemap/
Disallow: /forum/members/
Disallow: /wp-includes/
Disallow: /wp-content/
Disallow: /wp-admin/
Disallow: /forum/images
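Not part of the original post, but if you want to sanity-check which URLs a robots.txt like this actually blocks before deploying it, Python's standard-library `urllib.robotparser` can parse the file for you. One caveat: Python's parser does plain prefix matching only and does not implement Google-style wildcards (`/*.php$`), so this sketch tests just the directory rules:

```python
from urllib.robotparser import RobotFileParser

# A trimmed copy of the robots.txt above, keeping only plain prefix
# rules, since urllib.robotparser ignores wildcard patterns.
ROBOTS_TXT = """\
User-agent: ia_archiver
Allow: /

User-agent: Slurp
Crawl-delay: 60

User-agent: *
Disallow: /wp-admin/
Disallow: /forum/members/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The catch-all "*" group blocks Googlebot from /wp-admin/ ...
print(rp.can_fetch("Googlebot", "/wp-admin/options.php"))    # False
# ... while ia_archiver's own "Allow: /" group lets it in everywhere.
print(rp.can_fetch("ia_archiver", "/wp-admin/options.php"))  # True
# Crawl-delay is exposed per user agent.
print(rp.crawl_delay("Slurp"))                               # 60
```

This also makes the group-matching behavior concrete: a crawler that has its own `User-agent` group (like `ia_archiver` here) never falls through to the `*` rules.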
And just out of curiosity, what hosting company are you using?