View Full Version : robots.txt file
crazyboy1661
09-23-2013, 04:16 PM
Hi guys, here is the robots.txt code for those who want it.
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /go/
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /admincp/
Disallow: /modcp/
Disallow: /attachment.php
Disallow: /search.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /editpost.php
Disallow: /profile.php
Disallow: /register.php
Disallow: /login.php
Disallow: /subscription.php
Disallow: /private.php
Disallow: /report.php
Disallow: /sendmessage.php
Disallow: /member.php
Disallow: /memberlist.php
Disallow: /misc.php
Disallow: /moderator.php
Disallow: /postings.php
Disallow: /sendtofriend.php
Disallow: /threadrate.php
Disallow: /usercp.php
Disallow: /showgroups.php
Disallow: /announcement.php
Also refer to these links:
https://vborg.vbsupport.ru/showthread.php?p=2446551#post2446551
https://vborg.vbsupport.ru/showthread.php?t=302483
bzcomputers
09-24-2013, 12:06 AM
DO NOT place your admincp and modcp directories in the robots.txt file! This will only alert hackers that they exist at that exact location. Leave them out completely; this also goes if you have renamed them.
Robots can ignore your robots.txt file, especially malware robots that scan the web for security vulnerabilities and the email address harvesters used by spammers. The robots.txt file is a publicly available file. Anyone can see which sections of your site you don't want robots to go to, so don't go broadcasting directions to secure areas of your site that you don't want anyone to reach (with either good or bad intentions).
The purpose of the robots.txt file is to inform the "good" robots of your site layout and of what you do and don't want indexed.
Each person's robots.txt file should be a little different, depending on how your forum was installed (in the root or not) and on what add-ons and changes have been made since the initial install.
You should always include a reference to your sitemap. There are robots.txt "validators" that will not validate the file unless the sitemap is included within it. Both Google and MSNbot (Bing / Yahoo) use validators and will look for a sitemap reference. The sitemap reference needs to be the full URL. Here is a sitemap reference example:
Sitemap: http://www.yoursite.com/sitemap.xml
Your sitemap name and location may be different. I recommend placing the sitemap reference on the first line of your robots.txt file.
The robots.txt file below is a little more in-depth than the one in the first post above and will yield better results.
This example can be used for an initial forum install:
Sitemap: http://www.yoursite.com/sitemap_index.xml.gz
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /clientscript/
Disallow: /cpstyles/
Disallow: /customavatars/
Disallow: /customgroupicons/
Disallow: /customprofilepics/
Disallow: /customsignaturepics/
Disallow: /forumrunner/
Disallow: /images/
Disallow: /includes/
Disallow: /install/
Disallow: /members/
Disallow: /mobiquo/
Disallow: /sitemap/
Disallow: /ajax.php
Disallow: /attachment.php
Disallow: /calendar.php
Disallow: /cron.php
Disallow: /editpost.php
Disallow: /global.php
Disallow: /image.php
Disallow: /inlinemod.php
Disallow: /joinrequests.php
Disallow: /login.php
Disallow: /member.php
Disallow: /memberlist.php
Disallow: /misc.php
Disallow: /moderator.php
Disallow: /newattachment.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /online.php
Disallow: /poll.php
Disallow: /postings.php
Disallow: /printthread.php
Disallow: /private.php
Disallow: /profile.php
Disallow: /register.php
Disallow: /report.php
Disallow: /reputation.php
Disallow: /search.php
Disallow: /sendmessage.php
Disallow: /subscription.php
Disallow: /threadrate.php
Disallow: /usercp.php
Disallow: /usernote.php
This is a modified example of the above. It shows how a robots.txt file could look for a forum installed outside the root, with some added mods, some additional image directory changes, and the complete blocking of the user-agent Baiduspider.
If you don't cater to an audience in China, I recommend blocking Baiduspider entirely; otherwise it will hammer your site hundreds of times a day. For additional details on Baiduspider: http://chineseseoshifu.com/blog/what-is-baiduspider.html.
There are other mods for vB available for blocking user-agents; I recommend this one: https://vborg.vbsupport.ru/showthread.php?t=268208. In the meantime, Baiduspider is the only user-agent I block in robots.txt, since it is the only one I know of that does not ignore the robots.txt block (see the sketch after the example below for bots that do ignore it).
Sitemap: http://www.yoursite.com/sitemap_index.xml.gz
User-agent: Mediapartners-Google
Disallow:
User-agent: Baiduspider
Disallow: /
User-agent: *
Disallow: /doubleclick/
Disallow: /eyeblaster/
Disallow: /forum/archive/
Disallow: /forum/clientscript/
Disallow: /forum/cpstyles/
Disallow: /forum/customavatars/
Disallow: /forum/customgroupicons/
Disallow: /forum/customprofilepics/
Disallow: /forum/customsignaturepics/
Disallow: /forum/dbtech/
Disallow: /forum/forumrunner/
Disallow: /forum/images/
Disallow: /forum/includes/
Disallow: /forum/install/
Disallow: /forum/members/
Disallow: /forum/mobiquo/
Disallow: /forum/sitemap/
Disallow: /forum/vbseo/
Disallow: /forum/ajax.php
Disallow: /forum/attachment.php
Disallow: /forum/calendar.php
Disallow: /forum/cron.php
Disallow: /forum/editpost.php
Disallow: /forum/global.php
Disallow: /forum/image.php
Disallow: /forum/inlinemod.php
Disallow: /forum/joinrequests.php
Disallow: /forum/login.php
Disallow: /forum/member.php
Disallow: /forum/memberlist.php
Disallow: /forum/misc.php
Disallow: /forum/moderator.php
Disallow: /forum/newattachment.php
Disallow: /forum/newreply.php
Disallow: /forum/newthread.php
Disallow: /forum/online.php
Disallow: /forum/poll.php
Disallow: /forum/postings.php
Disallow: /forum/printthread.php
Disallow: /forum/private.php
Disallow: /forum/profile.php
Disallow: /forum/register.php
Disallow: /forum/report.php
Disallow: /forum/reputation.php
Disallow: /forum/search.php
Disallow: /forum/sendmessage.php
Disallow: /forum/subscription.php
Disallow: /forum/threadrate.php
Disallow: /forum/usercp.php
Disallow: /forum/usernote.php
Disallow: /javascript/
Disallow: /misc/
Disallow: /styles/
Disallow: /xcache/
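For user-agents that ignore robots.txt entirely, blocking has to be done at the server level instead. Here is a minimal .htaccess sketch using Apache 2.2-style access control ("BadBot" is only a placeholder string; substitute the actual agents you see in your logs):
SetEnvIfNoCase User-Agent "BadBot" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
The vB mod linked above does the same job from within the forum itself, so use whichever you find easier to maintain.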
crazyboy1661
09-28-2013, 06:42 AM
Hi bzcomputers, now I get it. You have shared very important information; it is really worth reading and following.
Thanks a lot.
As you specified in your second modified example, i.e. Sitemap: http://www.yoursite.com/sitemap_index.xml.gz, I have added that line to my robots.txt file. But when I point to that URL it shows an error.
Error loading stylesheet: An unknown error has occurred (805303f4)http://telugudosti.com/vbseo_sitemap/sitemap.xsl
I didn't find any file with the .xml.gz extension in the root directory.
I found something like the one below in that path:
store_sitemap/vbulletin_sitemap_thread_0.xml.gz
So I have added that to my robots.txt, but I don't know whether it is correct or not. I need your advice on this, please.
Thanks.
Sorry, I am confused.
bzcomputers
10-01-2013, 12:45 PM
When using the vbSEO Sitemap you must have some information included in your .htaccess file. This information keeps your vbSEO sitemap-generation files secure while still allowing public access to the sitemaps themselves.
If you already have an .htaccess file in your root, you need to add this:
RewriteRule ^((urllist|sitemap_).*\.(xml|txt)(\.gz)?)$ vbseo_sitemap/vbseo_getsitemap.php?sitemap=$1 [L]
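With that rule in place the sitemap does not need to exist as a physical file in your root. For example (using the placeholder name from above; yours may differ), a request for
http://www.yoursite.com/sitemap_index.xml.gz
is internally handed to
vbseo_sitemap/vbseo_getsitemap.php?sitemap=sitemap_index.xml.gz
which returns the sitemap data, so you will not find a .xml.gz file sitting in the root directory.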
If you do not have an .htaccess file in your root:
RewriteEngine On
RewriteRule ^((urllist|sitemap_).*\.(xml|txt)(\.gz)?)$ vbseo_sitemap/vbseo_getsitemap.php?sitemap=$1 [L]
I also notice on your site that you do not force or remove a URL prefix (the www.). There are some benefits to forcing the prefix one way or the other, including a major SEO-related one. Forcing a prefix (one way or the other) will stop search engines from splitting the weight of your URLs. Right now your site is being indexed as both http://telugudosti.com/ and http://www.telugudosti.com/, and in turn every URL beneath it can also be indexed both with www. and without it. This can cause the weight of your URLs to be split between the two addresses, making their overall page ranking much lower. Forcing the prefix one way or the other will eliminate this, and in a short time your duplicate listings will be merged within the search engines, creating a single listing for each page and in turn the much higher ranking it deserves.
Although there are many opinions on which way to go (with or without the www.), it is really up to you, but you should definitely go one way or the other. To do this you will again be editing your .htaccess file by adding one of the following:
To force with www.:
RewriteCond %{HTTP_HOST} ^telugudosti.com$
RewriteRule (.*) http://www.telugudosti.com/$1 [R=301]
To force without www.:
RewriteCond %{HTTP_HOST} ^www.telugudosti.com$
RewriteRule (.*) http://telugudosti.com/$1 [R=301]
So if you didn't have an .htaccess file prior, it would now look like this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^telugudosti.com$
RewriteRule (.*) http://www.telugudosti.com/$1 [R=301]
RewriteRule ^((urllist|sitemap_).*\.(xml|txt)(\.gz)?)$ vbseo_sitemap/vbseo_getsitemap.php?sitemap=$1 [L]
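Or, if you chose to force without the www., the complete file (assembled from the same pieces above) would look like:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.telugudosti.com$
RewriteRule (.*) http://telugudosti.com/$1 [R=301]
RewriteRule ^((urllist|sitemap_).*\.(xml|txt)(\.gz)?)$ vbseo_sitemap/vbseo_getsitemap.php?sitemap=$1 [L]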
...
crazyboy1661
10-02-2013, 06:19 AM
bzcomputers, thanks for your information. I just sent a PM to you; please find the details there.