The Archive of Official vBulletin Modifications Site. It is not a VB3 engine, just a parsed copy!
Google sitemap for the vB Archives. Redirects humans and robots.
Developer Last Online: Nov 2023
Release V1.2 (9 Nov 2005)
* A higher sitemap priority is given to threads with new posts, so Google can index fresh threads first.
* The original optional STEP 3 hack is no longer recommended. To avoid a potential Google penalty, my advice is to remove the STEP 3 hack.

Release V1.1a (12 Oct 2005)
* Bug fix only.

Release V1.1 (9 Oct 2005)
* Can handle very large forums with more than 50,000 URLs per forum. URLs will be spanned across multiple files for each large forum.
* Created a function to detect search engine crawlers. The vB built-in search engine detector can only identify about 3 or 4 search engines; my function will detect over 20 search engine crawlers. (A minimal sketch of the idea appears after these release notes.)
* Support for forums hosted on web servers that do not support 'fix_pathinfo', i.e. instead of the usual 'archive/index.php/f-10.html' link, these forums have links like 'archive/index.php?f-10.html'.
* Alert about wrong directory permissions, to help newbies.
* Automatically write the index file to the archive directory if the PHP script cannot write into the base vB directory.
* Bug fixes.
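For illustration only -- this is not the hack's actual code -- a detector like the one described in the V1.1 note above can be built by matching the visitor's user agent against a list of known crawler signatures. The function name and the deliberately short signature list below are hypothetical; the real function reportedly recognizes over 20 crawlers.

<?php
// Hypothetical sketch of a user-agent based crawler detector (not the
// hack's real code). It checks the user agent string for substrings
// that identify well known search engine robots.
function is_search_engine_crawler()
{
    $agent = isset($_SERVER['HTTP_USER_AGENT'])
        ? strtolower($_SERVER['HTTP_USER_AGENT'])
        : '';

    // Short, incomplete sample of crawler signatures; the real hack
    // reportedly recognizes over 20 search engines.
    $signatures = array(
        'googlebot', 'slurp', 'msnbot', 'teoma', 'ia_archiver',
        'gigabot', 'scooter', 'fast-webcrawler', 'lycos', 'ask jeeves',
    );

    foreach ($signatures as $signature)
    {
        if (strpos($agent, $signature) !== false)
        {
            return true;   // looks like a known search engine crawler
        }
    }
    return false;          // treat everything else as a human visitor
}
?>

A function like this is what the STEP 2 plugin would consult before deciding whether to redirect a request to the archive.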
Objectives
==============

Generate Google sitemaps for the vB archive pages and redirect search engine robots from the dynamic thread pages to the static archive, so that more of your thread content gets indexed.

Q and A
==============

Q. Would the sitemap contain links for hidden forums?
A. No, forum permissions are consulted while generating the sitemap files.

Q. How often are the sitemap files generated?
A. You decide, and set it in the Scheduled Tasks. By default the script cannot be called by an external user, to prevent bored people from killing your server.

Q. Is the sitemap file compressed?
A. Yes, the multiple sitemap files are gzipped according to the Google sitemap standard to save bandwidth. The sitemap index file is not compressed; it is submitted as a normal XML file.

Q. Would the sitemaps include links for the normal threads? e.g. showthread.php?t=1234...
A. No. It is unlikely Google will index your entire site if you feed it every combination of showthread links. It is better to let Google go through the more static archive pages. You will have a much better chance of getting more thread content indexed by Google this way.

Q. Why don't you go crazy with rewrite rules and do things like including the thread title in the URL?
A. I won't deny that having keywords in the URL is a good SEO strategy, but Google also does not like "over search engine optimized" web sites. Google has recently penalized a huge number of such sites, sending them from a PageRank of 5 or 6 down to 0.

Q. Does a sitemap really help?
A. Definitely. Google has done over 60,000 pages since I submitted my sitemaps a few days ago. Yahoo bots were visiting more pages than Google before the sitemap; I expect total Google visits for this month to exceed Yahoo's within the next day or two.

What is involved?
==================

I have divided this hack into two steps. The first step involves uploading a PHP file; this enables the sitemap to be generated and submitted to Google. The second step involves installing a plugin using AdminCP; this sends all robots to the archive pages, preventing them from viewing the actual threads. For example, when Google or another crawler follows an external link to visit:

http://forums.mysite/showthread.php?t=1234&page=2

it will be told the page has been permanently relocated to:

http://forums.mysite/archive/index.php/t-1234-p-2

This way you do not lose the PageRank gained from external links. (A minimal sketch of this redirect appears after the Strategy section below.)

Install
=========

To install, follow the readme file. To let me know you have installed this, and so I can send update information to you, please click INSTALL.

Strategy
=========

It is unlikely Google or any other search engine will index your entire site, especially given the dynamic nature of vBulletin forums. An archive sitemap lets Google concentrate on the real content of your forums -- the threads. If Google has to go through the endless member profile pages, it will get sick of it and just become tired (sorry, perhaps robots cannot become tired). What we can do is disallow the crawling of unnecessary pages. My robots.txt contains:

#ALL BOTS
User-agent: *
Disallow: /admincp/
Disallow: /ajax.php
Disallow: /attachments/
Disallow: /clientscript/
Disallow: /cpstyles/
Disallow: /images/
Disallow: /includes/
Disallow: /install/
Disallow: /modcp/
Disallow: /subscriptions/
Disallow: /customavatars/
Disallow: /customprofilepics/
Disallow: /announcement.php
Disallow: /attachment.php
Disallow: /calendar.php
Disallow: /cron.php
Disallow: /editpost.php
Disallow: /external.php
Disallow: /faq.php
Disallow: /frm_attach
Disallow: /image.php
#Disallow: /index.php
Disallow: /inlinemod.php
Disallow: /joinrequests.php
Disallow: /login.php
Disallow: /member.php?
Disallow: /memberlist.php
Disallow: /misc.php
Disallow: /moderator.php
Disallow: /newattachment.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /online.php
Disallow: /payment_gateway.php
Disallow: /payments.php
Disallow: /poll.php
Disallow: /postings.php
Disallow: /printthread.php
Disallow: /private.php
Disallow: /profile.php
Disallow: /register.php
Disallow: /report.php
Disallow: /reputation.php
Disallow: /search.php
Disallow: /sendmessage.php
Disallow: /showgroups.php
Disallow: /showpost.php
Disallow: /subscription.php
Disallow: /usercp.php
Disallow: /threadrate.php
Disallow: /usernote.php

You may have noticed I included index.php in there. Apparently Google regards http://forums.mysite/index.html as the same as http://forums.mysite/ ...but treats http://forums.mysite/index.php as a different file. The default vB templates use index.php as the internal link, and that will split the PageRank of your home page across two URLs! So you are better off not letting Google see this file.

If you have mod_rewrite installed, you could add this to the .htaccess file:

RewriteCond %{QUERY_STRING} ^$
RewriteRule ^index.php$ / [R=301,L]

(If your forums are under http://site/forums/, try: RewriteRule ^forums/index.php$ forums/ [R=301,L])

That will redirect /index.php to /, but only if no query string is present, i.e. /index.php?do=mymod will not be redirected.
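For illustration only -- this is not the plugin's actual code -- the STEP 2 redirect described under "What is involved?" boils down to answering crawler requests for showthread.php with a 301 pointing at the matching archive URL. The hostname, the inline two-signature crawler check, and the exact archive URL format are assumptions taken from the examples above.

<?php
// Hypothetical sketch of the STEP 2 redirect idea (not the plugin's real
// code). When a known crawler requests showthread.php?t=...&page=..., send
// a permanent redirect to the equivalent archive page so the link value of
// external links is kept. A fuller detector, like the one sketched under
// the release notes, would normally replace the inline check below.
$agent      = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
$is_crawler = (strpos($agent, 'googlebot') !== false || strpos($agent, 'slurp') !== false);

if ($is_crawler && isset($_GET['t']))
{
    $threadid = intval($_GET['t']);
    $page     = isset($_GET['page']) ? intval($_GET['page']) : 1;

    // showthread.php?t=1234&page=2  ->  archive/index.php/t-1234-p-2
    $target = 'http://forums.mysite/archive/index.php/t-' . $threadid;
    if ($page > 1)
    {
        $target .= '-p-' . $page;
    }

    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $target);
    exit;
}
?>

Because the response is a permanent (301) redirect, search engines credit the archive page with the PageRank from the external link instead of the dynamic showthread URL.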
Show Your Support
=========
Comments
#122
Quote:
I don't really have the energy and time at the moment to write step-by-step instructions. I will be abroad in a few days' time, so I would rather spend the time on the actual code.
#123
I've managed to get this sort of working, but I don't understand STEP 3; it makes no sense whatsoever.
Anyway, it submitted my sitemap and I manually added it to Google so I could track it, BUT when I click Status in Google Sitemaps it shows this error:

Sitemap Errors
HTTP error: The server returned an error when we tried to access the URL provided. Please make sure the Sitemap URL is correct and resubmit your Sitemap.

What's happened here then?
#124
Hi lierduh,
I can't seem to open my /archive map anymore. It loads for a while and then it's a white screen forever. I noticed there are a lot of zip files there, so what's the deal? Do I need to delete or clean something out? Is it getting too big? I can't access it anymore.
#125
Quote:
Pink: removed from the old file
Yellow: changed lines
Green: added in the new file

This is very, very easy, by the way. What is the URL that Google said is wrong?
#126
Quote:
http://www.talk-365.com/forum/archiv...ms_sitemap.php I'll try what you said tonight.
#127
Quote:
#128
Quote:
#129
OK, I've uploaded the sitemap to the forums directory and submitted that to the Google Sitemaps admin to see if it returns anything. Does this mean the actual hack is failing then? Because it tells you to put the sitemap file in the archive folder, not in the forum folder?
I'm confused. Also, I've opened the files up in the browser for STEP 3, but what exactly do I do? It says the differences from V1.4 -- what is V1.4? Do I add the extra code etc. that is missing in 1.4 into my global files, or what? This makes no sense; I'm thick.
#130
Since upgrading I've had major problems with this. I'm running two forums and have had identical problems on each.
I test out the script and it generates the indexes correctly; however, somewhere down the road, a few crons later, the script somehow ends up generating a .gz for EVERY thread on my forum, causing my /archive directory to contain thousands of .gz files and my index file to be almost a megabyte. This seems to be the same problem that Triple_T is having. Any ideas?
#131
C'mon guys, a little support here. This hack is too good to uninstall!