All 3 steps installed :)
and all working fine (hopefully!!!). Thanks for the hack :)
I've been getting a few PHP errors in my error log:
PHP Fatal error: Class 'vBulletinHook' not found in /home/trevor/public_html/forums/includes/functions.php on line 4322
I think it's referring to this bit of code: Code:
function exec_header_redirect($url)
I'm checking my logs now to see if it's not working the other way around. Any ideas as to what's causing this error?
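For what it's worth, that fatal error usually means a hook was fired before vBulletin's hook class was loaded. A minimal sketch of a defensive guard, assuming the stock vB 3.x layout (class_hook.php, the DIR constant) and a made-up hook name - this is an illustration, not the hack author's official fix: Code:
// Sketch only: load the hook class if the entry point skipped it.
// 'header_redirect' is a hypothetical hook name for illustration.
if (!class_exists('vBulletinHook'))
{
    require_once(DIR . '/includes/class_hook.php');
}
($hook = vBulletinHook::fetch_hook('header_redirect')) ? eval($hook) : false;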
I got this error now:
A Sitemap Index may not directly or indirectly reference itself. Please fix your Sitemap Index before resubmitting.
How does this Google Sitemap hack compare to the other one on vB?
What is the other one? (URL?)
I got something similar but couldn't see anything wrong with the sitemap... so I posted about it on Google Groups:
http://groups.google.com/group/googl...7e0aca343fb8ec
Similar error: Code:
Recursive Index
Just checked today and got the same error as you! Hmm...
When running sitemap.php I get this:
Script can only be run by vB Scheduled Tasks. Set $run_by_vb_Scheduled_Task_only to 0 to call this script directly.
I don't understand this bit. How do I change it to what it's asking? :)
Change:
$run_by_vb_Scheduled_Task_only = 1;
to:
$run_by_vb_Scheduled_Task_only = 0;
Now you (and everyone else) can run the script directly. You can also leave it at 1 and run this script in your scheduled tasks manager; that way, you are the only person who can run it.
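To make the behaviour concrete, here is a rough sketch of what such a guard typically looks like at the top of sitemap.php. The variable name comes from the error message above; how the script detects a scheduled-task run is an assumption, not the hack's actual code: Code:
// Sketch, not the hack's real code: 1 restricts the script to the
// scheduled task manager, 0 allows calling it directly.
$run_by_vb_Scheduled_Task_only = 1;

// Assumed detection: vBulletin's cron wrapper defines its environment
// before including the script, so a direct call lacks the constant.
if ($run_by_vb_Scheduled_Task_only AND !defined('VB_AREA'))
{
    die('Script can only be run by vB Scheduled Tasks. Set $run_by_vb_Scheduled_Task_only to 0 to call this script directly.');
}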
BTW the beta script is working fine AFAICS
Where is the link to the beta script? Is it buried on one of the many pages?
Yeah, the beta script is a few pages back.
Those wanting better bot detection may want to try the mod I recommend in this post:
http://www.vbulletin.com/forum/showt...396#post993396
This will make use of the spiders XML file that so many people work so hard on.
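In case it helps to see the idea, here is a rough sketch of matching the user agent against the idents in vBulletin's spiders XML file. The file path, the ident attribute, and the function name are assumptions based on a stock vB 3.x install, not code from the linked mod: Code:
// Sketch: treat a visitor as a spider if their user agent matches one
// of the idents listed in includes/xml/spiders_vbulletin.xml.
function is_known_spider($useragent)
{
    $xml = @file_get_contents(DIR . '/includes/xml/spiders_vbulletin.xml');
    if ($xml === false)
    {
        return false;
    }
    // Pull out every ident="..." attribute rather than fully parsing the XML.
    preg_match_all('#ident="([^"]+)"#i', $xml, $matches);
    foreach ($matches[1] AS $ident)
    {
        if (strpos(strtolower($useragent), strtolower($ident)) !== false)
        {
            return true;
        }
    }
    return false;
}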
I've written the instructions for step 3 in a step-by-step txt file.
This is an alternative to the coloured diff. It should only be used on untouched index.php and global.php files.
Lierduh,
Is it possible to exclude forums from the sitemap? I don't want my chat section listed too heavily on Google, for instance. Thanks
Hi,
Does anybody know how to set the permissions on a Windows server running IIS? It would of course be easy enough to give access to internet_user, but can you grant the permission only for this job?
/Peter
OK, it's probably a stupid thing, but all the spiders view the archives like this:
/forums/archive/index.php/t-1192.html
when it should be
/forums/archive/index.php?t-1192.html
chatbum - actually the proper way is the first one you showed, with slashes only. The reason is that it emulates directories and doesn't use a query string (making it look more static to spiders).
The reason your server might be using the query-string method is that it doesn't support the first method. (Check your /archive/global.php file where it checks for SLASH_METHOD.) I *thought* he implemented the check and both options in the archive generation code, but maybe it might not be working properly. *shrug*
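Roughly, the check being described works like this. This is only an illustration of the two URL styles; the real /archive/global.php code differs, and $archive_base / $threadid are stand-in names: Code:
// Illustration only: how the archive might build a thread link
// depending on whether SLASH_METHOD is available on the server.
if (SLASH_METHOD)
{
    // Emulated directories: /forums/archive/index.php/t-1192.html
    $url = $archive_base . '/index.php/t-' . $threadid . '.html';
}
else
{
    // Query-string fallback: /forums/archive/index.php?t-1192.html
    $url = $archive_base . '/index.php?t-' . $threadid . '.html';
}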
Any idea on when this will get another update? It works for the most part, but that recursive index problem keeps happening.. :(
I don't know what the reason is; I do not have this problem so far. Two things to try:
1) Have you deleted the old sitemap entry from your Google Sitemaps account?
2) Try changing the sitemap index file name.
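For reference, a sitemap index is only valid if every <loc> points at a child sitemap file, never back at the index itself. A sketch of a correct one (the URLs are examples; the namespace shown is the current sitemaps.org one, while the original Google Sitemaps beta used a google.com/schemas namespace): Code:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <sitemap>
      <loc>http://www.example.com/forums/sitemap_1.xml.gz</loc>
   </sitemap>
   <sitemap>
      <loc>http://www.example.com/forums/sitemap_2.xml.gz</loc>
   </sitemap>
   <!-- The "recursive index" error means a <loc> here pointed back at
        the index file itself, e.g. .../sitemap_index.xml.gz -->
</sitemapindex>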
I understand that this is a very personal request, though, so please contact me if you are willing to do this as a paid service. Thank you.
Well - I have everything done except step 3. I am completely lost in editing my global.php and index.php files. Can anyone DO them for me or something? I would like to get this done ASAP if at all possible. It would be hugely appreciated.
I posted a step-by-step instruction here. Still problems? Send me your MSN and I'll help you out.
I have a small question:
is it possible to add additional pages to the sitemap?
I have a feature request: to dump a single gzipped text file of ALL the URLs that go into the various sitemaps.
Basically, while we're in the loop creating the various sitemaps, additionally write a text file with just one full URL per line. This would also be good for Yahoo and other spiders; Yahoo specifically asks for such a thing on their submit page: http://submit.search.yahoo.com/free/request
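A sketch of what that could look like inside the generator's loop, using PHP's zlib functions; $urls and the output file name are stand-ins, not the hack's actual variables: Code:
// Sketch of the requested feature: alongside the XML sitemaps, write a
// gzipped plain-text list with one full URL per line for Yahoo et al.
$gz = gzopen('urllist.txt.gz', 'w9');
foreach ($urls AS $url)
{
    gzwrite($gz, $url . "\n");
}
gzclose($gz);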
Don't know, but maybe this is over my head here :p
I went through and made changes 1 and 2, and that is all, and I have no idea how to check and see if it is working. I have no errors, so that is a plus. Do I have to sign up for Google Sitemaps? - Forget I said that.
Just a warning here... I was reading about SEO, and step 3 might not be such a good idea. This is called cloaking, and it is a black-hat technique. It is clearly stated in the Google quality guidelines as a forbidden way to do SEO.
Actually, I found out that after installing this, my homepage's ranking dropped four pages for a very important keyword. I didn't do anything else that could be suspicious, so I removed this! The sitemap is good, but optimizing pages just for search engines and making them look different from what your human visitors see is NOT recommended, and you take a high risk of being penalized by Google or another SE.
There are many, many things that you can do that are considered cloaking. I don't think Google would flip out over this, seeing as the content that is provided to the search engine spider is the same content that is provided to the human user. An example of an abuse of cloaking is where, say, a completely different set of content is given to the spider than that which is given to the human. I'm going to PubCon (http://www.pubcon.com) in a couple of weeks; I'll ask around to see what some SEOs think about what we're doing here. I can even ask the guys at Yahoo and Google. Personally, I think it's fine, since the same core content is being given to the search engines and humans.
The problem is, what you think is the "same content" is different from what a spider thinks is the same. Yes, cloaking is a serious issue, and search engines do penalize sites for it. Some search engines (like Google) have spiders that look like a regular web browser so that they can compare those results against the actual spider results. If they don't match, well, you get the idea.
I expanded my robots.txt file to exclude a lot of the links that are listed in the notes. And I use the generator script to make the XML files for Google, but that's it. I do not believe in trying to redirect bots or users to various pages; that will only end up with bad things happening.
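As an illustration of the robots.txt approach mentioned above - the post doesn't list the specific exclusions, so these are just typical vBulletin scripts one might block, assuming the forum lives under /forums/: Code:
User-agent: *
# Keep spiders out of redundant/action-only vBulletin views instead of cloaking
Disallow: /forums/member.php
Disallow: /forums/calendar.php
Disallow: /forums/printthread.php
Disallow: /forums/sendmessage.php
Disallow: /forums/newreply.php
Disallow: /forums/newthread.php
Disallow: /forums/misc.php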
Whether or not this counts as cloaking is a long-debated topic. The general advice from the experts is not to cloak, due to the risk it involves. I would say do not install step 3 if you are concerned about this.
However, nowadays many major sites use cloaking, including Amazon and Google itself. Believe it or not, vBulletin also uses cloaking!
Yes, but Amazon is a much more reputable site than, say, Joe Bob's Bait Shack... Plus, companies like that work directly with Google to enhance features for both sites.
Well! I've backed out the cloaking after the number of my indexed pages on Google went from >40,000 to just over 800. I'm assuming that we got penalized in some form. My PR is still a 5, but that doesn't mean much of anything at all.
I can honestly say that my opinion is reversed on the cloaking side of things. I do not recommend implementing step 3.
What exactly was step 3 of the hack? I looked over the installation and didn't see a "step 3".