vb.org Archive

vb.org Archive (https://vborg.vbsupport.ru/index.php)
-   vBulletin 3.5 Add-ons (https://vborg.vbsupport.ru/forumdisplay.php?f=113)
-   -   Google sitemap for the vB Archives. Redirect human and robots. (https://vborg.vbsupport.ru/showthread.php?t=93980)

D|ver 10-27-2005 10:14 PM

I have a small question:
is it possible to add additional pages to the sitemap?
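For example, could I hand the generator a few extra URLs of my own, so the sitemap ends up with additional entries? Just to illustrate what I mean, with a made-up page (this is only the standard sitemap entry format, not something the hack does already):

Code:

<url>
  <loc>http://www.example.com/faq.php</loc>
  <changefreq>weekly</changefreq>
  <priority>0.5</priority>
</url>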

buro9 10-28-2005 01:07 PM

I have a feature request... to dump a single text file, gzipped, of ALL of the URLs that go into the various sitemaps.

Basically, while we're in the loop creating the various sitemaps, we could additionally write a text file with just one full URL per line.

This would also be useful for Yahoo and other spiders. Yahoo specifically asks for such a file on their submit page:
http://submit.search.yahoo.com/free/request
Quote:

You can also provide the location of a text file containing a list of URLs, one URL per line, say urllist.txt. We also recognize compressed versions of the file, say urllist.gz.
And to me, the loop that creates the Google sitemap seems like the perfect low-overhead place to also dump the archive URLs into a text file for Yahoo and other spiders to feed from.
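Something along these lines is what I have in mind. This is only a rough sketch of the idea, since I'm guessing at how the sitemap loop holds its URLs; the $archive_urls array and the file names are made up for illustration:

Code:

<?php
// Rough sketch only: assume the sitemap generator already has the full
// list of archive URLs in an array ($archive_urls is made up here).
$archive_urls = array(
    'http://www.example.com/archive/index.php/t-1.html',
    'http://www.example.com/archive/index.php/t-2.html',
);

// Plain text file, one full URL per line, as Yahoo's submit page asks for.
$plain = fopen('./urllist.txt', 'w');
foreach ($archive_urls as $url)
{
    fwrite($plain, $url . "\n");
}
fclose($plain);

// And a gzipped copy (urllist.gz) for spiders that accept compressed lists.
$gz = gzopen('./urllist.gz', 'w9');
gzwrite($gz, implode("\n", $archive_urls) . "\n");
gzclose($gz);
?>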

:Judge: 10-28-2005 07:58 PM

Don't know but maybe this is over my head here :p

I went through and made changes 1 and 2, and that is all, but I have no idea how to check whether it is working.

I have no errors so that is a plus.

Do I have to sign up for Google Sitemap? - Forget I said that.

dutchbb 10-29-2005 09:52 AM

Just a warning here... I was reading about SEO, and step 3 might not be such a good idea. This is called cloaking, and it is a black-hat technique. It is clearly stated in the Google quality guidelines as a forbidden SEO practice.

Actually, I found out after installing this that my homepage dropped four pages in the rankings for a very important keyword. I didn't do anything else that could be suspicious, so I removed this!

The sitemap is good, but optimizing pages just for search engines and making them look different from what your human visitors see is NOT recommended, and you run a high risk of being penalized by Google or other search engines.

falter 10-29-2005 11:13 PM

Quote:

Originally Posted by Triple_T
Just a warning here... I was reading about SEO, and step 3 might not be such a good idea. This is called cloaking, and it is a black-hat technique. It is clearly stated in the Google quality guidelines as a forbidden SEO practice.

Actually, I found out after installing this that my homepage dropped four pages in the rankings for a very important keyword. I didn't do anything else that could be suspicious, so I removed this!

The sitemap is good, but optimizing pages just for search engines and making them look different from what your human visitors see is NOT recommended, and you run a high risk of being penalized by Google or other search engines.

If you implemented the robots.txt that is suggested, that is most likely the cause of your PR drop, not the cloaking.

There are many, many things you can do that are considered cloaking. I don't think Google would flip out over this, seeing as the content provided to the search engine spider is the same content provided to the human user. An example of abusive cloaking is where, say, a completely different set of content is given to the spider than is given to the human.

I'm going to PubCon (http://www.pubcon.com) in a couple of weeks; I'll ask around to see what some SEOs think about what we're doing here. I can even ask the guys at Yahoo and Google. Personally, I think it's fine, since the same core content is being given to the search engines and to humans.
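For reference, the sort of thing we're debating boils down to a user-agent test. The sketch below is only my illustration of the idea, not the mod's actual code; the spider names, the redirect direction, and the archive URL format are my own guesses:

Code:

<?php
// Illustration only: send known spiders asking for a showthread.php URL
// to the static archive copy of that thread, and leave humans alone.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
$is_spider = (strpos($ua, 'googlebot') !== false
           || strpos($ua, 'slurp') !== false      // Yahoo
           || strpos($ua, 'msnbot') !== false);

if ($is_spider && isset($_GET['t']))
{
    // The archive URL format here is assumed for the sake of the example.
    header('Location: /archive/index.php/t-' . intval($_GET['t']) . '.html', true, 301);
    exit;
}
?>

The point being that both the spider and the human end up reading the same thread content, just through different URLs.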

eoc_Jason 11-01-2005 04:23 PM

The problem is that what you think is the "same content" is different from what a spider thinks is the same. Yes, cloaking is a serious issue, and search engines do penalize sites for it. Some search engines (like Google) have spiders that identify themselves as a regular web browser so the results can be compared against the actual spider's results. If they don't match, well, you get the idea.

I expanded my robots.txt file to exclude a lot of the links that are listed in the notes, and I use the generator script to make the XML files for Google, but that's it. I do not believe in trying to redirect bots or users to different pages; that will only end with bad things happening.
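For anyone curious, my robots.txt is roughly along these lines. This is just a trimmed example of the sort of thing I mean, not a copy of the list from the notes, so adjust the paths to your own setup:

Code:

User-agent: *
# Keep spiders out of the duplicate/low-value views of the same content
Disallow: /printthread.php
Disallow: /showpost.php
Disallow: /sendmessage.php
Disallow: /memberlist.php
Disallow: /calendar.php
Disallow: /search.php
Disallow: /newreply.php
Disallow: /newthread.php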

lierduh 11-04-2005 10:20 PM

Whether this counts as cloaking is a long-debated topic. The general advice from the experts is not to cloak, because of the risk involved. I would say do not install step 3 if you are concerned about this.

However, nowadays many major sites use cloaking, including Amazon and Google itself. Believe it or not, vBulletin also uses cloaking!

eoc_Jason 11-07-2005 04:01 PM

Yes, but Amazon is a much more reputable site than, say, Joe Bob's Bait Shack... Plus, companies like that work directly with Google to enhance features for both sites.

falter 11-08-2005 09:27 PM

Well! I've backed out the cloaking after the number of my indexed pages on Google went from over 40,000 to just over 800. I'm assuming that we got penalized in some form. My PR is still a 5, but that doesn't mean much of anything at all.

I can honestly say that my opinion is reversed on the cloaking side of things. I do not recommend implementing step 3.

Citizen 11-08-2005 11:22 PM

What exactly was step 3 of the hack? I looked over the installation instructions and didn't see a "step 3".

