Modification description
This modification allows you to set view permissions for the spiders/bots on your website, or to place them in a usergroup of your choice.
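For anyone wondering how a mod like this typically recognizes spiders in the first place, here is a minimal PHP sketch assuming a simple user-agent substring match; the spider names, usergroup IDs, and function name are all made up for illustration and are not the mod's actual code:

[code]
<?php
// Illustrative sketch only -- NOT the mod's actual code.
// Map known spider user-agent substrings to a usergroup ID.
// The substrings and the usergroup ID below are placeholders.
$spider_map = array(
    'Googlebot' => 11, // hypothetical "Search Bots" usergroup
    'Slurp'     => 11, // Yahoo! Slurp
    'bingbot'   => 11,
);

function detect_spider_usergroup($user_agent, $spider_map)
{
    foreach ($spider_map as $ident => $usergroup_id)
    {
        // Case-insensitive substring match against the UA string
        if (stripos($user_agent, $ident) !== false)
        {
            return $usergroup_id;
        }
    }
    return 0; // not a known spider; treat as a normal guest
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$usergroup_id = detect_spider_usergroup($ua, $spider_map);
[/code]

Keep in mind that a user-agent string is trivially spoofed, so a check like this should never be the only thing guarding truly private content.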
I know phpBB has the ability to define different permissions for bots and guests, and I've never heard of a phpBB forum being banned by Google yet.
Cutts explained that when Google's crawlers visited a BMW page, it saw blocks of text with repeated key search words such as "neuwagen," which means "new car" in German. However, when a user visited the listed page they would be automatically redirected to another page with less text and more pictures, which was more attractive than the page the crawler saw, but would have scored lower in Google's PageRank system.
"This is a violation of our Webmaster quality guidelines, specifically the principle of 'Don't deceive your users or present different content to search engines than you display to users,'" Cutts' blog said.
That supports the interpretation that this mod is OK.
It seems obvious to me that Google's intent is to stop people from misleading search engines with content that doesn't exist, for example when clicking a Google result takes you to a malicious website rather than the site and content you expected to see.
Locking content behind a username/password is neither changed, malicious, nor misleading content.
As was mentioned earlier in this thread, newspapers and television stations regularly do this. A news story will be visible in a search result, but when you click the link to the website, you can't see it unless you register and log in.
That said, I have a question about this mod.
My website requires users to log in. This mod allows you to assign search bots to a usergroup, but does it bypass the login requirement? In other words, does it show bots private forums without them logging in?
1. ACP -> vBulletin Options -> KX - Spider -> Select yes on view threads.
2. ACP -> Style Manager -> Your Template -> BB Code Layout -> bbcode_code

You can also accomplish this change by editing "/includes/xml/spiders_vbulletin.xml".
In my topics, above the links, a "Check Download Links" hyperlink is shown. But when I click "Check Download Links", the new page (on your website) doesn't check any of the links in my topic.
Why?
Please help.
Google specifically asks webmasters to give Googlebot access to restricted pages; I have seen this on various Google webmaster blogs and webmaster help pages. However, you must show visitors the whole page/thread they click through to. Any other pages may remain restricted.
Webmasters wishing to implement First Click Free should follow these guidelines:
All users who click a Google search result to arrive at your site should be allowed to see the full text of the content they're trying to access.
The page displayed to all users who visit from Google must be identical to the content that is shown to Googlebot.
If a user clicks to a multi-page article, the user must be able to view the entire article. To allow this, you could display all of the content on a single page—you would need to do this for both Googlebot and for users. Alternately, you could use cookies to make sure that a user can visit each page of a multi-page article before being asked for registration or payment.
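To make the guidelines above concrete, here is a rough First Click Free sketch in PHP. It is not code from this mod or from Google; is_googlebot(), user_is_logged_in(), and redirect_to_login() are hypothetical helpers, and the referrer check is deliberately crude:

[code]
<?php
// Rough First Click Free sketch (illustrative only).
// Googlebot and visitors arriving from a Google result see the full
// content; everyone else hits the normal login/registration wall.

function arrived_from_google()
{
    $referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
    $host = parse_url($referer, PHP_URL_HOST);
    // Crude check: referrer host contains "google." -- a real
    // implementation should validate the host more carefully.
    return !empty($host) && stripos($host, 'google.') !== false;
}

function is_googlebot()
{
    // Hypothetical helper. The user-agent string is trivially spoofed,
    // so a real check should confirm it with a reverse-DNS lookup.
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    return stripos($ua, 'Googlebot') !== false;
}

// Per the guidelines above, Googlebot and a visitor clicking through
// from Google must be shown identical content.
if (!is_googlebot() && !arrived_from_google() && !user_is_logged_in())
{
    redirect_to_login(); // placeholder for the forum's own login redirect
}
[/code]

On a real site you would also set a cookie on that first click so the visitor can page through a multi-page article before hitting the registration wall, as the last guideline describes.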
First Click Free is also what webmasterworld.com is using.
I hope this clears it up.
Ziki, would you consider adding First Click Free functionality?