Version: 1.00, by vbenhancer
Developer Last Online: Nov 2012
Category: Miscellaneous Hacks
vB Version: 3.8.x
Released: 03-04-2012
Last Update: Never
Installs: 29
DB Changes, Uses Plugins, Additional Files
No support by the author.
A bundle of features for managing spider access: hits tracking, a complete info listing, a specific usergroup for spiders, etc.
Overall it's a merge of some small add-ons I wrote in the past, and as I did not want to release a ton of minimal tools that just fit together, I made a real bundle, 4 or 5 tools together, with activation and permission settings where needed.
Spiders List: a little spider tracker for your forum. It does not track each page the engine is viewing, because that would be pointless. Instead, it lists the names of the spiders that visit your site, the date of their last visit, the number of unique visits, and the number of pages viewed. That information is not very important for the indexing of your site, but it helps you see why your site may be busy or not. You can then take action if a crawler keeps visiting while still giving no results on search engines.
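To give an idea of what gets stored, here is a minimal standalone sketch of the kind of logging involved (this is NOT the mod's actual code; the spider list, the DB credentials and the spider_stats table are invented for the example):

<?php
// Minimal sketch, not the mod's code: log one page hit per known crawler.
// Assumed table:
//   CREATE TABLE spider_stats (
//       spidername VARCHAR(50) PRIMARY KEY,
//       lastvisit  INT UNSIGNED,   -- unix timestamp of the last visit
//       visits     INT UNSIGNED,   -- unique visits
//       pageviews  INT UNSIGNED    -- total pages viewed
//   );
$spiders = array(
    'Googlebot' => 'Google',
    'bingbot'   => 'Bing',
    'MJ12bot'   => 'Majestic MJ12',
);

$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

foreach ($spiders as $needle => $name)
{
    if (stripos($agent, $needle) !== false)
    {
        $db   = new mysqli('localhost', 'user', 'pass', 'forum');
        $stmt = $db->prepare(
            "INSERT INTO spider_stats (spidername, lastvisit, visits, pageviews)
             VALUES (?, ?, 1, 1)
             ON DUPLICATE KEY UPDATE lastvisit = VALUES(lastvisit),
                                     pageviews = pageviews + 1"
        );
        $now = time();
        $stmt->bind_param('si', $name, $now);
        $stmt->execute();
        // Counting a *unique* visit (a new session, not just another page
        // view) needs session tracking, which is omitted in this sketch.
        break;
    }
}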
Specific Usergroup for Spiders: I released this add-on on vb.org a long time ago, and it was copied around in source form, but this version is updated and has more flexibility. You simply choose the proper usergroup in the settings, so when a spider/crawler visits your site it is treated as having that usergroup's permissions... it's useful if you do not want to fill your robots.txt file with strange access blocks. This lets you give crawlers access to profiles but not to visitor messages, etc.
Also remember to follow the TOS of the search engines you are registered with. Until recently, Google was blocking sites that were cloaking their content (serving crawlers something different from what visitors see).
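To illustrate the usergroup mechanism, here is a minimal sketch of what a vBulletin 3.8 plugin on the global_start hook could look like (again, NOT the bundle's actual code: the setting name spiders_usergroupid is invented for the example, and plugin code is entered without <?php tags):

// Sketch of the idea only, not the mod's real plugin.
// 'spiders_usergroupid' is a hypothetical setting holding the chosen usergroup id.
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if ($vbulletin->userinfo['userid'] == 0 AND $agent !== '')
{
    foreach (array('Googlebot', 'bingbot', 'MJ12bot') as $needle)
    {
        if (stripos($agent, $needle) !== false)
        {
            // Treat the crawler as a member of the configured usergroup so the
            // normal vB permission system applies instead of robots.txt blocks.
            $vbulletin->userinfo['usergroupid'] = $vbulletin->options['spiders_usergroupid'];
            cache_permissions($vbulletin->userinfo, false); // recompute permissions (vB 3.x core function)
            break;
        }
    }
}

The point is simply that once the usergroup id is swapped in, every forum and profile permission you set for that usergroup applies to the crawler.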
Display Spiders in WOL: and on any page showing "Currently Active Users" (showthread, forumdisplay, etc.)... that way, you see where these beasts are visiting.
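For the curious, vBulletin already keeps the raw material for this in its session table, which stores the user agent and current location of guests; roughly, the feature boils down to something like this sketch (table prefix omitted, crude LIKE matching, just an illustration):

<?php
// Sketch: list bot-like guest sessions and where they are right now.
$db  = new mysqli('localhost', 'user', 'pass', 'forum');
$res = $db->query(
    "SELECT useragent, location, lastactivity
     FROM session
     WHERE userid = 0 AND useragent LIKE '%bot%'
     ORDER BY lastactivity DESC"
);
while ($row = $res->fetch_assoc())
{
    echo $row['useragent'] . ' is at ' . $row['location'] . "\n";
}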
... some other tools may still join the bundle; I'll see later!
CRON JOB:
to make it easier on the server, there is a cron job storing the hourly stats about the crawlers... once the cron job has run once (it's the cron named Hourly #1), the stats appear in the right place.
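Roughly, an hourly rollup of this kind is what such a cron job amounts to (a sketch under invented table names spider_hits and spider_stats_hourly, not the shipped cron file):

<?php
// Sketch of an hourly rollup cron, not the mod's real schema.
// Aggregating raw hits into one row per spider per hour keeps the
// hit table small, so tracking stays cheap for the server.
$db = new mysqli('localhost', 'user', 'pass', 'forum');

$end   = time();
$start = $end - 3600;

$db->query(
    "INSERT INTO spider_stats_hourly (spidername, hourstart, hits)
     SELECT spidername, $start, COUNT(*)
     FROM spider_hits
     WHERE dateline >= $start AND dateline < $end
     GROUP BY spidername"
);

// Raw rows are no longer needed once they have been rolled up.
$db->query("DELETE FROM spider_hits WHERE dateline < $end");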
...update (May 1st, 10:50): a small change, the crawlers listing will now refresh the cached spiders list if the file has changed, so you can update it when needed.
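The idea behind that change, roughly sketched (includes/xml/spiders_vbulletin.xml is the spider list file vBulletin 3.8 ships; the cache file name and the simplified XML parsing are assumptions for the example):

<?php
// Sketch: rebuild the cached spider list only when the source file changed.
$source = 'includes/xml/spiders_vbulletin.xml';
$cache  = 'includes/datastore/spiders_cache.php';

if (!file_exists($cache) OR filemtime($source) > filemtime($cache))
{
    $list = array();
    $xml  = simplexml_load_file($source);
    // The spider file identifies each spider by a user-agent substring ("ident").
    foreach ($xml->xpath('//spider') as $spider)
    {
        $list[(string) $spider['ident']] = (string) $spider->name;
    }
    file_put_contents($cache, '<?php return ' . var_export($list, true) . ';');
}
$spiders = include $cache;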
someone told me that vbSEO Googlemap was doing something similar... hum... yeah, similar: it tracks 3 web crawlers, and you have to create new entries by hand for each additional crawler you want to track...
this engine is compatible with Paul M's Guest Tracking... it's not doing exactly the same thing, but you know, you choose... this engine does not track guest activity, just crawler page hits (adding one small query per page, but useful when you REALLY need to know what web crawlers are doing on your site). It is really good for site owners showing potential for paid ads!
Also, there will be a newer version with more settings soon... permissions and blocking, etc.
...
as a success story, I must say I had a great visitor this week that I would never have been able to track without this engine... the Majestic "MJ12bot" crawler made 20 times the hits on my site that Google managed in months... I checked their site, and it was obvious they were trying to leech the site, not crawl it...