vb.org Archive

vb.org Archive (https://vborg.vbsupport.ru/index.php)
-   vBulletin 3.5 Add-ons (https://vborg.vbsupport.ru/forumdisplay.php?f=113)
-   -   Google sitemap for the vB Archives. Redirect human and robots. (https://vborg.vbsupport.ru/showthread.php?t=93980)

PixelFx 10-06-2005 06:16 PM

I've tried to do the HTML compare, but it's much harder than just making a txt file saying remove here, add here, place under this, place over that, etc. As far as I know this hack is incomplete or just not designed for 3.5.0. Although I'm sure it's a good hack for RC3 users, on Gold I can't get it to work.

lierduh 10-06-2005 09:16 PM

Quote:

Originally Posted by PixelFx
Ok I know chmod, and set my archive/internal files to 777 .. I also did my root as 750 ..
any suggestions?

We are really having fun with permissions, aren't we?
Your permissions are wrong, and by the sound of it, you do not fully understand chmod.:) The good thing is at least you explained what you have done instead of saying: "I have the errors, tell me how to fix it".

Please list your directory permissions here (two #ls -l outputs). Let me guess: you changed the permissions for the files, not the DIRECTORIES as my instructions say. You said you changed your root? (I highly doubt it :)) Do you mean '/' or the base directory of your vB install? When you have 750, what is the ownership of the directory? User ownership and group ownership? It is very weird to set a web directory to 750; usually you would need at least 751, or 755.

=============
Instruction I will include in the next readme file:

The script will need to write files to two directories.

1) The base vB directory
2) the archive directory.

You will need to change the DIRECTORY permissions for these two directories.

Let's presume your directory structure is:

~/public_html
('~/' means it is under your user home directory; the actual directory should be something like: /home/your_isp_user_name/public_html)
~/public_html/showthread.php
...
~/public_html/archive/
~/public_html/archive/index.php
...

You need to do:

1) Change the permission for the vB base directory. So:
#chmod 777 public_html
(or #chmod 777 ~/public_html if you are not already in your home directory)

2) Change the permission for the archive directory. Do:
#chmod 777 public_html/archive
(or #chmod 777 ~/public_html/archive
or #chmod 777 archive
if you use some kind of web based control panel)
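For example, here is the whole sequence in one session, using a scratch /tmp/demo directory as a stand-in for your real account (replace the paths with your own):

```shell
# Scratch copy of the layout so the commands can be tried safely
# (/tmp/demo stands in for /home/your_isp_user_name):
mkdir -p /tmp/demo/public_html/archive

# 1) open the vB base DIRECTORY for writing
chmod 777 /tmp/demo/public_html

# 2) open the archive DIRECTORY for writing
chmod 777 /tmp/demo/public_html/archive

# Verify: both lines should start with drwxrwxrwx
ls -ld /tmp/demo/public_html /tmp/demo/public_html/archive
```

Note it is the two directories that change, not the .php files inside them.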

=============
I really wish someone could write better instructions for changing directory permissions for me. 50% of the problems using this hack are permission related.

lierduh 10-06-2005 09:38 PM

Quote:

Originally Posted by Triple_T
What weblog is there in vbulletin, if you mean currently active users: i don't see it there.

I meant the web server log. Please ask your host about it; ask them where you can access the web log for the web site they host.

PixelFx 10-07-2005 07:12 AM

I have a cPanel system, and was using WS_FTP to chmod the directories, as per your instructions. By root I meant public_html .. it's a cPanel permission thing, as the base permission. Anyway, I did what you asked in your instructions (the readme, not what's above), but for some reason it kept giving me grief. I've never had an issue with chmod until trying this script, so I figured it was an RC3 issue, not vb3.5.0 .. I'll give it another shot with your info above :)

.. in the instructions above, the only thing I missed was the permissions on /forum/ from how you explained it. I'll try again and post results.

My setup is public_html/forum/archive .. you're saying, in relation to my setup, to do /forum/ = 777 and /archive/ = 777, which is what I'll try next. Currently I'm in sleep-typing mode.

dutchbb 10-07-2005 10:55 AM

Quote:

Originally Posted by buro9
To those unsure about this hack, I would say persevere.

My number of Google spiders hasn't increased dramatically, but they're being far more efficient.

A month ago the number of pages I had in Google was only 66,000, now I have over 846,000 pages indexed: http://www.google.co.uk/search?q=site%3Awww.bowlie.com

It really is worth it, although the code edits in archive/index.php can be a bugger to get your head around at first.

I second that!

It's been a few days now, guess what

http://www.google.be/search?hl=nl&q=...e+zoeken&meta=

9,640 listed already, up from just 642 two days ago!!!

Man this hack rocks, you deserve to win hack of the month + hack of the year for all I care :D

ps: just need a better explanation for step 3, I couldn't do that one.

buro9 10-07-2005 12:28 PM

I have a problem though... you're making a sitemap gz for each forum, and some of my forums are big:
Quote:

Sitemap Errors

1. Too many URLs with Sitemap http://www.bowlie.com/forum/archive/sitemap_4.gz
* Your Sitemap contains too many URLs. Please create multiple Sitemaps with up to 50000 URLs each and submit all Sitemaps.
Could you add spanning?

So we'd start with:
http://www.bowlie.com/forum/archive/sitemap_4_1.gz

And when we passed an arbitrary value (make it a setting in the file in case Google change it later) we would move onto:
http://www.bowlie.com/forum/archive/sitemap_4_2.gz
http://www.bowlie.com/forum/archive/sitemap_4_3.gz
through
http://www.bowlie.com/forum/archive/...p_4_9999999.gz
etc

As it stands, Google is now refusing to pay attention to mine, as the one sitemap that exceeds the limit basically causes the whole thing to error.
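For reference, the usual way to split like this is a sitemap index file that lists the chunks; something along these lines (the file names just follow the pattern above):

```
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.bowlie.com/forum/archive/sitemap_4_1.gz</loc></sitemap>
  <sitemap><loc>http://www.bowlie.com/forum/archive/sitemap_4_2.gz</loc></sitemap>
</sitemapindex>
```

You'd submit the index once and Google would fetch each chunk from it.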

buro9 10-07-2005 12:30 PM

Quote:

Originally Posted by buro9
I have a problem though... you're making a sitemap gz for each forum, well, some of my forums are big:


Could you add spanning?

So we'd start with:
http://www.bowlie.com/forum/archive/sitemap_4_1.gz

And when we passed an arbitrary value (make it a setting in the file in case Google change it later) we would move onto:
http://www.bowlie.com/forum/archive/sitemap_4_2.gz
http://www.bowlie.com/forum/archive/sitemap_4_3.gz
through
http://www.bowlie.com/forum/archive/...p_4_9999999.gz
etc

As it stands, Google is now refusing to pay attention to mine, as the one sitemap that exceeds the limit basically causes the whole thing to error.

Oh, and mine contains 50,214 URLs ;) That one forum :D

avexzero 10-07-2005 05:06 PM

Does this work on 3.5.0 Gold?

Unreal Player 10-07-2005 05:20 PM

What are we looking for when this script runs? I see a lot of zip files or whatever in the archive section, but nothing in the other sections... is this right?

dutchbb 10-07-2005 05:29 PM

Quote:

Originally Posted by avexzero
Does this work on 3.5.0 Gold?

Yep, up and running.

lierduh 10-07-2005 11:58 PM

This is something I had in mind to implement.:) So the next version will certainly contain this feature.

I think I should be able to push a new version out this weekend, including better documentation for step 3. I have been waiting for vB Gold.

Quote:

Originally Posted by buro9
I have a problem though... you're making a sitemap gz for each forum, well, some of my forums are big:


Could you add spanning?

So we'd start with:
http://www.bowlie.com/forum/archive/sitemap_4_1.gz

And when we passed an arbitrary value (make it a setting in the file in case Google change it later) we would move onto:
http://www.bowlie.com/forum/archive/sitemap_4_2.gz
http://www.bowlie.com/forum/archive/sitemap_4_3.gz
through
http://www.bowlie.com/forum/archive/...p_4_9999999.gz
etc

As it stands, Google is now refusing to pay attention to mine, as the one sitemap that exceeds the limit basically causes the whole thing to error.


Unreal Player 10-08-2005 01:23 AM

Is it normal for my site to still be PENDING after 6 hours at Google? And how does my site know what account I'm using to resubmit it automatically?

dutchbb 10-08-2005 06:06 PM

Quote:

Originally Posted by lierduh
This is something I had in mind to implement.:) So next version will certainly contain this feature.

I think I should be able to push a new version out this weekend including better documentation for the step 3. I have been waiting for the vB Gold.

Hi,

Does the Google spider still only look at the normal threads in Who's Online?

Does only the Yahoo! Slurp spider look at the archives?

falter 10-08-2005 11:06 PM

Hi there,
I'm very happy with the archive redirection. That's pretty slick stuff, and it seems to be working great. The sitemap submission to Google hasn't really taken effect quite yet, but it's only been 36 hours since submission (I imagine these things can take some time). Yahoo is going bonkers on us, though!

Anyway, I've submitted a bug/feature request to vbulletin as a result of installing this mod. You can see it here:
http://www.vbulletin.com/forum/bugs3...iew&bugid=1576

Specifically, it has to do with the way in which $show['search_engine'] is defined, which plays quite an important role in this particular mod.

Looking at the definition of $show['search_engine'] seemed important because I, like others, have noticed that sometimes googlebot doesn't want to get redirected from showthread to the archives.

(as seen in /includes/init.php)
Code:

$show['search_engine'] = ($vbulletin->superglobal_size['_COOKIE'] == 0 AND preg_match("#(google|msnbot|yahoo! slurp)#si", $_SERVER['HTTP_USER_AGENT']));
As you can see, vBulletin assumes that no search engine spider will ever use a cookie. I found the redirection to be more effective after removing the check for the absence of a cookie, which resulted in this:
Code:

$show['search_engine'] = (true AND preg_match("#(google|msnbot|yahoo! slurp)#si", $_SERVER['HTTP_USER_AGENT']));
Now, as you can see in my bug report, I'm not terribly satisfied with the way $show['search_engine'] is defined in the first place, but making the mod as seen above helped me out, some.

Hope this helps some of you guys...

~mike.

falter 10-08-2005 11:12 PM

Quote:

Originally Posted by Triple_T
HI

Google Spider still only looks at the normal threads in who's online?

Only the Yahoo! Slurp Spider looks at the archives?

Triple_T,
Just for clarity's sake, I was having the same problem you are having. Try my mod (in the post above this one), and see if that helps.

~mike

dutchbb 10-09-2005 02:00 AM

Quote:

Originally Posted by falter
Triple_T,
Just for clarity's sake, I was having the same problem you are having. Try my mod (in the post above this one), and see if that helps.

~mike

I looked right after, and 1 x google was in the archives. After that it was still also in the threads.

I noticed Google is much less effective in comparison:
- only 1 spider most of the time (Yahoo has 10 or more)
- Yahoo is now always in the archives; googlebot almost always is not
- googlebot still goes to pages like printthread and member.php, even with a robots.txt disallowing that.

MSN bot has not gone further than index.php, so looks like yahoo is just a better bot?

Now I have 2 questions regarding robots.txt:

- I have one in both the site root and the vBulletin root. Is this needed? If not, what is the correct place? (From what I have read it should be the site root.)

- Is the .php extension needed for disallowing files? Some say it's best to not include it; I have not seen a difference so far.

jdingman 10-09-2005 03:34 AM

Looks great so far. One question about mod_rewrite

using
Quote:

RewriteCond %{QUERY_STRING} ^$
RewriteRule ^index.php$ / [R=301,L]
that redirects if you're using forums.domain.com. What about if you're using domain.com/forums/? What mod_rewrite would you use for that redirect?

(not exactly for me because I can probably get it working, but anyone else that might need this as well.)

falter 10-09-2005 03:51 AM

Quote:

Originally Posted by Triple_T
Now I have 2 questions regarding robots.txt:

- I have one in both the site root and the vBulletin root. Is this needed? If not, what is the correct place? (From what I have read it should be the site root.)

- Is the .php extension needed for disallowing files? Some say it's best to not include it; I have not seen a difference so far.

Your robots.txt should be accessible at the root of your domain (http://www.mydomain.com/robots.txt). This is the only place that spiders know to check.

If you're trying to explicitly block specific files (ex. /forums/showthread.php), then you should define that entry in your robots.txt file. There's no point in leaving the ".php" off the end (ex. /forums/showthread); it doesn't buy you anything, and it can actually have a negative impact if your entries aren't defined well. Say you're trying to tell search engines to ignore "/forum/s.php" (this is just hypothetical). If you were to just put "/forum/s" in your robots.txt, then, in addition to blocking "/forum/s.php", you'd be blocking "/forum/showthread.php", "/forum/search.php", "/forum/showgroups.php", and anything else where the URL starts with "/forum/s". As you can see, it's important to be as specific as possible, otherwise you risk shutting spiders out of huge chunks of your site.
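To make that concrete, a robots.txt along these lines (the paths here are just examples, adjust them to your own setup) blocks the dynamic scripts precisely without catching the archive:

```
User-agent: *
Disallow: /forum/showthread.php
Disallow: /forum/printthread.php
Disallow: /forum/member.php
# note: /forum/archive/ is deliberately NOT listed, so spiders can crawl it
```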

falter 10-09-2005 03:59 AM

Quote:

Originally Posted by Triple_T
I looked right after, and 1 x google was in the archives. After that it was still also in the threads.

I noticed Google is much less effective in comparison:
- only 1 spider most of the time (Yahoo has 10 or more)
- Yahoo is now always in the archives; googlebot almost always is not
- googlebot still goes to pages like printthread and member.php, even with a robots.txt disallowing that.

I've thought about it some more.
A 301 code just tells the bot that the link has permanently moved. It takes a second request from the spider to actually jump to the archives. If the spider is slow (as googlebot and msnbot typically are), I can see how it would appear as though googlebot was sitting in showthread instead of being redirected to the archive...

lierduh 10-09-2005 05:42 AM

I have a new version ready to be released. If anyone wants, you can download this and try it out before I put together the package.

I still need to do the documentation for the modifications of index.php and global.php files.

lierduh 10-09-2005 05:46 AM

Quote:

Originally Posted by jdingman
Looks great so far. One question about mod_rewrite

Using that redirects if you're using forums.domain.com. What about if you're using domain.com/forums/? What mod_rewrite would you use for that redirect?

(not exactly for me because I can probably get it working, but anyone else that might need this as well.)

Without testing, I think:
RewriteRule ^forums/index.php$ forums/ [R=301,L]

should do it.
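Again untested, but a fuller .htaccess sketch for a domain.com/forums/ install (the "forums" directory name is just an example) would keep the query-string condition from the original rule:

```
RewriteEngine On
# only redirect the bare index.php, not index.php?styleid=... etc.
RewriteCond %{QUERY_STRING} ^$
RewriteRule ^forums/index\.php$ /forums/ [R=301,L]
```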

falter 10-09-2005 06:24 AM

Quote:

Originally Posted by lierduh
I have a new version ready to be released. If anyone wants, you can download this and try out before I put together the package.

I still need to do the documentation for the modifications of index.php and global.php files.

I don't know if this is due to any mods I have (I'm pretty light on them), but when I run your script directly (not using cron), I get the following output:

Quote:

Warning: array_keys(): The first argument should be an array in /path/to/my/stuff/forums/includes/class_core.php on line 1453

Warning: Invalid argument supplied for foreach() in /path/to/my/stuff/forums/includes/class_core.php on line 1453

Warning: array_keys(): The first argument should be an array in /path/to/my/stuff/forums/includes/class_core.php on line 1472

Warning: Invalid argument supplied for foreach() in /path/to/my/stuff/forums/includes/class_core.php on line 1472

Unable to add cookies, header already sent.
File: /path/to/my/stuff/forums/archive/forums_sitemap.php
Line: 1

Removing the "unset($_COOKIE);" on line 56 helps get the script to run, but since my cookies are still there, all my private forums get sitemapped too. So I just moved the stuff in the block above down, and everything works.

So, I go from this:
PHP Code:

if (function_exists('log_cron_action'))
{
    global $index_zp;
    global $debug_log;
    global $max_url;
    unset($vbulletin->userinfo);
    $vbulletin->userinfo['userid'] = 0;
}
else
{
    if ($run_by_vb_Scheduled_Task_only)
    {
        exit("Script can only be run by vB Scheduled Tasks. Set \$run_by_vb_Scheduled_Task_only to 0 if you need to run manually");
    }

    unset($_COOKIE);
    $specialtemplates = array();
    require_once(CWD . '/includes/init.php');

to this

PHP Code:

if (function_exists('log_cron_action'))
{
    global $index_zp;
    global $debug_log;
    global $max_url;
    unset($vbulletin->userinfo);
    $vbulletin->userinfo['userid'] = 0;
}
else
{
    if ($run_by_vb_Scheduled_Task_only)
    {
        exit("Script can only be run by vB Scheduled Tasks. Set \$run_by_vb_Scheduled_Task_only to 0 if you need to run manually");
    }

    $specialtemplates = array();
    require_once(CWD . '/includes/init.php');
    unset($vbulletin->userinfo);
    $vbulletin->userinfo['userid'] = 0;


lierduh 10-09-2005 07:04 AM

I remember now; someone else reported this as well. I think it might be PHP 5 related. I don't have PHP 5 to test with, so I won't unset cookies then.:)

Thanks.

Quote:

Originally Posted by falter
I don't know if this is due to any mods I have (which I'm pretty light on), but when I run your script directly (not using cron), I get the following output:



Removing the "unset($_COOKIE);" from line 56 helps get the script to run, but, since my cookies are still there, all my private forums get sitemapped, too. so, I just moved down the stuff in the block above, and everything works.


dutchbb 10-09-2005 11:00 AM

Quote:

Originally Posted by falter
your robots.txt should be accessible at the root of your domain (http://www.mydomain.com/robots.txt). this is the only place that spiders know to check.

if you're trying to explicitly define specific files (ex. /forums/showthread.php), then you should define that entry in your robots.txt file. there's no point in not putting the ".php" at the end (ex. /forums/showthread), it doesn't buy you anything. it can actually have a negative impact if your entries aren't defined well. say you're trying to tell search engines to ignore "/forum/s.php" (this is just hypothetical). if you were to just put "/forum/s" in your robots.txt, then, in addition to blocking "/forum/s.php", you'd be blocking "/forum/showthread.php", "/forum/search.php", "/forum/showgroups.php", anything else where the url starts with "/forum/s" .... as you can see, it's important to be as specific as possible, otherwise you risk shutting spiders out of huge chunks of your site.

Thank you. I read it on this site; the guy seems to be some sort of guru on vBulletin SEO: http://forum.time2dine.co.nz/seo-vbu...lletin-98.html

I have a few questions (also for the author of this thread: )

What does http://www.vbseo.com have that this hack doesn't provide? Is it worth buying, or is it basically the same?

What do you think about the tips/hack provided on this site: http://forum.time2dine.co.nz/seo-vbu...lletin-98.html ? He has the #1 ranking on Google for "vbulletin SEO" keywords.

lierduh 10-09-2005 11:32 AM

Basically my hack only lets Google index the real contents of the forums using the vB archives. I do not think it is necessary to let Google index both the full version threads and the archives. For more details and reasons, please read my opening post.

Unreal Player 10-09-2005 01:53 PM

Ok, my site has been pending for almost 2 days. They say "several hours", wtf? Anyone else get this?

jdingman 10-09-2005 02:16 PM

Is it crucial that I change permission for the root or my forum directory? I haven't changed them and it's been working fine. I did change my /archive/ to 755, but not ./

does it make that much of a difference?

trilljester 10-09-2005 04:01 PM

Quote:

Originally Posted by jdingman
Is it crucial that I change permission for the root or my forum directory? I haven't changed them and it's been working fine. I did change my /archive/ to 755, but not ./

does it make that much of a difference?

Well, as long as the web server "user" process has access to write to the root forum directory and archive/, then 755 is fine, assuming that the user owns the directories. The 55 part will keep others from writing to those directories.
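A quick way to see what those digits mean in practice (the /tmp path is just a scratch example, not your real forum):

```shell
# Demonstrate 755: owner rwx (7), group r-x (5), others r-x (5)
mkdir -p /tmp/forumtest/archive
chmod 755 /tmp/forumtest/archive

# Shows drwxr-xr-x: group and others can read/enter but not write
ls -ld /tmp/forumtest/archive
```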

xtreme-mobile 10-09-2005 04:50 PM

Ummm, all is going well, but what the hell do I have to do for step 3? It doesn't make any sense to me :(

any help would be fantastic :D

falter 10-09-2005 04:54 PM

hey lierduh,

I've been playing around a bit with the robot detection. I snagged a bunch of code from "online.php", hacked it up a bit, and came up with this (as a drop-in replacement for the "is_robot_visit" function). This one uses the spiders_vbulletin.xml file, which I recommend people update. The 3.5.0 Gold version is fairly vanilla. I got an updated one from here: http://www.vbulletin.com/forum/showp...5&postcount=12

Anyway, here's the change to global.php (this is assuming that you have the very latest version of lierduh's code :) )

PHP Code:

/**
 * Return true if visited by a robot.
 */
function is_robot_visit()
{
    require_once(DIR . '/includes/class_xml.php');
    $xmlobj = new XMLparser(false, DIR . '/includes/xml/spiders_vbulletin.xml');
    $spiderdata = $xmlobj->parse();

    if (is_array($spiderdata['spider']))
    {
        foreach ($spiderdata['spider'] AS $spiderling)
        {
            if (isset($_SERVER['HTTP_USER_AGENT']) AND preg_match("#" . preg_quote($spiderling['ident'], '#') . "#si", $_SERVER['HTTP_USER_AGENT']))
            {
                return true;
            }
        }
    }
    unset($spiderdata, $xmlobj);
    return false;
}

There's all sorts of extra markup in the xml for IP ranges and such, but I'm just going to match against the user-agents for now.

falter 10-09-2005 05:09 PM

update: I went so far as moving the code out of "archive/global.php" into "includes/init.php", where $show['search_engine'] is defined.

I replaced:
PHP Code:

$show['search_engine'] = ($vbulletin->superglobal_size['_COOKIE'] == 0 AND preg_match("#(google|msnbot|yahoo! slurp)#si", $_SERVER['HTTP_USER_AGENT']));

with
PHP Code:

/**
 * Return true if visited by a robot.
 */
function is_robot_visit()
{
    require_once(DIR . '/includes/class_xml.php');
    $xmlobj = new XMLparser(false, DIR . '/includes/xml/spiders_vbulletin.xml');
    $spiderdata = $xmlobj->parse();

    if (is_array($spiderdata['spider']))
    {
        foreach ($spiderdata['spider'] AS $spiderling)
        {
            if (isset($_SERVER['HTTP_USER_AGENT']) AND preg_match("#" . preg_quote($spiderling['ident'], '#') . "#si", $_SERVER['HTTP_USER_AGENT']))
            {
                return true;
            }
        }
    }
    unset($spiderdata, $xmlobj);
    return false;
}

$show['search_engine'] = is_robot_visit();

Everything works great!

dutchbb 10-09-2005 05:59 PM

Do you guys have time to write the "redirect to the actual thread for human visitors" instructions in step 3?

Or maybe you can just send the 2 files? The ones in the zip don't make any sense to me :(

falter 10-09-2005 06:22 PM

If lierduh could provide patchable diffs between the Gold (3.5.0) files and his modifications, that'd be awesome (I'd do it, but I've hacked up my files way too much, sorry).

lierduh, this page shows how: http://www2.linuxjournal.com/article/1237

It'll make modifying the vanilla files much easier.

xtreme-mobile 10-09-2005 07:06 PM

Quote:

Originally Posted by Triple_T
You guys have time for "the redirect to the actual thread for human visitors" instructions in step 3?

Or maybe you can just send the 2 files? The ones in the zip don't make any sence to me :(


Same here, I ain't got a clue what to do :(

lierduh 10-09-2005 10:36 PM

Thanks falter. I will merge your code into my next release, along with Stadler's updated xml file. It will be better using the xml file than my string. I should be able to make that into a plugin, so that no code needs to be changed for search engine crawler detection.

I also have an idea to give threads with new posts a higher sitemap priority. So a new version is not far away.:)

It is good to see hackers working together creating better code. It makes things more interesting than explaining how to set permissions. :)

I have attached patch.diff for the two files.

Quote:

Originally Posted by falter
hey lierduh,

I've been playing around a bit with the robot detection. I snagged a bunch of code from "online.php", hacked it up a bit, and came up with this (as a drop-in replacement for the "is_robot_visit" function). This one uses the spiders_vbulletin.xml file, which I recommend people update. The 3.5.0 Gold version is fairly vanilla. I got an updated one from here: http://www.vbulletin.com/forum/showp...5&postcount=12

Anyway, here's the change to global.php (this is assuming that you have the very latest version of lierduh's code :) )

There's all sorts of extra markup in the xml for IP ranges and such, but I'm just going to match against the user-agents for now.


hotrod1 10-10-2005 12:28 AM

Great hack, thanks a lot!

falter 10-10-2005 12:29 AM

lierduh, are you sure that you're diffing a modified 3.5.0 (Gold) against the original 3.5.0 (Gold)? I'm having problems applying the diffs as patches against the 3.5.0 Gold versions of the files (pulled straight out of the tarball).

Regardless, perhaps the diffs are the wrong route? Maybe it would serve the less technical users better if you had instructions for the code modifications similar to many of the other hacks? (ex. find [this block of code], add [this chunk of code] right after it, bla bla bla).

Brandon Sheley 10-10-2005 12:32 AM

Quote:

Originally Posted by lierduh
I also have an idea that I should make threads with new posts higher sitemap priority. So a new version is not far away.:)

sounds great, keep up the good work ;)

Unreal Player 10-10-2005 12:50 AM

I do not get your two files at all.

Vtec44 10-10-2005 01:15 AM

Did you chmod 777 your home folder?


All times are GMT. The time now is 10:53 PM.

Powered by vBulletin® Version 3.8.12 by vBS
Copyright ©2000 - 2025, vBulletin Solutions Inc.
