#1
forum is hitting query limits
I logged into my hosting control panel this morning and had an error that my database is exceeding query limits. This is the message I get:
Quote:
We have a small forum of about 500 users, and a very small number of those are actually participating members. What could be causing this to happen? Thanks!
#2
1. Get a better host.
2. Disable some of your hacks to cut down on the queries; depending on the hack, each one can add one or more queries per page load.
#3
Or 3. Bring this error to the attention of the host and see if they can change the setting to up the query limit.
#4
Well, I brought it to their attention in a support ticket a few minutes ago, but apparently, from the "rules" I've read in their forum, these are query limits they set on all accounts. Not good.
The only thing I can think of is that search bots are causing the query load, because we get a LOT of search bots and guests. Currently I only have 5 users logged in and 41 guests, and many of those guests are bots. We have had as many as 100 search engine bots crawling at one time.

Moving to this new host has increased our site performance by 62%. The query restriction limit stinks, though. I went in a few minutes ago and disabled a couple of plugins, but nothing I see could be causing such a high load. One of them was flashchat, but it NEVER has anyone in it, so I turned it off for now. I do have google-analytics enabled. I wonder what kind of load it causes. Thanks Lynne! Buster
#5
Can you put in a robots.txt file? I have one on my site so I can limit the number of Yahoo Slurp spiders on there at a time. They will come en masse if allowed.
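If it helps, here is a minimal sketch of a robots.txt aimed just at throttling Slurp. Crawl-delay is a non-standard directive, but Yahoo's crawler does honor it; the 60 seconds here is only an example value.
Code:
User-agent: Slurp
Crawl-delay: 60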
#6
Good idea. What exactly does it need to have in it?
Will that get the MSN bots and Google bots as well? Thanks Lynne!
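For reference, robots.txt addresses each crawler with its own User-agent record, so MSN's bot can be slowed the same way as Slurp. Googlebot, however, does not honor Crawl-delay; its crawl rate has to be set in Google's Webmaster Tools instead. A sketch, with the same example delay value:
Code:
User-agent: msnbot
Crawl-delay: 60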
#7
This is a good thread on it: robots.txt help. But there are several to read - do a search for "robots.txt" with "titles only" checked.
#8
Wowser, no simple copy and paste from your robots.txt?
#9
Blocking robots, or even delaying them, is a really piss-poor workaround for what shouldn't be a problem. You're only going to hurt your results in the search engines by doing this, but if you aren't concerned with such things, then it's a non-issue.
Code:
# Allow Archiver
User-agent: ia_archiver
Allow: /

User-agent: Slurp
Crawl-delay: 60

User-agent: GoogleBot
Crawl-delay: 60

User-agent: *
Disallow: *.php
Disallow: *.js
Disallow: *.jsp
Disallow: *.cfm
Disallow: *.asp
Disallow: *.html
Disallow: *.htm
Disallow: *.aspx
Disallow: *.cgi
Disallow: /forum/includes/
Disallow: /forum/install/
Disallow: /forum/customavatars/
Disallow: /forum/archive/
Disallow: /forum/sitemap/
Disallow: /forum/members/
Disallow: /wp-includes/
Disallow: /wp-content/
Disallow: /wp-admin/
Disallow: /forum/images
#10
If you do that search, then you will probably find some pre-written robots.txt files. But you will have to look them over to see whether they fit your site. For instance, some people add in the members.php page, but that isn't good for sites that want the members page indexed. Same with online.php. You need to decide whether those are files you want blocked from spiders or not.
Edit: The file FRDS posted will disallow spiders from your whole site, not just some pages. Also, he uses a /forum folder, and you may have to change that for your site.
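To illustrate, here is a trimmed-down sketch of that file that keeps the crawl-delays but blocks only the usual non-content folders, assuming the forum runs at the web root rather than in /forum/ (adjust the paths to your install):
Code:
User-agent: Slurp
Crawl-delay: 60

User-agent: GoogleBot
Crawl-delay: 60

User-agent: *
Disallow: /includes/
Disallow: /install/
Disallow: /customavatars/
Disallow: /images/
Leaving members.php and online.php out of the Disallow list keeps those pages open to indexing, per the point above.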