View Full Version : Server Status Issues
Black Snow
01-14-2015, 08:47 AM
I currently run a VPS for my forum and I've had a lot of people saying it is really slow. I contacted my host and they gave me access to the server status page. See the attachment.
It seems a Google bot is constantly hitting my forum's tag pages and using up all the CPU. Can anyone suggest how to diagnose this further and fix it?
RichieBoy67
01-14-2015, 09:35 AM
You can block tags using robots.txt or even set up some crawling delays.
Like this
User-agent: msnbot    # replace msnbot with whichever bot you want to target
Crawl-delay: 5
and add this as well:
Disallow: /tags/
There is also a great mod here that you can use to block the bad bots, at least the ones that honor the robots.txt file: https://vborg.vbsupport.ru/showthread.php?t=268208
Black Snow
01-14-2015, 10:14 AM
Thanks. I know about robots.txt, but not all bots take it into account, and I have several mods for blocking bots too. The problem is that it's Googlebot requesting the tags all the time, so I can't just block it lol
RichieBoy67
01-14-2015, 10:35 AM
Honestly, unless you have tens of thousands of them, I would not think the bots would be an issue in terms of server resources.
With that said, it is strange that Google is ignoring the robots.txt file.
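One thing worth checking first: a lot of "Googlebot" traffic is actually scrapers spoofing the user agent, and those ignore robots.txt entirely. Google's documented way to verify a crawler is a reverse DNS lookup followed by a forward-confirm. A rough PHP sketch of that check (the function name is mine, and this is untested):

```php
<?php
// Verify a claimed Googlebot IP: the reverse DNS name must end in
// googlebot.com or google.com, and that name must resolve back to
// the same IP (Google's documented reverse/forward verification).
function is_real_googlebot($ip)
{
    $host = gethostbyaddr($ip);            // reverse lookup
    if ($host === false || $host === $ip)
    {
        return false;                      // no reverse record
    }
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host))
    {
        return false;                      // wrong domain => spoofed UA
    }
    return gethostbyname($host) === $ip;   // forward-confirm
}
```

Grab one of the offending IPs from your server status page and run it through something like this; if it comes back false, it is not Google and you can ban the IP outright.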
If you can, have your host set up fail2ban on your server. Set up a jail for bots and it will temporarily ban heavy bot activity. It can also protect you from DDoS attacks, heavy probes, etc.
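For reference, a minimal jail for this might look something like the sketch below. It uses the apache-badbots filter that ships with fail2ban; the log path and the ban/retry numbers are just assumptions to adjust for your server:

```ini
# /etc/fail2ban/jail.local -- illustrative values only
[apache-badbots]
enabled  = true
port     = http,https
filter   = apache-badbots
logpath  = /var/log/apache2/access.log
maxretry = 2
findtime = 600
bantime  = 86400
```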
You could also hook into tags_start and make it so that users who are not logged in cannot see the page; that would stop the bots from executing the search queries.
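A rough sketch of what such a plugin could look like, assuming a vBulletin 3.8 plugin at the tags_start hook location and the standard print_no_permission() helper (untested):

```php
<?php
// Plugin at hook location "tags_start" (vBulletin 3.8, sketch only).
// Guests have userid 0; bail out before the tag search queries run.
// print_no_permission() shows the standard "no permission" page
// and stops execution.
if ($vbulletin->userinfo['userid'] == 0)
{
    print_no_permission();
}
```

The bot still gets a response, but it's a cheap permission page instead of an expensive tag search.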
RichieBoy67
01-14-2015, 10:41 AM
Great idea! I did not think of that!
This would stop guests from being able to use the tags, though, but that seems like a fair trade-off.
Black Snow
01-14-2015, 10:45 AM
Never thought of that. Good idea. I think I will write something up for it and see if it works.
EDIT: Although this would stop them accessing the tags page, it would not stop them from requesting the page.
Never thought of that. Good idea. I think I will write something up for it and see if it works.
EDIT: Although this would stop them accessing the tags page, it would not stop them from requesting the page.
That's true, but I think the load on your server is caused by the expensive search queries.
I'm sure the bots will eventually figure out that they always get the same error, and they'll stop visiting the page.
You can always try it out and see if that works. :)
vBulletin® v3.8.12 by vBS, Copyright ©2000-2025, vBulletin Solutions Inc.