The Archive of Official vBulletin Modifications Site. It is not a VB3 engine, just a parsed copy!
#1
Is there a way to restrict how often guests can refresh?
I was wondering if there is any add-on that can limit how frequently guests are allowed to refresh? I'd like members to be able to refresh as much as they want, but guests to be limited, regardless of the server load. Thanks for your help!
#2
There is no way to stop someone from refreshing their browser.
#3
Leverage browser caching of static content; that way the browser doesn't reload the full page weight on every refresh. It will only fetch elements it didn't already pick up on the first load.
For example, if your site loads 400 KB, a refresh should transfer only 1 or 2 percent of that, because the rest is served from cache.
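Purely as an illustration of that idea, here is a minimal sketch, assuming a static asset is routed through a small PHP script (the file path, the one-week lifetime, and the ETag scheme are assumptions for the example; normally the web server sends these headers for static files itself):

[code]
<?php
// Hypothetical sketch: serve one static asset through PHP with long-lived
// cache headers and a conditional-GET answer, so a refresh re-transfers
// almost nothing the browser already has.
$file  = dirname(__FILE__) . '/images/logo.png';   // example asset path (assumption)
$mtime = filemtime($file);
$etag  = '"' . md5($file . $mtime) . '"';

header('Cache-Control: public, max-age=604800');   // let browsers cache for a week
header('ETag: ' . $etag);
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

// If the browser already holds this exact version, answer 304 with no body.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: image/png');
readfile($file);
[/code]

The point is only that repeat loads re-transfer a small fraction of the original page weight, exactly as described above.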
#4
It's actually to ensure people aren't using scripts to scrape our site. We don't want to turn off public access, but we do want people to stop taking content from our site and reposting it elsewhere. Having to track down their host information and file a copyright complaint is getting to be a real time suck.
We'd just like an error to be shown if they refresh more than once every minute, which I know is possible when server load is high (if it's above x, certain membergroups see an error message while other membergroups do not). I'd even be happy if the page content only updated once a minute.
Thanks from: TheLastSuperman
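A minimal sketch of that once-a-minute guest limit, assuming a file-based timestamp store keyed on the visitor's IP (the 60-second window, the temp-file location, and the error text are all assumptions; this is not an existing add-on, and it would need to run early in page startup, e.g. from a plugin):

[code]
<?php
// Hypothetical sketch only: limit guests to one page load per minute by IP.
// The file-based timestamp store and 60-second window are assumptions.
$limit_seconds = 60;
$is_guest      = empty($vbulletin->userinfo['userid']);   // vB3 registry: userid is 0 for guests

if ($is_guest) {
    $ip    = $_SERVER['REMOTE_ADDR'];
    $stamp = sys_get_temp_dir() . '/guest_' . md5($ip) . '.stamp';

    if (is_file($stamp) && (time() - filemtime($stamp)) < $limit_seconds) {
        header('HTTP/1.1 429 Too Many Requests');
        header('Retry-After: ' . $limit_seconds);
        exit('Please wait a minute before refreshing this page.');
    }
    touch($stamp);   // remember this guest's last page load
}
[/code]

Note the trade-off raised later in the thread: anything that throttles anonymous requests will also throttle search-engine crawlers.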
#5
You need "Ban Spiders by User Agent" then. A good, comprehensive list of bad bots is available that covers most of the known content scrapers, and you can add any others you spot to the list as well.
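For illustration only (the add-on keeps its own list in the admin panel; the agent strings below are examples, not its actual list), a user-agent blacklist check boils down to something like:

[code]
<?php
// Illustrative sketch of blocking requests whose user agent matches a blacklist.
$bad_agents = array('HTTrack', 'WebCopier', 'wget', 'curl');   // example entries only

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
foreach ($bad_agents as $needle) {
    if (stripos($ua, $needle) !== false) {
        header('HTTP/1.1 403 Forbidden');
        exit('Access denied.');
    }
}
[/code]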
#6
The problem is that it's a single person scraping our site for their own use, and I don't know their IP; otherwise I'd just ban them.
#7
You get the IP and their user agent string while they are on your site, from the Who's Online list (WoL) or even the server logs.
But let me get this straight: you want to restrict reloads for all visitors because you have one person manually scraping content?
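Both values come in on every request, so if the server logs are hard to dig through, a throwaway logging sketch like this (the log path is an assumption) can collect what guests send until the scraper stands out:

[code]
<?php
// Hypothetical sketch: record each request's IP, user agent and URL to a
// plain-text file so the scraper's address can be picked out later.
$ip  = $_SERVER['REMOTE_ADDR'];
$ua  = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '(none)';
$uri = $_SERVER['REQUEST_URI'];

$line = date('Y-m-d H:i:s') . "\t" . $ip . "\t" . $ua . "\t" . $uri . "\n";
file_put_contents('/var/log/guest_requests.log', $line, FILE_APPEND);   // assumed path
[/code]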
#8
Quote:
I just want to temporarily slow them until we can figure out what's going on. If you have a better idea, I'd be happy to take your advice.
#9
Quote:
You must identify the bad actor and stop IT, not penalize all visitors. If you slow down your page loading or otherwise restrict visitors, get ready for a hit from Google in your search results and PageRank.
2 thanks from: spamgirl, TheLastSuperman
#10
Quote: