Is there a way to restrict how often guests can refresh?
I was wondering if there is any add-on that can limit how frequently guests are allowed to refresh? I'd like members to be able to refresh as much as they want, but guests to be limited, regardless of the server load. Thanks for your help!
|
There is no way to stop someone from refreshing their browser.
|
Leverage browser caching of static content; that way the browser doesn't re-download the entire page on refresh. It will only fetch elements it didn't already encounter on the first load.
For example, if your site loads 400 KB, a refresh should only transfer 1 or 2 percent of that, because the rest is cached.
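A minimal .htaccess sketch of that idea, assuming Apache with mod_expires enabled (the content types and lifetimes here are illustrative, not tuned for any particular forum):

<IfModule mod_expires.c>
    ExpiresActive On
    # static assets: let the browser re-use them on refresh
    ExpiresByType image/png "access plus 1 week"
    ExpiresByType image/jpeg "access plus 1 week"
    ExpiresByType image/gif "access plus 1 week"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    # HTML: keep uncached so page content stays fresh
    ExpiresByType text/html "access plus 0 seconds"
</IfModule>
|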
It's actually to ensure people aren't using scripts to scrape our site. We don't want to turn off public access, but we do want people to stop taking content from our site and reposting it elsewhere. Having to track down their host information and file a copyright complaint is getting to be a real time suck.
We'd just like an error to be shown if they refresh more than once every minute, which I know is possible when server load is high (if it's above x, certain membergroups see an error message while other membergroups do not). I'd even be happy if the page content only updated once a minute.
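A per-page refresh limit like this is usually enforced at the web-server level rather than by a forum add-on. A minimal sketch, assuming Apache with the third-party mod_evasive module installed (the thresholds are illustrative, and note it throttles every visitor, not just guests):

<IfModule mod_evasive20.c>
    # more than 2 requests for the same URL within 60 seconds returns a 403
    DOSPageCount 2
    DOSPageInterval 60
    # keep blocking the offending IP for 60 seconds after it trips the limit
    DOSBlockingPeriod 60
</IfModule>
|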
You need "Ban Spiders by User Agent" then, a good comprehensive list of bad bots is available and contains most of the known content scrapers, and you can add any you see to the list as well.
|
Quote:
|
You get the IP and their user agent string while they are on your site, from the WoL (Who's Online) or even the server logs.
But let me get this straight - you want to restrict reloads for all visitors because you have one person manually scraping content? |
Quote:
I just want to temporarily slow them until we can figure out what's going on. If you have a better idea, I'd be happy to take your advice. :) |
Quote:
You must identify the bad actor and stop IT, not penalize all visitors. If you slow down your page loading or otherwise restrict visitors, get ready for the hit from Google in your search results and PageRank. |
Quote:
|
If you want to stop people from scraping your site, don't put it on the internet.
|
Quote:
Spamgirl, I think Max had an excellent idea... it may take more time to review the logs for certain guests with Paul's mod, but if you do it now and find who you think the culprit is, it might help! Remember, though, that overseas a person can unplug their modem/router and BAM, instant new IP address, so if they happen to be where that can happen, let's hope they only scrape content and aren't toooooo web savvy :cool:. |
Quote:
Anyhoo, I agree that Max had an excellent idea! Already three IPs are sticking out like a sore thumb, and one of them seems to be the culprit (with a scraper I didn't even know about potentially being a second problem user). Based on their shitty web design skills, I'm hopeful that means they aren't tech savvy at all. :) Thank you all so much for your advice!

--------------- Added 23 August 2015 at 15:31 GMT ---------------

I've found the IPs and tried to block them with .htaccess. I included my own IP in order to test it, but I am still able to access the forum, I just can't see the CSS or images. Here is what I did:

Order allow,deny
Deny from ###.#.#.
Deny from ###.#.#.
Deny from ###.#.#.
Allow from all

Does anyone know why it would be so wonky? I put it in the main folder of my forum (html1). My site is hosted on EC2, if that matters. I tried it last week and it worked, so I don't know why it wouldn't now...
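For reference, a conventional Apache 2.2-style deny block is sketched below, with the documentation-reserved placeholder range 203.0.113.0/24 standing in for the redacted addresses (on Apache 2.4 the Require syntax replaces Order/Allow/Deny). Partial addresses such as 203.0.113 are legal in Apache, but a stray trailing dot is worth double-checking:

# Apache 2.2 syntax
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24

# Apache 2.4 equivalent (mod_authz_core)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
</RequireAll>
|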
Sometimes the truth hurts, but it's important to understand the limitations of what you can do. You can ban an IP, but it will probably change and come back.
You can make it so only registered users can view content, but then your search rankings go down. You can make some content pay-only, but chances are if it's stuff people want, someone will steal it, and hopefully they don't do it with a stolen credit card. I do think you should fight, just be ready for the long haul. If they're actually stealing and rehosting your content on their site, you could try a DMCA takedown, but it may or may not work. |
Quote:
|
Is this what you are looking for?
Limited Guest Viewing -- Motivate Guests to Register |
It's not easy to prevent people from scraping your site.
IPs can be changed, proxies can be used, and headers can be spoofed (making user-agent detection useless). One possible way to stop scrapers is to add a JavaScript check people must pass before they are able to view your site; CloudFlare does this to prevent certain DDoS attacks. However, people could simply visit your site in a normal browser and save each file individually to their desktop. |
Quote:
|
Quote:
More often than not it just leads to:
- More users leaving your site
- Some users registering just to view content, but not participate. |
Quote:
But for the rest, you're right. All it really does is irritate people. |
It can be considered content cloaking.
|
I'm not actually planning to use it; the Track Guests extension was extremely helpful. I was just thankful that it was suggested :)
|