PDA

View Full Version : How can i ban this bot


the one
09-23-2016, 04:28 AM
Lately on my forum I have been getting hundreds of bots from ec2-54-70-216-122.us-west-2.compute.amazonaws.com.

When you view Who's Online and display the user agent, this is what it shows:

ec2-52-89-87-158.us-west-2.compute.amazonaws.com
Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36

here is another one

ec2-54-70-213-207.us-west-2.compute.amazonaws.com
Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36

I have tried banning them with my Ban Spiders plugin, and I have also used my IP Deny Manager to ban numerous IP ranges, but they keep coming back in their hundreds.

Is there a way to completely ban this bot? This is what I added to my spider list:

us-west-2.compute.amazonaws.com
compute.amazonaws.com
amazonaws.com
compute-1.amazonaws.com
us-west-2.compute.amazonaws.com
.us-west-2.compute.amazonaws.com

Am I doing something wrong? Many thanks for any help on this.

Stratis
09-23-2016, 08:01 AM
Put this in your .htaccess file.


<Files ~ ".*">
order allow,deny
allow from all

Deny from us-west-2.compute.amazonaws.com
Deny from compute.amazonaws.com
Deny from amazonaws.com
Deny from compute-1.amazonaws.com
Deny from us-west-2.compute.amazonaws.com
Deny from .us-west-2.compute.amazonaws.com

</Files>
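One caveat worth adding here: hostname-based Deny rules only match if Apache can do a (double) reverse DNS lookup on every request, which is slow and frequently disabled, so denying by IP range is usually more dependable. A minimal sketch covering just the EC2 addresses quoted in this thread; the /16 ranges are illustrative examples, not Amazon's official netblock list:

```apache
# .htaccess sketch: deny by IP range instead of hostname.
Order Allow,Deny
Allow from all
# 54.70.x.x and 52.89.x.x cover the EC2 hosts quoted above;
# these are example ranges, not a complete list of AWS netblocks.
Deny from 54.70.0.0/16
Deny from 52.89.0.0/16
```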

the one
09-25-2016, 05:32 AM
Put this in your .htaccess file.


<Files ~ ".*">
order allow,deny
allow from all

Deny from us-west-2.compute.amazonaws.com
Deny from compute.amazonaws.com
Deny from amazonaws.com
Deny from compute-1.amazonaws.com
Deny from us-west-2.compute.amazonaws.com
Deny from .us-west-2.compute.amazonaws.com

</Files>

Hmm, once I add that code it does work, but there is a problem. If you go to my home page here http://jandeane81.com/index.php and look on the right where it says theonetruth news, you will normally see it working perfectly. But once I add that code above to my .htaccess, it prevents my feeds from displaying. I created that widget from http://feed.mikle.com/

So is there a workaround?

Many thanks

Seven Skins
09-25-2016, 09:22 AM
Try this, maybe it will work.

<IfModule mod_setenvif.c>

SetEnvIfNoCase User-Agent ^us-west-2.compute.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^compute.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^compute-1.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^us-west-2.compute.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^.us-west-2.compute.amazonaws.com bad_bot

<Limit GET POST PUT>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>

</IfModule>
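One thing to check before trying this: those SetEnvIfNoCase lines test the User-Agent header, but the amazonaws.com strings are reverse-DNS hostnames, not user agents (the actual user agent in the posts above is an ordinary Chrome string), so the patterns will never match. If mod_setenvif is the tool, matching on Remote_Host would be closer to the intent; a sketch, with the regex dots escaped, assuming HostnameLookups is enabled on the server:

```apache
<IfModule mod_setenvif.c>
# Match the client's reverse-DNS hostname, not its User-Agent.
# Requires HostnameLookups On (or Double) for Remote_Host to resolve.
SetEnvIfNoCase Remote_Host \.amazonaws\.com$ bad_bot

<Limit GET POST PUT>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
</IfModule>
```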

the one
09-26-2016, 11:39 AM
Try this, maybe it will work.

<IfModule mod_setenvif.c>

SetEnvIfNoCase User-Agent ^us-west-2.compute.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^compute.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^compute-1.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^us-west-2.compute.amazonaws.com bad_bot
SetEnvIfNoCase User-Agent ^.us-west-2.compute.amazonaws.com bad_bot

<Limit GET POST PUT>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>

</IfModule>

Hi, this code doesn't work. The other one did, but then I had that problem with my feeds not showing on the home page. Thanks for your time.

RichieBoy67
09-26-2016, 01:27 PM
Try this:

https://vborg.vbsupport.ru/showthread.php?t=268208

the one
09-26-2016, 02:00 PM
Try this:

https://vborg.vbsupport.ru/showthread.php?t=268208

Already got it, buddy. I added these to it below, but it doesn't work:

us-west-2.compute.amazonaws.com
compute.amazonaws.com
amazonaws.com
compute-1.amazonaws.com
us-west-2.compute.amazonaws.com
.us-west-2.compute.amazonaws.com


Am I doing something wrong? Many thanks for any help on this.

Stratis
09-26-2016, 02:07 PM
Please delete the <Files ~ ".*">xxxxxxxx</Files> wrapper and try again:

order allow,deny
allow from all

Deny from us-west-2.compute.amazonaws.com
Deny from compute.amazonaws.com
Deny from amazonaws.com
Deny from compute-1.amazonaws.com
Deny from us-west-2.compute.amazonaws.com
Deny from .us-west-2.compute.amazonaws.com

the one
09-26-2016, 02:49 PM
..... I will report back, many thanks.

Stratis
09-26-2016, 02:57 PM
I stopped using RSS two years ago, but I don't remember having this issue.
Hope some expert will find a solution :)

RichieBoy67
09-26-2016, 04:28 PM
Already got it, buddy. I added these to it below, but it doesn't work:

us-west-2.compute.amazonaws.com
compute.amazonaws.com
amazonaws.com
compute-1.amazonaws.com
us-west-2.compute.amazonaws.com
.us-west-2.compute.amazonaws.com


Am I doing something wrong? Many thanks for any help on this.

I think you can just add amazonaws.com, or even just amazonaws, and it should work. That should block all the Amazon bots.

Did you add that to your robots.txt file too?

Disallow: amazonaws.com
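A side note on that: robots.txt rules are keyed on the bot's User-agent and on URL paths of your own site, so there is no way to target a client domain like amazonaws.com there, and the rules are purely advisory anyway. The well-formed shape, assuming the bot announces a user agent and chooses to obey robots.txt (scrapers running on EC2 typically don't); SomeBotName below is a placeholder:

```text
# robots.txt matches on User-agent and URL path only -
# it cannot block clients by their hostname or IP.
User-agent: SomeBotName
Disallow: /
```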

Kane@airrifle
09-26-2016, 04:59 PM
><a href="http://feed.mikle.com/" target="_blank" style="color:#CCCCCC;"></a><!--Please display the above link in your web page according to Terms of Service.--></div><!-- end feedwind code -->

The above is on an Amazon server. By blocking those bots, you are blocking your feed.

RichieBoy67
09-26-2016, 05:03 PM
Yeah, that's true. It's one or the other.
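It may not have to be all-or-nothing, though: the FeedWind widget presumably only needs to fetch the site's RSS URL, so that single path could be exempted from the deny rules with a SetEnvIf whitelist. A sketch, assuming /external.php is the feed endpoint (vBulletin's default; adjust to the real feed path), with one example range standing in for the full deny list:

```apache
# Let denied IPs through for the RSS feed only.
# /external.php is vBulletin's default feed URL - adjust if yours differs.
SetEnvIf Request_URI ^/external\.php allowfeed
Order Deny,Allow
# Example range only - substitute the full list of ranges being denied.
Deny from 54.70.0.0/16
Allow from env=allowfeed
```

With Order Deny,Allow, a request matching both a Deny and an Allow is allowed, which is what lets the exempted path through.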

the one
09-27-2016, 10:59 AM
Please delete the <Files ~ ".*">xxxxxxxx</Files> wrapper and try again:

order allow,deny
allow from all

Deny from us-west-2.compute.amazonaws.com
Deny from compute.amazonaws.com
Deny from amazonaws.com
Deny from compute-1.amazonaws.com
Deny from us-west-2.compute.amazonaws.com
Deny from .us-west-2.compute.amazonaws.com


Hello, that code above won't work; the only one that does is the one you gave me before.

Thanks for your time

Simon Lloyd
08-29-2017, 05:29 AM
In the Ban Spiders mod you need to shorten the user agent string to catch more. If you are using my mod, it looks for the entire string that you entered in the list; if you shorten it to, say, amazonaws, it will ban every bot with that in its string.

If you want to ban them at the server level (in .htaccess) while still letting them fetch robots.txt, you can use this:

Order Deny,Allow
SetEnvIf Request_URI ^/robots\.txt$ allowall
deny from 23.20.0.0/14 46.51.128.0/17 46.137.0.0/16 50.16.0.0/14 50.112.0.0/16 52.0.0.0/11 54.64.0.0/15 54.66.0.0/16 54.72.0.0/13 54.80.0.0/12 54.144.0.0/12 54.160.0.0/11 54.192.0.0/10 67.202.0.0/18 72.21.192.0/19 72.44.32.0/19 75.101.128.0/17 79.125.0.0/18 87.238.80.0/21 87.238.84.0/23 103.4.8.0/21 107.20.0.0/14 122.248.192.0/18 156.154.64.0/22 156.154.68.0/23 174.129.0.0/16 175.41.128.0/18 175.41.192.0/18 175.41.224.0/19 176.32.64.0/19 176.34.0.0/16 178.236.0.0/20 184.72.0.0/15 184.169.128.0/17 185.48.120.0/22 204.236.128.0/17 216.182.224.0/20
allow from env=allowall

(Note: it has to be Order Deny,Allow, not Allow,Deny, so that the robots.txt exemption overrides the deny list.)
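For anyone on Apache 2.4, the Order/Allow/Deny directives above are deprecated (they only work via mod_access_compat); the 2.4-native equivalent uses Require. A sketch with a couple of example ranges standing in for the full list:

```apache
# Apache 2.4 equivalent of the Order/Deny approach - example ranges only.
<RequireAll>
Require all granted
Require not ip 54.70.0.0/16
Require not ip 52.89.0.0/16
</RequireAll>
```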

Max Taxable
08-29-2017, 04:31 PM
Amazon is an ISP, and the user agent strings you're posting aren't bots per se. They could very well be people you're blocking.

the one
08-29-2017, 04:46 PM
In the Ban Spiders mod you need to shorten the user agent string to catch more. If you are using my mod, it looks for the entire string that you entered in the list; if you shorten it to, say, amazonaws, it will ban every bot with that in its string.

Already done that, and it makes no difference :(

Thanks for the robots.txt, I will try that.

Thanks

--------------- Added 1504032582 at 1504032582 ---------------

Amazon is an ISP, and the user agent strings you're posting aren't bots per se. They could very well be people you're blocking.

Many thanks, I thought that, but they all come at once in their hundreds and then suddenly leave.

It's not really a big deal; it's just that when I am viewing them on Who's Online and there are hundreds, it gets annoying, lol.

Max Taxable
08-29-2017, 06:34 PM
Many thanks, I thought that, but they all come at once in their hundreds and then suddenly leave.
Could well be because someone linked you on Pinterest or some other big site, and people are clicking to check it out.

It's not really a big deal; it's just that when I am viewing them on Who's Online and there are hundreds, it gets annoying, lol.
Don't let it bother you. It's not any kind of attack and you're not hacked. It's just internet traffic and it's harmless.