vb.org Archive

vb.org Archive (https://vborg.vbsupport.ru/index.php)
-   vB4 General Discussions (https://vborg.vbsupport.ru/forumdisplay.php?f=251)
-   -   How can i ban this bot (https://vborg.vbsupport.ru/showthread.php?t=323511)

RichieBoy67 09-26-2016 04:28 PM

Quote:

Originally Posted by the one (Post 2576274)
Already got it, buddy. I added these to it below but it doesn't work:

us-west-2.compute.amazonaws.com
compute.amazonaws.com
amazonaws.com
compute-1.amazonaws.com
us-west-2.compute.amazonaws.com
.us-west-2.compute.amazonaws.com


Am I doing something wrong? Many thanks for any help on this.

I think you can just add amazonaws.com or amazonaws and it should work. That should block all the Amazon bots.

Did you add that to your robots.txt file too?

Disallow: amazonaws.com
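For what it's worth, robots.txt rules match on the User-agent token a crawler announces, and Disallow takes a URL path rather than a hostname, so the usual form of a blocking rule looks more like the sketch below. The `Amazonbot` token is only an assumption about what these crawlers send, and robots.txt is advisory — misbehaving bots simply ignore it.

```text
# robots.txt — match the crawler's announced User-agent token,
# then disallow everything under the site root
User-agent: Amazonbot
Disallow: /
```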

Kane@airrifle 09-26-2016 04:59 PM

Code:

><a href="http://feed.mikle.com/" target="_blank" style="color:#CCCCCC;"></a><!--Please display the above link in your web page according to Terms of Service.--></div><!-- end feedwind code -->
The above is on an Amazon server. By blocking those bots, you are blocking your feed.

RichieBoy67 09-26-2016 05:03 PM

Yeah, that's true. It's one or the other.

the one 09-27-2016 10:59 AM

Quote:

Originally Posted by Stratis (Post 2576275)
Please delete this <Files ~ ".*">xxxxxxxx</Files> and try again.

Code:

order allow,deny
 allow from all

 Deny from us-west-2.compute.amazonaws.com
 Deny from compute.amazonaws.com
 Deny from amazonaws.com
 Deny from compute-1.amazonaws.com
 Deny from us-west-2.compute.amazonaws.com
 Deny from .us-west-2.compute.amazonaws.com


Hello, the above code won't work; the only one that does is the one you gave me before.

Thanks for your time
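One possible reason the hostname-based Deny lines above fail: for each visitor, Apache must perform a double-reverse DNS lookup and get a PTR record that matches before a hostname Deny can fire, and that lookup often fails or is disabled on shared hosts. Matching the request's User-Agent header avoids DNS entirely. A minimal sketch in Apache 2.2 syntax — the `amazonaws` substring is an assumption about what these bots actually send in their User-Agent:

```apache
# Flag any request whose User-Agent contains "amazonaws" (case-insensitive)
SetEnvIfNoCase User-Agent "amazonaws" bad_bot

# Allow everyone except flagged requests
Order allow,deny
Allow from all
Deny from env=bad_bot
```

If the bots send an empty or generic User-Agent, this won't catch them, and blocking by IP range (as shown later in the thread) is the fallback.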

Simon Lloyd 08-29-2017 05:29 AM

In the Ban Spiders mod you need to shorten the user agent string to catch more. If you are using my mod, it looks for the entire string that you entered in the list; if you shorten it to, say, amazonaws, it will ban every bot with that in its string.

If you want to ban them via robots.txt you can use this:
Quote:

Originally Posted by Simon Lloyd
Order allow,deny
SetEnvIf Request_URI ^/robots\.txt$ allowall
deny from 23.20.0.0/14 46.51.128.0/17 46.137.0.0/16 50.16.0.0/14 50.112.0.0/16 52.0.0.0/11 54.64.0.0/15 54.66.0.0/16 54.72.0.0/13 54.80.0.0/12 54.144.0.0/12 54.160.0.0/11 54.192.0.0/10 67.202.0.0/18 72.21.192.0/19 72.44.32.0/19 75.101.128.0/17 79.125.0.0/18 87.238.80.0/21 87.238.84.0/23 103.4.8.0/21 107.20.0.0/14 122.248.192.0/18 156.154.64.0/22 156.154.68.0/23 174.129.0.0/16 175.41.128.0/18 175.41.192.0/18 175.41.224.0/19 176.32.64.0/19 176.34.0.0/16 178.236.0.0/20 184.72.0.0/15 184.169.128.0/17 185.48.120.0/22 204.236.128.0/17 216.182.224.0/20
allow from env=allowall
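On Apache 2.4, where the Order/Allow/Deny directives are deprecated, the same idea — deny the EC2 ranges while leaving robots.txt fetchable — might be sketched with Require directives instead. This is only a sketch; the two CIDR blocks shown are a sample from the list above, not the full set:

```apache
# Apache 2.4 equivalent: always allow robots.txt, otherwise block EC2 ranges
SetEnvIf Request_URI "^/robots\.txt$" allowall

<RequireAny>
    Require env allowall
    <RequireAll>
        Require all granted
        Require not ip 23.20.0.0/14
        Require not ip 54.64.0.0/15
    </RequireAll>
</RequireAny>
```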


Max Taxable 08-29-2017 04:31 PM

Amazon is an ISP, and the user agent strings you're posting aren't bots per se. They could very well be people you're blocking.

the one 08-29-2017 04:46 PM

Quote:

In the Ban Spiders mod you need to shorten the user agent string to catch more. If you are using my mod, it looks for the entire string that you entered in the list; if you shorten it to, say, amazonaws, it will ban every bot with that in its string.
Already done that, and it makes no difference. :(

Thanks for the robots.txt; I will try that.

Thanks

--------------- Added 29 Aug 2017 at 06:49 PM ---------------

Quote:

Originally Posted by Max Taxable (Post 2589647)
Amazon is a ISP, and the user agent strings you're posting aren't bots per se. They could very well be people you're blocking.

Many thanks. I thought that, but they all come at once in their hundreds and then suddenly leave.

It's not really a big deal; it's just that when I am viewing them on Who's Online and there are hundreds, it gets annoying, lol.

Max Taxable 08-29-2017 06:34 PM

Quote:

Originally Posted by the one (Post 2589648)
Many thanks. I thought that, but they all come at once in their hundreds and then suddenly leave.

Could well be because someone linked you on Pinterest or some other big site, and people are clicking to check it out.

Quote:

It's not really a big deal; it's just that when I am viewing them on Who's Online and there are hundreds, it gets annoying, lol.
Don't let it bother you. It's not any kind of attack and you're not hacked. It's just internet traffic and it's harmless.


All times are GMT. The time now is 02:03 AM.

Powered by vBulletin® Version 3.8.12 by vBS
Copyright ©2000 - 2025, vBulletin Solutions Inc.
