invitezone
09-05-2012, 05:20 PM
I know that when the Google spider, Baidu, and other crawlers try to crawl a forum section they don't have permission to enter, they only see the permission denied page, as shown on Who's Online.
But given that they are seeing the permission denied page, does that mean they will cache the location?
For example:
I make a section called www.myforum.com/private-section.
I make the section hidden from view.
I see the Google bot trying to crawl it, but getting the no-permission page.
Now the Google bot has been denied the contents of that forum, but does it now know that the location exists?
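(I'm not sure whether the forum actually returns a 403 status here or a normal 200 page with the no-permission message in the body; checking the headers with something like

    curl -I http://www.myforum.com/private-section

would show which one the bot gets.)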
I want to make it so that no one, not even bots, ever knows a private forum exists within my public forum.
I know I can deny access via robots.txt, but some bots choose to ignore that.
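For example, something along these lines in robots.txt (using the made-up /private-section path from above):

    User-agent: *
    Disallow: /private-section

The catch is that the rule itself names the path, so anyone who reads robots.txt can see the section exists even if rule-abiding bots stay out.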
How can I make sure of it? Are permissions enough?
Thanks for your help.
--------------- Added 09-06-2012 at 03:20 PM ---------------
Anyone know this?
Thanks.