bump...
1. Create a robots.txt file to keep spiders out of pages such as member profiles and search results; a sample is sketched after this list. (This will also help keep profiles and similar pages out of search engine results, where they would do little to attract useful traffic anyway.)
2. Set per-forum permissions for unregistered visitors, as 'nhawk' described, but be careful: depending on the configuration, ordinary visitors who are not logged in will be unable to view those forums either.
3. If you really want to control which robots/spiders are accessing your forums and using your bandwidth, consider the add-on below; a rough sketch of the idea behind it also follows this list. I do, however, recommend reading all 350 posts in its thread to give yourself the best chance of success. There is a lot to learn about user agents, even beyond the resources in that thread, before you can use the mod properly and avoid shutting out spiders that could actually help your traffic and your cause.
Ban Spiders by User Agent by Simon Lloyd
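
Here is a minimal robots.txt sketch of what step 1 might look like. The paths below (member.php, memberlist.php, search.php) are placeholders based on common forum software; check the actual URLs your board generates before copying anything.

    User-agent: *
    Disallow: /member.php
    Disallow: /memberlist.php
    Disallow: /search.php

Put the file in your web root (so it is reachable at yoursite.com/robots.txt). Well-behaved spiders will skip those paths; rogue ones ignore robots.txt entirely, which is where step 3 comes in.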
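
And a rough sketch of the user-agent matching idea behind step 3. This is NOT the actual code of Simon Lloyd's add-on, just a generic Python illustration of the technique; the agent strings in the list are invented examples, and building a correct list is exactly why reading the whole thread matters.

    # Generic illustration of banning by User-Agent string.
    # The patterns below are made-up examples, not a recommended ban list.
    BANNED_AGENTS = ["badbot", "sitegrabber"]

    def is_banned(user_agent: str) -> bool:
        """True if the request's User-Agent contains a banned pattern."""
        ua = user_agent.lower()
        return any(pattern in ua for pattern in BANNED_AGENTS)

    # Example User-Agent headers from incoming requests:
    print(is_banned("Mozilla/5.0 (compatible; SiteGrabber/2.1)"))  # True  -> block
    print(is_banned("Mozilla/5.0 (compatible; Googlebot/2.1)"))    # False -> serve

This also shows the danger the post warns about: an over-broad pattern (say, banning anything containing "bot") would match Googlebot too and cost you search traffic.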