You would put it in your root folder, so it would be here: /home/yoursitename/public_html/robots.txt. Here's the top portion of mine:
Code:
User-agent: *
Crawl-delay: 40
Disallow: /forums/announcement.php
Disallow: /forums/attachment.php
Disallow: /forums/calendar.php
Disallow: /forums/editpost.php
Disallow: /forums/member.php
So, after each Disallow line you put the path of a file (or directory) you don't want the spiders to crawl. The Crawl-delay line asks compliant crawlers to wait that many seconds between requests, which keeps aggressive bots (like those Yahoo Slurp spiders!) from hammering the site. Note that not every crawler honors it - Google, for example, ignores Crawl-delay.
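If you want to sanity-check your rules before relying on them, Python's standard library ships a robots.txt parser. This is just a sketch against a shortened copy of the rules above; the example.com URLs are placeholders, not a real site:

```python
from urllib.robotparser import RobotFileParser

# A shortened copy of the rules from the post above.
rules = """\
User-agent: *
Crawl-delay: 40
Disallow: /forums/announcement.php
Disallow: /forums/attachment.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A Disallowed path is blocked for any bot matching "*".
print(rp.can_fetch("*", "https://example.com/forums/announcement.php"))  # False

# A path that isn't listed stays crawlable.
print(rp.can_fetch("*", "https://example.com/forums/showthread.php"))  # True

# The parser also reports the crawl delay (seconds between requests).
print(rp.crawl_delay("*"))  # 40
```

This only tells you how a well-behaved crawler should read the file; badly behaved bots ignore robots.txt entirely, so it's advisory, not access control.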