To block all bots from crawling every page under a directory, add the following entry to your robots.txt file:
User-agent: *
Disallow: /private/
To block only Google from crawling pages under a directory, target the Googlebot user agent instead:
User-agent: Googlebot
Disallow: /private/
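You can verify how a crawler would interpret these rules with Python's built-in `urllib.robotparser` module. A minimal sketch (the `example.com` URLs are illustrative):

```python
from urllib import robotparser

# The wildcard rule from the robots.txt example above:
# it applies to every user agent, including Googlebot.
rules = """User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages under /private/ are disallowed for any bot;
# everything else remains crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

If you replace `User-agent: *` with `User-agent: Googlebot`, only Google is blocked and other crawlers are unaffected.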