Robots.txt

[http://www.google.com/webmasters/start Google Webmaster Tools: see which of your pages aren't crawled by Google.]

[http://www.robotstxt.org/ The Web Robots Pages]

This example tells all robots to stay out of three areas of the server:

 # robots.txt for http://www.example.com/

 User-agent: *
 Disallow: /cyberworld/map/ # This is an infinite virtual URL space
 Disallow: /tmp/ # these will soon disappear
 Disallow: /foo.html

This example keeps all robots out of the infinite virtual URL space, except the robot "cybermapper", which may go anywhere:

 # robots.txt for http://www.example.com/

 User-agent: *
 Disallow: /cyberworld/map/ # This is an infinite virtual URL space

 # Cybermapper knows where to go.
 User-agent: cybermapper
 Disallow:

This example excludes all robots from the entire server:

 # go away
 User-agent: *
 Disallow: /

This example allows all robots complete access (an empty Disallow matches nothing):

 # allow all
 User-agent: *
 Disallow:
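A quick way to see how such rules are interpreted is Python's standard-library robots.txt parser. A minimal sketch, using the rules from the "cybermapper" example above (the robot name "SomeBot" is just a stand-in for any crawler not named in the file):

```python
from urllib.robotparser import RobotFileParser

# The "all robots out of /cyberworld/map/, except cybermapper" example.
rules = """\
User-agent: *
Disallow: /cyberworld/map/

User-agent: cybermapper
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An arbitrary robot falls under the "*" record and is blocked.
print(parser.can_fetch("SomeBot", "http://www.example.com/cyberworld/map/index.html"))

# cybermapper's record has an empty Disallow, so it may fetch anything.
print(parser.can_fetch("cybermapper", "http://www.example.com/cyberworld/map/index.html"))
```

In a real crawler you would call `parser.set_url("http://www.example.com/robots.txt")` and `parser.read()` instead of parsing an inline string.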