Robots.txt

To help search engines determine which content on your website should be indexed, you can provide a robots.txt file.

In most cases the default robots.txt will work fine for you: search engines are told to index the content on Live, but not in other environments (e.g. Demo).
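
In robots.txt terms, that behaviour is roughly equivalent to the following (shown only as a sketch; the exact defaults may differ). For Live, nothing is disallowed:

User-agent: *
Disallow:

And for the other environments, everything is disallowed:

User-agent: *
Disallow: /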

This is not a security feature, and it should not be used to protect private information. For that you might want to look at the user helper, or, if you have a Demo site, simply add HTTP authentication.
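
For example, if your Demo site happens to be served by Apache, HTTP authentication can be added with something along these lines in its configuration or .htaccess (the .htpasswd path here is only a placeholder):

AuthType Basic
AuthName "Demo"
AuthUserFile /path/to/.htpasswd
Require valid-user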

To add your own robots.txt, simply create the file:

/app/library/setup/robots.txt

Or, to target a specific environment:

/app/library/setup/robots-stage.txt
/app/library/setup/robots-demo.txt
/app/library/setup/robots-live.txt
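
For example, a robots-live.txt that keeps crawlers out of an admin area (the /admin/ path is purely illustrative) could contain:

User-agent: *
Disallow: /admin/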