7.12.2010

Webmaster tools - Robots.txt protocol

When a search engine visits a website, it checks the root directory for a file named 'robots.txt'. The file follows the Robots Exclusion Standard, established in 1994 and observed by the major search engines, which lets a site owner keep crawlers from scanning and indexing certain parts of the site. Google recommends the robots.txt protocol to webmasters, but the method is not secure and does not guarantee privacy: it relies on crawlers voluntarily honoring the rules, and the file itself is publicly readable.
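
As a rough illustration (the site and paths here are made up for the example), a site owner might place a file like this at http://www.example.com/robots.txt; it asks all crawlers to stay out of two directories while leaving the rest of the site open:

    User-agent: *
    Disallow: /private/
    Disallow: /tmp/

A compliant crawler reads these rules before fetching anything else. One quick way to see what a rule set actually blocks is Python's standard urllib.robotparser module; the sketch below parses the example rules directly rather than fetching them over the network:

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules matching the example robots.txt above
    rules = [
        "User-agent: *",
        "Disallow: /private/",
        "Disallow: /tmp/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # Blocked by the Disallow rule
    print(rp.can_fetch("*", "http://www.example.com/private/page.html"))  # False
    # Not covered by any rule, so crawling is allowed
    print(rp.can_fetch("*", "http://www.example.com/index.html"))         # True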

Google Webmaster Central: Block or remove pages using a robots.txt file


Tom Fox
Louisville, Kentucky
Tom Fox on Twitter
