Robots.txt is used to guide search engines through your website.
If you want a specific folder or set of pages not to be crawled by a particular search engine, robots.txt can help you: it controls the access of any search engine throughout your complete website.
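For example, a robots.txt file at the site root could block one crawler from a single folder while leaving everything else open. This is only an illustrative sketch; the directory name /private/ is made up:

```
# Block Google's crawler from one directory (illustrative path)
User-agent: Googlebot
Disallow: /private/

# All other crawlers may access the whole site
User-agent: *
Disallow:
```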
Thanks, surindersharma, for replying to my post.
The robots.txt file controls crawler behavior. I am copy-pasting some lines from an article I found on the Redalkemi site. I hope it is useful to you.
Here are some tips on using the robots.txt file:
1. The robots.txt file name is always written in all lowercase (e.g. Robots.txt or robots.Txt is incorrect).
2. The robots.txt file is an exclusion file meant for search engine robot reference and not obligatory for a website to function. An empty or absent file simply means that all robots are welcome to index any part of the website.
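To illustrate point 2, a file that explicitly welcomes all robots (equivalent in effect to an empty or absent robots.txt) is just two lines:

```
# Allow every crawler to index everything
User-agent: *
Disallow:
```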
Hi Stephaniemcgrat, thanks for the nice article. The tips in that article on the Redalkemi site are explained very clearly and are really excellent.
Note that some search engines may not crawl certain sites unless a robots.txt file is present, so it is good practice to include one even if it allows everything.
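If you want to check how a compliant crawler would interpret your rules, Python's standard urllib.robotparser module can evaluate them. A minimal sketch; the rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules, supplied as in-memory lines
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Ask whether a generic crawler ("*") may fetch each URL
print(rp.can_fetch("*", "https://example.com/private/page.html"))
print(rp.can_fetch("*", "https://example.com/index.html"))
```

This prints False for the blocked path and True for the open one, which is a quick way to sanity-check a robots.txt before deploying it.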