Robots.txt is used to guide search engine crawlers through your website.
If you want a specific folder or set of pages not to be crawled by a particular search engine, robots.txt can help you. It can control the access of any search engine throughout your complete website.
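For example, to block one crawler from a single folder while leaving the rest of the site open, the file might look like this (the folder name here is just an illustration, not a required path):

```
# Block Googlebot from one folder only
User-agent: Googlebot
Disallow: /private-folder/

# All other crawlers may access everything
User-agent: *
Disallow:
```

The file must sit at the root of the site (e.g. https://example.com/robots.txt) for crawlers to find it.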
The robots.txt file controls crawler behavior. I am copy-pasting some lines from an article I found on the Redalkemi site. I hope it is useful to you.
Here are some tips on how to use a robots.txt file:
1. The robots.txt file is always named in all lowercase (e.g. Robots.txt or robots.Txt are incorrect).
2. The robots.txt file is an exclusion file meant for search engine robot reference and not obligatory for a website to function. An empty or absent file simply means that all robots are welcome to index any part of the website.
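To make tip 2 concrete: a robots.txt containing only the following is equivalent to an empty or absent file, since an empty Disallow value excludes nothing:

```
# Explicitly allow all robots to crawl the entire site
User-agent: *
Disallow:
```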