The robots.txt file lets you ask search engine robots not to index certain files.
In terms of SEO, it helps you control which pages get indexed, so you can exclude pages that offer no SEO benefit. Restricting the number of pages that can be indexed also increases the chance that your valuable pages will be crawled and indexed more frequently.
There are thousands of articles on the web explaining how to write one.
A robots.txt file is a set of instructions that tells search engine robots which pages of your site should be crawled and indexed. In most cases, your site consists of many files and folders (e.g. admin folders, cgi-bin, image folders) that are not relevant to search engines. Robots.txt helps tell spiders what is useful and public enough to share in search engine indexes and what is not. Its purpose is to improve site indexation by directing crawlers to index only your content pages and to ignore pages you do not want to appear in search results (e.g. monthly archives, category folders, or your admin files).
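As a minimal sketch of the idea above, here is what such a file might look like; the folder paths are hypothetical placeholders, so substitute the ones that actually exist on your site:

```
# Rules below apply to all crawlers
User-agent: *

# Block folders with no SEO value (hypothetical paths)
Disallow: /admin/
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /archives/

# Everything not disallowed remains crawlable by default
```

The file must live at the root of your domain (e.g. https://example.com/robots.txt) to be found by crawlers. Note that `Disallow` asks robots not to crawl a path; well-behaved crawlers honor it, but it is a request, not an enforcement mechanism.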