What is a robots.txt file, what role does it play in SEO, and how can we create it?

All 12 Replies

The file lets you request robots not to index certain files.

In relation to SEO, it helps you control which pages get indexed, so you can have pages that offer no SEO benefit excluded from indexing. Also, restricting the number of pages that can be indexed increases the chance that your valuable pages will get indexed more frequently.

There are thousands of articles on the web explaining how to write one.

The robots.txt file is used to tell the search engines which pages to index and which pages not to index. It is commonly used to block the admin pages of a site.
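
For example, a minimal robots.txt that asks all crawlers to stay out of an admin area could look like this (assuming the admin pages live under /admin/; use your own path):

User-agent: *
Disallow: /admin/

The User-agent: * line means the rule applies to every crawler, and each Disallow line names a path they are asked not to fetch.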

Thanks for your valuable information, just one more question:

How can I create the text file? Is there any software for that, or do I create the file manually? Thanks.

Robots.txt helps you hide certain files from search engines, or choose not to show them. It keeps them out of the search engine's index.

The robots.txt file is a set of instructions that tells search engine robots which pages of your site should be crawled and indexed. In most cases your site consists of many files and folders, e.g. admin folders, cgi-bin, or image folders, which are not relevant to the search engines. Robots.txt helps tell spiders what is useful and public for sharing in the search engine indexes and what is not. Its purpose is to improve site indexation by telling search engine crawlers to index only your content pages and to ignore the pages (e.g. monthly archives, category folders, or your admin files) that you do not want to appear in the search index.
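
As an illustration only, here is roughly what such a file might look like; the folder names below are placeholders based on the examples above, so adjust them to your own site's structure, and the sitemap line is optional:

User-agent: *
# keep crawlers out of non-content areas (paths are examples only)
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /images/
Disallow: /category/
# optional: point crawlers at your sitemap (replace with your own URL)
Sitemap: https://www.example.com/sitemap.xml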

The robots.txt file is used to tell search engines which pages to index and which not to index. In other words, it is the file where we can tell search engines what to follow and what not to follow.

Create a robots.txt file:
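
No special software is needed. A common approach: open any plain text editor (Notepad, for example), type the directives, save the file as robots.txt, and upload it to the root of your site so that it is reachable at www.yoursite.com/robots.txt (domain shown here as a placeholder). The simplest possible file, which allows everything to be crawled, is just:

User-agent: *
Disallow:

A Disallow line with no path blocks nothing; add a path after Disallow: for each area you want crawlers to skip.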

Thank you jonywags, that solves my problem....

Hi friends, the robots.txt file is basically used to tell search engines which pages to crawl and which not to crawl. Thanks,

It is retrieved before crawlers follow links to pages on your website; it is what tells the Google spiders which pages to visit and which not to.

It instructs search engine robots on how to crawl your website.

The robots.txt file is used to block any pages of your website which you don't want to be indexed by search engines.

Robots also discover pages by following links, though link building itself is mostly an off-site SEO activity.
