Dear All,

I want Googlebot to visit all my web pages. What robots.txt code should I use to allow Googlebot to visit all of my web pages?

That would override any Disallow directives you may have defined, but only for those web crawlers that observe the Allow directive. Others could still be blocked by your Disallow rules.
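For reference, the kind of blanket-allow file being talked about would look something like this:

    # Applies to every crawler that reads robots.txt
    User-agent: *
    # Allow is not part of the original standard; Googlebot honours it,
    # but some other bots ignore it and only act on Disallow lines
    Allow: /

Crawlers that don't understand Allow generally just skip that line and fall back to whatever Disallow rules are present.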

You need a robots.txt file only if your site includes content that you don't want Google or other search engines to index. To let Google index your entire site, don't make a robots.txt file (not even an empty one).

Not even an empty one? But what if the site contains content that is not discoverable through the normal link crawling process?

Providing a minimal robots exclusion file with just a sitemap reference, as Dani first suggested, will help ensure all pages are crawled, even the ones that aren't linked from anywhere.
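Something along these lines would be enough (the sitemap URL below is just a placeholder; substitute your own):

    # Block nothing; an empty Disallow allows everything
    User-agent: *
    Disallow:

    # Tell crawlers where the sitemap lives
    Sitemap: http://www.example.com/sitemap.xml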

I realize the OP is only asking about Google. Yes, it's possible to notify Google about a sitemap through their Webmaster Tools, but generally speaking it's more practical to use the robots exclusion file. That way you only need to specify it once, and every crawler will have the opportunity to discover it.
