When I checked the sitemap section of my Webmaster Tools account, it showed that 170 pages were submitted and 159 pages were indexed. Why are the remaining 11 pages not indexed, and how can I get those particular 11 pages indexed?

Recommended Answers

All 7 Replies

Google looked over every file in the sitemap, and discarded those it did not care for.

  • duplicated content (and/or sufficiently similar content)
  • badly formed code that the bot bounced off

Those are possible reasons, not *the* reasons. 159/170 = 93%: that's good. Few sites (no sites) are totally indexed, and the remaining pages will still be reached by users if they need them.

Search engines do their indexing automatically, with the help of spiders or bots. We cannot schedule the indexing process nor can we tell the exact time period; we can only submit the full list of links in a sitemap. So carry on with your regular SEO work and Google will crawl the pages in the near future.

Just wait for some time, since it can take longer for certain pages to be indexed.

Google looked over every file in the sitemap, and discarded those it did not care for.

The most likely reason :-)

We cannot schedule the indexing process nor can we tell the exact time period.

But you can give search engines a hint by using sitemaps, see: `<changefreq>`.
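A minimal sketch of what such a sitemap looks like, built here with Python's standard library (the example URLs are made up; note that `<changefreq>` is only a hint, which crawlers are free to ignore):

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a minimal sitemap; `pages` maps each URL to a change-frequency hint."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, freq in pages.items():
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # <changefreq> is advisory only: "daily", "weekly", "monthly", etc.
        ET.SubElement(url, "changefreq").text = freq
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap({"https://example.com/": "daily",
                     "https://example.com/blog": "weekly"}))
```

Submitting the file is then just a matter of placing it on the server and referencing it in Webmaster Tools or robots.txt.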

Just wait for some time, since it can take longer for certain pages to be indexed.

Yes, it can take a while to crawl every page, but why would certain pages take longer than others? :-s

You can identify exactly which pages a search engine has crawled, and when, by checking your web server logs. Visits by Google will contain 'GoogleBot' in their User Agent string. User Agent strings sometimes lie, but you can discover whether a visit is genuine by using a reverse DNS lookup, see: Verifying GoogleBot
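That reverse-DNS check can be sketched in a few lines of Python. This follows Google's documented verification method (reverse-resolve the IP, check the domain, then forward-resolve and confirm it maps back); `is_genuine_googlebot` performs live DNS lookups, so treat it as illustrative:

```python
import socket

# A genuine Googlebot reverse-resolves to a host under one of these domains.
GOOGLE_DOMAINS = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    """True if the hostname belongs to one of Google's crawl domains."""
    return hostname.rstrip(".").endswith(GOOGLE_DOMAINS)

def is_genuine_googlebot(ip):
    """Reverse-DNS the IP, check the domain, then forward-resolve the
    hostname and confirm it maps back to the same IP address."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)          # reverse lookup
    except socket.herror:
        return False
    if not hostname_is_google(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward-confirm
    except socket.gaierror:
        return False
```

The forward-confirmation step matters because anyone can configure reverse DNS for their own IP range to return a Google-looking hostname; only Google can make the forward lookup resolve back to that IP.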

There are quite a few things that can prevent pages from getting indexed. Google Webmaster Tools will help you to identify many of these, but not all. For example, if Google has found identical content on another site and decides that site is more authoritative than yours, I suspect you won't find the issue reported in GWT.

Is it possible you have some URLs listed more than once in your sitemap?
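One quick way to check for that, sketched with Python's standard library (the sample sitemap below is made up for illustration):

```python
from collections import Counter
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def duplicate_urls(sitemap_xml):
    """Return any <loc> values that appear more than once in a sitemap."""
    root = ET.fromstring(sitemap_xml)
    counts = Counter(loc.text.strip()
                     for loc in root.iter(SITEMAP_NS + "loc"))
    return [url for url, n in counts.items() if n > 1]

sample = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/a</loc></url>
  <url><loc>https://example.com/b</loc></url>
  <url><loc>https://example.com/a</loc></url>
</urlset>"""

print(duplicate_urls(sample))  # any URL listed more than once
```

If duplicates turn up, the submitted-vs-indexed counts in Webmaster Tools will never match, since each URL can only be indexed once.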

Well, 159 indexed out of 170, that's good! As for the pages that weren't indexed, for many of them it might be because of duplicate content, page-not-found errors, etc. But I don't think it's going to create any issue.


Google takes its time to index a site. If the site is good enough and has good-quality content, it gets indexed quickly; otherwise it takes an average amount of time. As you said, 159 pages out of 170 are indexed, which means the other 11 may not follow Google's guidelines or may have duplicate content. You should check those pages and make changes where required.
