Recently I posted to many platforms but didn't get a sufficient answer. Can someone tell me why Google has removed my website from search? I have the right pages and have tested on many SEO tools, and everything looks fine, yet Google still removed my site. Is it because of Cloudflare? Someone said it's because of a bad neighbourhood. Can anyone suggest a solution? My site is https://www.leatherjacketsales.com/

If you check the image, on Dec 9, 2019 there were 43 products, and now on Jan 15, 2020, within a month, almost the whole site has been deindexed for an unknown reason.


Look in the Coverage and Sitemaps sections to see if you have pages that have inadvertently been noindexed. If you break your site up into multiple sections, each with its own sitemap file, you can narrow down which sections Google doesn't like.

Way too many reasons.

Yes, there are lots of reasons. But luckily Google Search Console does provide some tools to give us insight into what those reasons might be so that we can get to the bottom of it and correct the problem.

The second article rproffitt linked to suggests exactly what I posted above, which is noindex errors.

But, yeah, the Coverage section of GSC should give you some insight into what all the pages that are indexed have in common and what all the pages that are not indexed have in common.

If you'd like, export the results of the Coverage and Sitemaps reports here so I can take a look myself and come up with something actionable for you to do.


Hi, thanks for such fast and helpful replies. I have attached the coverage screen. Actually, when I first indexed this site, it was deindexed within a few days. I indexed all the pages again, and this time it stayed: I started getting traffic for my keywords, and the site even took some top positions in the images section. Now it has suddenly been deindexed again. I read the article that was submitted, and it seems I should pay some attention to the web server, DNS server, or Cloudflare. Still, if there is anything else you suggest I focus on, please help. Thanks again.


Hi Ali,

It's the "Excluded" section of that report that would shed some insight.

Sorry for the typo, and here is the sitemap screen.


Nono ...

What I need is for you to click on Coverage and then unclick on "Error" and click on "Excluded" instead.

Also, it looks like you aren't using any sitemaps. You should begin doing that.

When you get a chance, please let me know what that report says. That should answer all your questions :)

Sorry for posting so many times in a row, but just to clarify ...

  • In the left sidebar, click 'Coverage' under the Index section
  • Click 'Excluded' to filter for excluded pages, and un-click 'Error' if it's selected
  • For each category in the Details section, click the category, and then take a screenshot of the links it shows

Hi Dani, I think these are the stats you wanted me to share. Please check, and also tell me what else I can provide that would help. It also includes sitemap stats fetched using the API.
Let me explain a bit more: previously I had this website developed in OpenCart, but due to the same issue of deindexing immediately after submission, we changed it to WordPress, and all the URL redirects are mapped correctly. The actual images directory was not removed, so the old images are still accessible to Google and the public.
It performed well for the last 2 months, but then suddenly dropped again.






The sitemap was generated and submitted months ago. Some tools say 127 URLs are indexed, and most URLs in the inspection tool show as indexed and fetched. But site:leatherjacketsales.com now returns, I think, 15 or fewer results. Also, the few keywords showing in recent WMT stats relate to those few URLs only; the majority are gone, as the URLs have disappeared.

Looks like the problem is right there ... 1200 pages were removed from the index because there's an alternate page with the proper canonical tag.

What that means is that, when using a CMS, there are often multiple ways to access the same page. For example, the URLs www.example.com/foo.php and www.example.com/foo.php?status=true might land you at the same page with the same content.

Canonical tags are HTML tags added to the HTML head that indicate which URL you want to be indexed. This way Google doesn't have to spin its wheels indexing 500 different versions of the same duplicate content when you can just tell it which page is the "authentic" one.
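To illustrate, a canonical tag is just a `<link rel="canonical">` element in the page's `<head>`. Here's a minimal sketch (with hypothetical URLs) that reads one out of a page using Python's standard library, so you can check what a given page declares:

```python
# Minimal sketch: extract the canonical URL a page declares.
# The URLs below are hypothetical examples, not your actual pages.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Both /foo.php and /foo.php?status=true would serve this same head,
# telling Google that /foo.php is the one true version.
page = """<html><head>
<link rel="canonical" href="https://www.example.com/foo.php">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://www.example.com/foo.php
```

In practice you'd fetch the live HTML of both URL variants and confirm they declare the same canonical; if they do, Google dropping the non-canonical variant from the index is expected behavior.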

Google had multiple versions of the same page indexed, and it discarded the pages that had indicated that a different URL already in the index was the canonical version.

End result: Google didn't stop indexing any of your content. They just dropped URLs that were duplicates of other URLs already in the index.

Does that make sense?

Way too many reasons. That is, you have your work cut out for you now.

It wasn't that hard after all :)
