Hello to all my friends. I am facing an issue related to GSC. The errors are: Excluded by noindex tag, Page with Redirect, Soft 404, Blocked by robots.txt, and Redirect Error. Can anybody tell me or guide me how to resolve these errors?

Firstly, let's understand the errors you're facing:

"Excluded by noindex tag" means that Google has detected a "noindex" directive on your web pages, which instructs Google not to index them. If you want Google to index these pages, remove the noindex directive from them.
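For reference, the directive usually looks like this in the page's `<head>` (the tag shown is the standard form; your CMS or SEO plugin may be the thing adding it):

```html
<!-- Remove this tag from the <head> if you want Google to index the page -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an HTTP response header (`X-Robots-Tag: noindex`), so if the meta tag isn't in the HTML, check your server or CDN configuration too.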

"Page with Redirect" appears when Google detects a page that redirects to another page. This typically happens when a page is moved or deleted and visitors are sent to a new URL, and it is often intentional: Google simply indexes the destination instead. You only need to act if the redirect is wrong; fix it by ensuring that the redirect is set up correctly and that the destination page is accessible.
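As an illustration (the paths and domain are hypothetical), a clean permanent redirect on an Apache server looks like this; the key points are a single hop and a destination that actually loads:

```apache
# .htaccess -- permanent (301) redirect from a moved page to its new URL
Redirect 301 /old-page https://www.example.com/new-page
```

If your site runs on nginx or another server, the equivalent is a single 301 rule pointing at the final URL, not at another redirect.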

"Soft 404" occurs when a page returns a 200 HTTP status code, indicating success, but its content suggests that the page does not exist or is an error page. Fix it by returning the correct HTTP status code (e.g., 404 or 410) for pages that don't exist, or by adding substantial content to thin pages that should be indexed.
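The quickest way to diagnose a soft 404 is to check what status code your server actually returns for a missing URL. Here is a small Python sketch; the local test server exists only to make the example self-contained, and in practice you would point `fetch_status()` at your own site's URLs:

```python
# Quick status-code checker: a soft 404 is a "missing" page that
# wrongly answers 200 instead of 404. Verify what the server really
# returns before (and after) fixing it.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def fetch_status(url: str) -> int:
    """Return the HTTP status code for url."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # urllib raises on 4xx/5xx; the code is still what we want

# Throwaway local server so the example runs anywhere (hypothetical paths).
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/exists":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"real content")
        else:
            self.send_response(404)  # a correct hard 404 for missing pages
            self.end_headers()
            self.wfile.write(b"Not Found")

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

print(fetch_status(f"{base}/exists"))        # 200
print(fetch_status(f"{base}/missing-page"))  # 404
```

If a URL that should not exist comes back as 200, that is the soft 404 Google is flagging.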

"Blocked by robots.txt" occurs when Google cannot crawl a page because a rule in your site's robots.txt file disallows it. Fix it by updating robots.txt so Googlebot is allowed to crawl the pages you want indexed.
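As an illustration (the paths are hypothetical), a rule like the first one below is what triggers this status; removing or narrowing the Disallow line resolves it:

```
User-agent: *
Disallow: /blog/    # blocks crawling of everything under /blog/

# To unblock /blog/ while still protecting a private area,
# replace the line above with something narrower, e.g.:
# Disallow: /admin/
```

Note that robots.txt only controls crawling, not indexing, so don't use it as a substitute for noindex.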

"Redirect error" occurs when Google detects a redirect that does not work properly, such as a redirect loop, a chain with too many hops, or a redirect pointing to a broken URL. Fix it by correcting the redirect and confirming that the final destination loads successfully.
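You can spot loops and over-long chains by following a redirect one hop at a time instead of letting the client auto-follow. The sketch below does this with Python's standard library; the local test server and its paths are hypothetical, there only so the example runs on its own:

```python
# Hop-by-hop redirect tracer: surfaces loops, over-long chains, and
# broken destinations -- the usual causes of a "Redirect error".
import threading
import urllib.error
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't auto-follow; let each 3xx surface as an HTTPError

opener = urllib.request.build_opener(NoRedirect)

def trace(url: str, max_hops: int = 10):
    """Return [(url, status), ...] for each hop of a redirect chain."""
    hops, seen = [], set()
    for _ in range(max_hops):
        if url in seen:
            hops.append((url, "loop detected"))
            return hops
        seen.add(url)
        try:
            with opener.open(url) as resp:
                hops.append((url, resp.status))
                return hops  # non-redirect response: the chain ends here
        except urllib.error.HTTPError as e:
            hops.append((url, e.code))
            if e.code not in (301, 302, 303, 307, 308) or "Location" not in e.headers:
                return hops  # a real error (e.g. 404), not a redirect
            url = urllib.parse.urljoin(url, e.headers["Location"])
    hops.append((url, "chain too long"))
    return hops

# Throwaway local server; in practice you'd call trace() on your own URLs.
class Handler(BaseHTTPRequestHandler):
    REDIRECTS = {"/old": "/new", "/a": "/b", "/b": "/a"}  # /a <-> /b is a loop
    def do_GET(self):
        if self.path in self.REDIRECTS:
            self.send_response(301)
            self.send_header("Location", self.REDIRECTS[self.path])
            self.end_headers()
        elif self.path == "/new":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"final destination")
        else:
            self.send_response(404)
            self.end_headers()
    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

print(trace(f"{base}/old"))  # healthy chain: ends in a 200
print(trace(f"{base}/a"))    # broken: loop detected
```

A healthy chain ends in a single 200; anything that loops, exceeds a few hops, or ends in 4xx/5xx is what Google reports as a redirect error.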

Now, let's move on to how you can resolve these errors on your website:

Excluded by noindex tag: Inspect the affected pages and remove the noindex directive from any page you want indexed.

Page with Redirect: Confirm each redirect points to the correct, accessible destination; if a URL should not redirect at all, remove the redirect.

Soft 404: Return a proper 404 (or 410) status for pages that no longer exist, or flesh out thin pages that Google is wrongly judging as empty.

Blocked by robots.txt: Remove or narrow the Disallow rules in robots.txt so Googlebot can crawl the pages you want indexed.

Redirect error: Fix redirect loops, shorten long redirect chains, and make sure each redirect's final destination returns a 200.

To apply these fixes, use Google Search Console's Page Indexing report to identify the specific pages affected, and the URL Inspection tool to verify each one after you make changes.

After your fixes, remember to request re-indexing via the URL Inspection tool in GSC, or re-submit the sitemap so Google re-crawls the affected pages.
