My new website's page is showing 63 W3C validation errors. I want to know whether this will affect my ranking?


No, it doesn't affect ranking directly. You need to understand that cleaning up your code makes your site render better in browsers. It also makes your page accessible to people on portable devices, which plays an important role in improving its popularity, and so on.

W3C errors can and will affect your SERPs if they are serious errors. A lot of crawlers have improved to the point that they can overlook many minor errors (forgetting to close a tag, for example). However, if an error is serious enough, or there are enough errors, the crawlers will sometimes return an error and be unable to properly crawl your website. If this happens, you are only credited with the content they can crawl, which could lead to your website being contextually misplaced within that search engine's rankings - i.e. not placed where you truly should be. The best way to see whether your errors are affecting your rankings is to check the crawl errors under Google Webmaster Tools.
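If you want a quick local look before digging into Webmaster Tools, here is a minimal sketch (PHP, since that's what most of the stacks mentioned in this thread run on) that uses DOMDocument to surface the parse errors a strict crawler might trip over. It's purely illustrative - it is not how Google actually crawls, and the URL is a placeholder:

<?php
// Hypothetical local check: collect the parse errors in a page's
// markup - roughly the kind of breakage a strict crawler might hit.
libxml_use_internal_errors(true); // collect errors instead of printing them

$html = file_get_contents('http://www.example.com/'); // page to test
$doc  = new DOMDocument();
$doc->loadHTML($html);

foreach (libxml_get_errors() as $error) {
    // line number and message for each problem the parser found
    printf("line %d: %s\n", $error->line, trim($error->message));
}
libxml_clear_errors();

The crawl errors report in Webmaster Tools is still the signal that matters; this just gives you a fast look at how clean the markup is.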

If you are getting incomplete crawls, you should probably start by fixing those errors. If not, move on to other SEO priorities (i.e. creating content). If you are getting incomplete crawl errors, also remember that it's going to take a minimum of three days before Webmaster Tools reflects the change, and that's if you are being crawled daily.

Remember, the web has been going semantic for a long time - so all it takes is a competitor with a semantically correct website and content highly competitive with yours to beat you out.

commented: Don't know why someone downvoted this post; it is correct. +13

No, at least not directly. Agree with Kelly!

@PixelatedKarma I have talked with a number of webmasters about this, and they all seemed to agree that it doesn't affect ranking directly, unless the whole page is badly built. So my answer still stands: it doesn't affect ranking directly, and I don't think Google has published anything that proves W3C validation will.

Code errors affect your search engine results if and when the errors are serious enough to cause display errors in any of the major browsers.

Deep in Google's members-only forums, the rendering algorithms are discussed heavily.
Search engine bots follow the page; if the page is broken, the bots stop at the break.
A simple, fairly accurate test of how serious the errors are is to run the site through Browsershots:
serious errors display as blank or partial screens in the screenshots.
If blank screens display, fix the code as a matter of urgency;
otherwise, fix the code errors in site updates as a maintenance issue.

A full list of remediation links is posted as a ReadMe in the Website Reviews forum.

Your site works in Browsershots, so the code errors are not significant.

That's fine, Kelly, and you can continue with your current practices, but kiss your keyword ranking efforts goodbye to some degree. If the crawler terminates the crawl because of errors, your whole page won't be crawled correctly; wherever that break is (as almostbob mentioned) marks the last part of the page the crawler is going to give you credit for. That's fine if your errors are all found in the footer section - not so great if the crawlers are stopped before or slightly after the fold, which in turn makes all that wonderfully written copy after the termination pretty much pointless in ranking you in SERPs.

@PixelatedKarma now that's not at all friendly, I must say! You seemed to be .. anyway, I guess I'll give my thought a try, and to some extent I would agree with you, but I still stick to my point that it doesn't affect ranking directly. I would rather look forward to some administrative replies here, like if @dani or @happygeek speak out about it!

W3C validation errors may or may not affect ranking, but fixing validation issues will definitely improve your website's visibility, and this will indirectly improve ranking.

Sorry if you took it the wrong way, Kelly, so let me try a different approach. Imagine you were working in PHP (since most people doing SEO nowadays use PHP-based scripts such as WordPress, Drupal, or Joomla, I figured this might be the best example). If you are requiring code snippets into your page (versus using includes) and a fatal error occurs in one of those requires, or if the required file isn't at the location you specify, the page cannot load; the PHP cannot run, so the end user cannot view the page. Now imagine the end user is the web crawler - how can you expect that crawler to properly index a page which isn't functioning properly?
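To make that concrete, here is a minimal sketch of the difference (the file names header.php and missing-widget.php are made up for illustration): include raises a warning and the rest of the page still renders, while require raises a fatal error, so nothing after it - including the copy a crawler would index - ever reaches the browser.

<?php
// index.php - hypothetical example of include vs. require

// include: if header.php is missing, PHP emits a warning
// and carries on, so the content below still renders.
include 'header.php';

// require: if missing-widget.php is missing, PHP emits a fatal
// error and halts right here - nothing after this line is sent
// to the browser, or to a crawler.
require 'missing-widget.php';

echo '<p>All the carefully written copy down here never renders.</p>';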

The same holds true for web crawlers: if they hit a serious error, regardless of the web language, and cannot execute the crawl - or in some cases can execute it but not complete it - then your website won't be indexed where it should be. This means that everything up to the point of the serious error generally gets crawled, but the content after it doesn't (again, the beautifully written copy occurring past the error doesn't register with the crawler). But again, this applies to serious W3C errors (in the case of HTML and CSS). So minor errors, such as the difference between:

<b><p>I am bold</b>I am the rest of the paragraph</p>

and

<p><b>I am bold</b>I am the rest of the paragraph</p>

probably won't make much of a difference. However, an error like forgetting the body tag or the closing head tag can cause some serious mayhem and is just one more point your competitor can have to rank higher than you in the SERPs. And I think we can all agree that SEO is about competition at the end of the day; my goal, your goal, his goal, her goal are all the same - to outrank each other.
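To illustrate (a contrived example, not taken from any real site), this is the kind of structural breakage that can throw a parser off - the closing head tag and the body tag are missing entirely, so the crawler has to guess where the head ends and the content begins:

<!-- contrived example: no closing </head> tag and no <body> tag -->
<html>
<head>
<title>Keyword Rich Title</title>
<p>The parser hits this paragraph while still inside the head; some crawlers recover, others stop here.</p>
</html>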

Now what about the minor W3C errors? As an SEO professional, would you rather have this (you should get an error for the missing alt attribute if you were to run this through the W3C validator):

<div id="header">
<a href="/"><img src="/images/logo.png" width="300" height="100"></a>
</div>

or this:

<div id="header">
<a title="Possible long tail keywords" href="http://www.example.com/"><img src="/images/keyword.png" width="300" height="100" alt="Possible Keyword"></a>
</div>

For contextualizing a link and helping the search engine categorize your website, I know personally I'd rather take the second option, which should be W3C compliant and adds that little bit of extra help to rank for my target keywords.

In summary, the goal in SEO is to be compliant with search engine best practices, and both Bing and Google have stated that the semantic web is a best practice. On the same note, good content will always trump semantic code, which is why error-riddled webpages can and still do rank higher at times than semantically correct webpages.
