Hi folks,

There is a popular myth about duplicate content: that Google penalizes sites for having it.

Greg Grothaus of Google's Search Quality Team clarified that Google itself does not penalize you for it.

Google recognizes that most duplicate content is not created to be deceptive. There are, of course, exceptions, which are treated as spam; those sites are penalized for being spam, not for having duplicate content.

Still, there are some issues that can arise and negatively affect your rankings:

1) Your link popularity will be diluted. Backlinks pointing to several different URL versions of the same content (like domain.com, www.domain.com, and domain.com/index.htm) make it harder to accumulate link juice for any one URL (a quick sketch of the usual fix follows this list).

2) User-unfriendly URLs in search results may undermine branding efforts and decrease usability as well.

3) With multiple versions of the same thing, Google will spend more time crawling the same content, leaving it less time to go deeper into your site, so you run the risk of some content not getting indexed.
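
As a quick sketch of how issue 1 is usually handled (this isn't something Greg covers, and domain.com is just a placeholder): you can tell Google which URL version you prefer by putting a canonical link element in the <head> of every duplicate version of the page, e.g.

<link rel="canonical" href="http://www.domain.com/" />

That way, the link juice spread across domain.com, www.domain.com and domain.com/index.htm gets consolidated onto the single URL you actually want to rank. A permanent (301) redirect to the preferred URL works too.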

More interesting reading at webpronews.com/topnews/2009/09/16/google-busts-the-duplicate-content-myth

All 6 Replies

Will Google even detect content that is duplicated but edited in such a way that bots won't recognize it? And if someone posts the content but gives credit to the original source, will Google still penalize the website?

The appropriate way to reproduce content is to use the <blockquote> HTML tag, and use <blockquote cite="original source url"> to give credit. This tells Google that your content is reproduced, so if someone does a Google search, Google will show the original content and not yours (meaning you won't really get any Google traffic off a page with a reproduced article), but you won't get any kind of duplicate content penalty yourself either. And that's a good thing.
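
For example (the URL and the quoted text below are made up just for illustration):

<blockquote cite="http://www.example.com/original-article">
The portion of the original article that you are reproducing goes here.
</blockquote>

Keep in mind that most browsers don't display the cite attribute to visitors, so it's still worth adding a visible link to the original source next to the quote.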

You just posted the same link that sam09 did?

Also, sam09 is talking about the same content appearing under multiple URLs on the same site. Duplicate content usually revolves around the same article or content appearing across multiple websites (aka plagiarism).

Sorry, so my post can be considered duplicate content. :)

Appropriately enough, DaniWeb uses <blockquote cite=""> for all quote bbcode, where the cite URL is the permalink to the individual post you cited.
