I was analysing my site and found that it is missing Dublin Core metadata. All I know is that Dublin Core has something to do with meta content, but what Dublin Core actually is and how it works is still a mystery to me.


All 2 Replies

In the opinion of whatever tool or site you tested with, your site is missing some of the "Dublin Core" metadata elements, a metadata description scheme defined at a meeting in Dublin in 1995.

In my view it is not relevant, current, or useful:
  1. Ideas from 1995 are outdated.
  2. A standards group that does not include Google, Apple, or Microsoft is irrelevant.

The list of metadata Google understands is what's important.

Current SEO information is available from your Google Webmaster Tools account, where you will receive individual information on your site from Google: they make the rules.

There are other search engines, but combined they make up less than 25% of searches; insignificant, and they mostly just follow whatever Google does.

The idea behind Dublin Core is pure standardization: cataloguing multiple types of resources, including but not limited to webpages and online media such as videos and pictures. It's debatable how much weight search engines give to sites that use Dublin Core meta tags, but the general consensus seems to be that if you had two sites that would otherwise rank in the same position, the one with Dublin Core metadata would probably rank one spot higher in the SERPs, because Dublin Core is considered a minor SEO modifier. Honestly, we (Pixelated Karma) now use Dublin Core on all of our web projects, especially the larger and more complex ones where we implement an internal search engine; it makes building an internal search engine a breeze, especially on extremely dynamic websites and web apps. However, if you are thinking of converting an old site to Dublin Core, I personally don't think it's worth it.
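For reference, the classic Dublin Core Metadata Element Set defines 15 elements, which are usually emitted as plain `<meta>` tags in the page head. Here is a minimal sketch in Python; the page values are invented purely for illustration:

```python
# The 15 classic Dublin Core elements (DCMES 1.1).
DC_ELEMENTS = [
    "title", "creator", "subject", "description", "publisher",
    "contributor", "date", "type", "format", "identifier",
    "source", "language", "relation", "coverage", "rights",
]

def dc_meta_tags(page):
    """Render whatever DC values a page provides as <meta> tags."""
    tags = ['<link rel="schema.DC" href="http://purl.org/dc/elements/1.1/">']
    for element in DC_ELEMENTS:
        if element in page:
            tags.append(f'<meta name="DC.{element}" content="{page[element]}">')
    return "\n".join(tags)

# Hypothetical page, just for illustration:
page = {"title": "Example Article", "creator": "Jane Doe",
        "subject": "web design", "language": "en"}
print(dc_meta_tags(page))
```

Note that you only emit the elements a page actually has values for; all 15 are optional and repeatable.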

The reason I say it isn't worth converting older websites is that, unless you have planned a complete redesign or a major re-launch, the conversion isn't going to make a significant enough change to your website to be worth it; not when other SEO tactics, such as increasing social signals and improving content, are shown to be far more valuable for increasing your rank. That said, if that old website uses a good template system, it might be a really fast and easy change to implement. Remember, though, that this isn't 1995: every webpage should have its own description, title, and so on, which will complicate the process a bit if you are using a static header templating system where you can't assign a dynamic title or description to every page. You should have moved away from that sort of system already, but we are talking about old sites.
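To make the templating point concrete, here is a toy sketch of per-page head rendering in Python. The page store, template, and fallback values are all hypothetical; a real site would pull these from a database or CMS:

```python
# Hypothetical per-page metadata store; a real site might load this
# from a database or from front-matter in each page file.
PAGES = {
    "/about": {"title": "About Us", "description": "Who we are."},
    "/services": {"title": "Services", "description": "What we do."},
}

HEAD_TEMPLATE = """<title>{title}</title>
<meta name="description" content="{description}">
<meta name="DC.title" content="{title}">
<meta name="DC.description" content="{description}">"""

def render_head(path):
    # Fall back to site-wide defaults when a page has no entry,
    # rather than shipping one static header for everything.
    meta = PAGES.get(path, {"title": "My Site", "description": "My site."})
    return HEAD_TEMPLATE.format(**meta)

print(render_head("/about"))
```

The point is simply that each page's title and description come from data keyed by the page, not from a single shared header.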

On one final note: the purpose of Dublin Core is to connect a semantically correct internet, not one held together by link building (yes, if Dublin Core were widely adopted, you could kiss your link-building strategies goodbye). The idea is to skip the links altogether and let crawlers group similar types and subjects of data together. If you have any experience with databases, think of it as converting a NoSQL database to an SQL database: the NoSQL database is the web as it is today, with millions or billions of websites that are only publicly findable because of inbound links. The goal of Dublin Core is to take all that unorganized data (the sites relying on inbound links) and put it into the right table, row, and column, in a way that says "All these sites are about web design; they go here!" In the end, a system like this would be extremely efficient and effective, but it is not fully functioning as intended right now for publicly accessible assets on the internet; it really only works with internal and private webpages. By the time webmasters and developers implement this to the point where it achieves its creators' vision, the Internet of Things will rule supreme, and the supercomputers that Google, Baidu, etc. are constantly working on will have the AI to catalogue data a lot more accurately, quickly, and efficiently than the semantic web as we see it today, and definitely quicker than trying to ensure that the 16-million-plus websites that come online each month are compliant with semantic content.
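The "right table, row, and column" idea above can be sketched as a crawler grouping pages by their DC.subject tag instead of following links. A toy index in Python, using only the standard library; the example pages are invented:

```python
from collections import defaultdict
from html.parser import HTMLParser

class DCSubjectParser(HTMLParser):
    """Pull DC.subject values out of a page's <meta> tags."""
    def __init__(self):
        super().__init__()
        self.subjects = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "DC.subject":
            self.subjects.append(a.get("content", ""))

def index_by_subject(pages):
    """pages: {url: html}. Returns {subject: [urls]}; no inbound links needed."""
    index = defaultdict(list)
    for url, html in pages.items():
        parser = DCSubjectParser()
        parser.feed(html)
        for subject in parser.subjects:
            index[subject].append(url)
    return dict(index)

# Invented example pages:
pages = {
    "a.example": '<meta name="DC.subject" content="web design">',
    "b.example": '<meta name="DC.subject" content="web design">',
    "c.example": '<meta name="DC.subject" content="databases">',
}
print(index_by_subject(pages))
```

With metadata like this, "all the sites about web design" is a single lookup rather than a link graph to crawl.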

Case in point: Google has been trying to get people to use H1 tags correctly for years; how could we force every webmaster to adopt the 15 standard Dublin Core meta tags across the entire internet?

Hope that answers your question. Let me know if I lost you with my rambling =)
