Since I am in Japan, I have no experience with the changes made by Google.
A simple collection of keywords may have been enough to get traffic before, but now quality will have to take precedence over quantity, and not only in the answers but also in the questions. I rarely wander out of the C/C++ forums, and there I think 90% of the questions are beginner-level. Even so, considering the simplicity of the questions, the threads run ridiculously long: it takes at least 3-4 posts just to get the OP to post some code. Developers don't have time to wade through dozens of pages to find a solution; it gets boring and wastes time. That is why Wikipedia sits at the top of the search results. It may not have the complete answer, but it is a great starting point for anyone who is not a freeloader. I for one haven't come across DaniWeb during my work-related Google searches. My main hits have been Wikipedia, Stack Overflow, Embedded.com, etc., and they point me in the right direction.
I read in a previous thread that Dani has made changes to encourage more advanced developers from the US/Canada to post here, so things should improve. But it won't be easy to reverse the damage the broken-English homework students have done over the past 5-6 years.
Moderators will also have to pay a lot of attention to the quality and accuracy of the answers. Right now, community-wise, long threads with as many users contributing as possible are preferred and even encouraged, regardless of whether they help the OP or not. But developer-wise, unless it is a genuinely technical discussion, I think that makes for poor quality. So the overall site policy may have to evolve a bit.
The main issue is that this algorithm change affects 'sites', not 'pages'. Therefore, it is not the case that poor-quality pages simply rank lower. Instead, Google's algorithm appears to have deemed the entire daniweb.com domain a useless "content farm", and has therefore demoted our rank for every single keyword any page on the site ranked for. It's like taking an instant -20 off an assignment before it is even graded because the student has a bad track record. Let the individual assignment speak for itself.
I have been doing random searches throughout the day for Microsoft technologies I'm proficient in, and not once did DW show up on the first page of results. I rarely ever venture onto the 2nd or 3rd page of a search. In almost all the searches, the MSDN forums have been most prevalent, alongside a few random article sites and "4 Guys from Rolla". Most of these are good sites for what I'm looking for, but they don't always give a complete answer. One more reason why I'd like to start writing tutorials.
As for the broken-English homework students, I refuse to answer their questions. I'd rather help them find the answer for themselves. You won't learn if the teacher always does it for you. This is the approach that has helped me the most, and I think it will help others too.
One change I wouldn't mind seeing is this idea addressed a little more directly in the moderation of the forums, even listed explicitly in the rules. It may already be there; it's been a while since I've read them.
Well, the change did indicate that duplicate content would be punished, and unfortunately tech-help forums do end up answering many of the same questions over and over again. New students roll into the same old curriculum, and we end up with 500 more threads about inventory programs.
I don't see how any large, long-running forum could avoid such duplication on many keywords.
That is not considered duplicate content, Ezzaral. In fact, it is a huge boost to SEO to have duplication of many keywords throughout a site. It demonstrates to Google that the site is an "authority" on that keyword/topic.
"Duplicate content" refers to when multiple websites have the *exact* same content. i.e. the same article has been posted to multiple websites, or one website syndicates the RSS feeds of another, etc. Search engines frown upon this because if you do a search for something, and there ends up being a relevant article, you don't want the entire first page of Google results to be that exact same article (regardless of how great and relevant and useful it is) but hosted on a different domain. The main problem Google has with duplicate content is determining which source is the original and which website copied which. Publishers get frustrated with Google when they put money and resources into having writers put out great content. Then, some mashup website scrapes their content, reposts the article, and they show up in the Google results instead of the original publisher.
I agree 100% with WolfPack. If we can find a way to rephrase questions into something sensible, I'm sure the Google search results will benefit us more. For instance, I searched for "Petition For Help" and it came up second on the first page of Google. If the question had read something like "Help us to get more hits", it would not even show on the first page.
If the questions can be "manipulated", the search results will obviously move us up the list. My 2 cents' worth. :)
"Petition for Help" came up on the first page because Google still loves DaniWeb where you live (You're not in the United States). The problem is that it isn't about individual page titles. Google has actually blanketed the entire daniweb.com domain with a penalty in the United States.
Well, the way I see it, if users find that the new Google results push the more informative content down the list, they will move to a better search engine. I remember using Yahoo search as my primary search engine in the early 2000s and switching to Google for exactly that reason. Let's see what happens; in the end, it is the users who decide. Content doesn't get better just because it appears at the top of some machine-generated list.
I've been doing various tech searches for work today, making it a point to use multiple engines with the same search terms, and DW isn't showing up much at all on any of them.
I noticed on Alexa that Daniweb's Traffic Rank improved by 533 positions yesterday. Some of the other numbers were up by significant amounts as well. That seems like a pretty significant improvement in one day. Is this an indication that Google is adjusting the algorithm and things are heading back towards something closer to 'normal'?
I posted the following in the above forum this morning...
I understand that recent changes to Google search (supposedly intended to reduce the ranking of content farms) have also harmed legitimate sites such as www.daniweb.com, apparently reducing traffic to the site by 50%. I have found this site to be a high-quality source of technical help across a wide range of disciplines and would hate to see it die because it somehow got caught in the net. Surely, with the technology available, some kind of exception list could be added to Google search so that quality sites like DaniWeb that are mistakenly identified as content farms are not treated unfairly. I contribute regularly to the forums in my areas of expertise and likewise receive help in areas where I am less proficient. Please give this matter serious consideration. The loss of DaniWeb would be felt by many.