
We've all seen them: a user survey, often performed by Respected Independent Firm but promoted by Vendor Y, that just happens to come up with the result that users want the products, or the features, that Vendor Y provides. Amazing how that works.

Here are six telltale signs that such a survey should be taken with a grain of salt.

1. Lack of detail about how the participants came to be included. For survey results to be statistically significant, the participants need to be randomly chosen. Even if they're all high-powered executives, you don't want them all to be in the same geographic area or industry. (You may even find that all the users just happen to be customers of Vendor Y.) Also, to ensure a representative sample, a reputable survey will typically contact users by phone or mail. Surveys performed by having users decide to visit a website are what's called self-selected: the people who participate are doing so either because they're really happy or because they have an ax to grind.

2. Look for the line "Respected independent firm X was commissioned by Vendor Y." That means Vendor Y paid for it. How likely is it that the results are going to be something Vendor Y doesn't like?

3. Meaningless graphs. Look for a graph measuring something impossible to quantify, such as "User satisfaction," typically with no units on the graph, and with the graph forming a perfect line or curve.

4. Lack of detail about the survey methodology. Are the questions listed? Is there a figure for margin of error, confidence level, or standard deviation? (Don't worry about what the terms mean. The important thing is that their presence indicates the survey was designed to be statistically valid.)

5. Vagueness in terminology. Look for terms that don't really mean anything, terms the vendor or the survey has invented, multiple terms for similar concepts without a clear explanation of how they differ, and so on.

6. The "Duh" factor. Look for Mom-and-apple-pie issues that of *course* any rational person would support, but that the survey presents in breathless terms.
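The margin-of-error figure mentioned in point 4 is easy to sanity-check yourself. Here's a minimal sketch in Python using the standard formula for the margin of error on a reported proportion; the 78%-of-250-respondents figures are purely hypothetical:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Margin of error for a reported proportion p from a random sample
    of n respondents, at the confidence level implied by z (1.96 ~ 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# A (hypothetical) survey of 250 respondents claiming "78% of users
# prefer Vendor Y" carries roughly a +/- 5-point margin of error --
# and that's before accounting for any self-selection bias.
moe = margin_of_error(0.78, 250)
print(f"+/- {moe * 100:.1f} percentage points")
```

Note that this formula assumes random sampling; for a self-selected web survey, no margin-of-error figure can rescue the results.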

Once you've read through this, use your favorite search engine to find press releases about surveys -- and see how many publications and websites picked them up. Play it with your friends!

Heh. I remember getting in trouble at the AP for picking up a story from a newspaper that involved some kind of polling/survey but didn't include the margin of error. It was in the early days of my career, and boy did I get reamed, especially since it also didn't say how many people participated, etc.

Now that I'm out of the biz, I might just have to play the internet game on surveys. Thanks. :-)

Normally I would say taking surveys is nothing but a big BS... there are other mediums for making a lot of money online with a lot more scope.

This is a great post because it touches on certain issues that are overlooked in today's "let's push the numbers" mode. First, the data is only relevant for the represented sample. This reminds me of the days when clinical trial results were advertised in commercials touting safety and efficacy -- mind you, the trial was only run on healthy males.
Second, remember: garbage in, garbage out. If data is not cleaned, sliced, and diced appropriately, it will affect the quality and veracity of the results.
Third, the vendor's goal is to push the product, so be very cautious about the survey results....
