Got your attention? That might be a stretch, but one thing is for certain: figures might not lie, but liars figure (at least according to Mark Twain). The problem is, it's remarkably easy to take legitimate numbers and spin them to mean whatever you want them to mean. It's even easier to pose poll questions in a way that pushes subjects to give the answers you want. That makes surveys, polls, and questionnaires…well…questionable. Think about it. How many times have you been asked a question and given a less than truthful answer? If you're like most people, the answer is "a bunch." It turns out most people answer survey questions with an eye to what they think the people running the survey want to hear. Others simply don't care. Some like to skew the results just for fun. Then again, some take surveys seriously. The problem is, it's almost impossible to tell the accurate data from the bad. Oh, sure, you can try making the surveys double-blind, expand your sample to eliminate statistical aberrations, even try to cloak the party who commissioned the survey. Those techniques can help – a little – but they cannot overcome the central reason that surveys are inaccurate.
It's true. (At least as far as you know.) Everybody lies from time to time. It's human nature. Getting someone to tell the truth – and knowing whether they have – is the holy grail for pollsters. Short of Diana Prince's lasso, there's no 100% certain way to make sure your respondents are being straight with you.
What does this mean for your marketing? Well, as for me, I tend to take surveys with a certain degree of skepticism. If they can pass my "sniff" test, I'll use them, but only as one facet of market research. Frankly, I think if you can get data that shows behavior (especially behavior that equates to voting with wallets), you're a great deal closer to knowing what people think than you would be by relying on surveys alone.
Here's a great case in point. I used to work for a software company. We conducted a survey of registered users that asked what features they'd like to see added to our software. The data we received was…odd. The distribution of responses was a little too spread out for us to determine a consensus. The company finally went to my department – Creative Services – and asked us what we thought. We had the benefit of using not only our own products for work (we made graphic design software), but also the competition's. The key was that, in looking at both our products and the competitors', we had a better idea of what was possible than did our customers, who saw things only from the perspective of our own products.
The bottom line was that we were able to act as power-user consultants for the development group, giving them some valuable input. To skew things even further, a couple of developers came up with ideas nobody else had thought of – ideas that proved groundbreaking when implemented as features in our products.
If the company had relied solely on surveys, we'd have released a product nowhere near as cool and useful as the one we shipped. Because we used those surveys as just one facet of our research, we ended up with a stronger product.
So the next time you look at data and hear someone pontificating about how it makes their contentions regarding the "facts" unassailable, take a step back and remember this:
70% of those surveyed agreed:
There are two kinds of people in this world…
Those that separate people into two categories…
And those that don’t.
Or words to that effect.