5 questions you should ask before believing a survey result.

My, surveys are very popular in the world of lingerie, aren't they? And they get taken very seriously. I find this pretty laughable, because I spent years training in the art and science of surveys. Doing one properly - getting the right people for it, asking the right questions to test your ideas, wording those questions so they say what you mean and are understood, collecting the results sensibly, and representing the results in a way that shows them accurately - is genuinely challenging. So next time you see yet another all-singing, all-dancing press release about a survey, here are five questions you might want to ask before taking it as the whole truth.

I don't have anything relevant and legal to illustrate this with, so I'm just going to use our landscape images as a blatant way to break up the text with pictures. Because if there is one reliable survey result, it's that people like pictures!

1) Who ran the survey?

Do they have a vested interest in the result? I wave a cynical eyebrow in the direction of any survey that comes up with a result that benefits the surveyors - whether that's customer research by lingerie companies that specialise in bra fit and magically establish that most of us wear the "wrong" size and need to go to a store and get fitted, or pressure groups who mysteriously find that the exact thing they're focussed on is a major problem for the majority of people who answer. Even surveys that might count as science can end up very flawed, depending on who is funding them and the biases of the researchers involved. Everybody has a bias!

It's very easy for us to produce the results we want without even knowing that we're doing it. That's just how people are wired. If you're being paid to do so, it's even easier. You can play a game here that's great for showing how this works.

2) Who completed the survey?

The easiest way to get people to do a survey is to stick it online and tell people you know. This method is great for finding out whether your friends prefer bowling to the cinema, or if you're a brand that's only interested in its own customers.

It's also a great way of making everything else go wonky, because people who fill out surveys online tend to have some reason for doing so - a vested interest in the topic, more free time than average, or an eye on the academic credits/gift voucher/whatever other bribery is involved. Basically, volunteers are odd. You can magnify this effect enormously simply by failing to talk to anyone outside your own interest groups about your survey, and by making the survey loooooooong. Making it long pretty much guarantees that only people with highly unusual lifestyles or an unusual focus make it all the way to the end.

I recently saw a survey done in the maker community (if you haven't run across that, it's basically technological DIY. Can we hack it? Yes we can! They like Raspberry Pis, fixing stuff, and robots). Anyway, I got asked to take a look, and fifteen questions in I realised that 97% of the respondents identified as men. OK, the maker community is gender imbalanced - but not THAT imbalanced. The only time you should get a result like that is if you're surveying products designed very specifically on a gender basis, or issues that affect people on a gendered basis, like testicular cancer.

In their case, it looks like they made no effort to get women or girls to fill in their survey, and then also made it about a million pages long. Only the really committed, invested, obsessive folks will fill in a very long survey on a topic, and they tend to represent only a few percent of a community or population.

You should also wave a leery eyeball at any survey filled in by fewer than 1,000 people if it purports to represent an entire country or, god help us, all women everywhere - especially if no effort has been made to match the sample to the groups within said country.
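If you're wondering why a thousand keeps coming up, here's a quick back-of-the-envelope sketch in Python (mine, not from any survey body), using the textbook margin-of-error formula and assuming the best possible case - a genuinely random sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion measured
    on a simple random sample of size n (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 1000, 10000):
    print(f"n = {n:>6,}: about +/- {margin_of_error(n) * 100:.1f} percentage points")
```

At 1,000 respondents you're looking at roughly plus or minus 3 points, which is why national polls tend to settle around that size. But note the assumption doing all the work: a random sample. A self-selected online sample of 10,000 volunteers can still be wildly wrong, because the error there isn't random noise, it's bias.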

3) How did they word the questions?

It's really easy to skew the results of a survey by wording questions in a biased way.

Words don't mean the same thing to everyone - my favourite example being the "how many people have you had sex with" surveys, because you'd be surprised by how differently people define sex. Think of Bill Clinton, who genuinely thought oral didn't count, as far as we can tell. Of course, people are also worried about how they present themselves, and most people like to get the answers "right", so you can mess with all of this to get the results you want without even knowing you're doing it.

Change the type and order of the questions you ask and you can lead up to a question you want to report on, getting the result you want fairly easily. There's a lovely demonstration of this, called "leading questions", in Yes Minister, a very old British comedy about government. Or read the transcript here.

Sticking in a few questions that people just hate to answer is also a good trick. In Britain it's unbelievably rude to ask how much people earn, but of course any consumer research really, really wants to know. Make people answer that question in order to complete the survey and you've just guaranteed that only unusual people will answer it.

4) How do they show the results?

If someone won't tell you how they got their survey answers, or what the questions were, definitely show some scepticism. Also, apply some critical thinking to how they represent the results. For example, if I say to you that 90% of women prefer high waist knickers to thongs, that sounds pretty impressive. If you know that I got that figure from the last 10 people who bought from our website, where we only sell about two thongs and have about 60 styles of high waist knickers, well, that starts to look like a very small sample with a very biased background.
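For the curious, here's a rough sketch of just how little ten people can tell you, using my made-up "9 out of 10 buyers" figure. It's written in Python with the standard Wilson confidence interval - my choice of illustration, not anything from a real survey:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson 95% confidence interval for a proportion;
    better behaved than the naive formula on tiny samples."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# "90% of women prefer high waists" - based on the last 10 buyers
low, high = wilson_interval(9, 10)
print(f"Plausible range for the true preference: {low:.0%} to {high:.0%}")
```

That prints a range of roughly 60% to 98% - so even before you account for the biased sample, "90%" from ten people tells you almost nothing. And the biased sample makes it worse.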

Similarly, when I hear that 52 women who responded to a survey wanted a particular type of bra... I just wonder: out of how many? KMD has 6,000 customers, so 52 is less than one percent of them - not enough to justify a whole line of bras. On the other hand, 52 people who wanted their local deli to add a new sandwich flavour? Totally worth a try. Context is everything where numbers are concerned.

If someone shows you the results as a 3D donut chart, there is something absolutely filthily wrong with their data and they are trying to hide it behind fancy graphics. Seriously, that format is so misleading that data-visualisation professionals treat it as a red flag.

5) Does the headline match the data?

The way people report survey or research results can be very different from what the survey actually asked or found.

"Female horseriders who fail to wear sports bras could be stressing animals" says the Telegraph. A health psychologist friend (who also happened to be a horse rider) and I spent a happy half hour shredding this together last week. This research was a survey, done online, with some extremely dubious methodology (we all know that cup size is relative to back size, has no meaning on its own, and that there's no consistent agreement on what sizes actually are, right? So divvying up your respondents by cup size is a bit... interesting). It said nothing about horses, as far as we can tell the Telegraph just added that in for fun. When we dug into the stats, of the women who responded (who are more likely to have responded because they have breast pain, because why would you do the research if you have no issues?), it seems like it was about 13% who said that they got breast pain they thought was related to just riding rather than pre-menstrual or menstrual breast tenderness or pain (naturally they don't seem to have used those words, going for the rather vague "hormone related pain" instead. Plus, it turned out that pain mostly happened when trotting. The links to sports bras or indeed horses feelings are, well, tenuous at best.

We prefer this interpretation of the results:

"If you've got PMS, it's better to gallop everywhere than trot in a lady-like manner"

Seen any lingerie stats you love recently? Anything you've debunked yourself? Let me know in the comments!