The 6 Worst Market Research Mistakes #MRX

[Image: 3D pie chart mistake]

Sigh. Not only are the colours horrid, but this 3D pie chart completely distorts the values it is trying to represent. Would you have guessed the three slices are identical?

As much as we’d like to think it is, market research is not a commodity. Anyone can technically carry out the processes that might be deemed market research, but not everyone can carry out those same functions and actually conduct valid, reliable market research.

The trouble is that so many problems can and do crop up along the way. Unless you can 1) actually recognize that there is a problem and 2) muster the skills to actually fix it, the problem will just compound itself. Here are the top problems that I see.

  1. Not starting with a research objective. If you don’t know the questions you are trying to answer, you will waste many hours wandering in circles, playing with numbers, and accomplishing absolutely nothing. Coming up with cool results does you no good if it doesn’t solve the problem you initially set out to solve.
  2. Using insufficient sample sizes. Forget the fact that insufficient sample sizes won’t generate any statistically significant results. I’m not concerned with significance here. I’m concerned with trying to solve major problems based on only 100 responses. How the heck are 100 responses reflective of any group of people, unless the target population is only 500 to begin with? How the heck can you analyze subgroups of men and women, or older and younger people, if you’re only starting with 100 people? Did you not anticipate wanting to look at subgroups of people?
  3. Being bound by statistical significance, or lack thereof. We often forget about type 1 and type 2 errors. Any time you run statistical tests, some will be falsely significant and others will be falsely insignificant. What this means is that statistics will help guide you, but they aren’t the be-all and end-all of what is important in your dataset. You absolutely must depend on your brain to determine what the important results are.
  4. Generalizing beyond your sample. In its worst form, this means gathering results from 100 women and assuming the results will apply to men as well. Or generalizing results from your subsample of 5 men to the entire male population. How about generalizing results from people who complete a two-hour survey to people who’ve never answered a survey in their entire life? Again, what were you thinking? You must realize ahead of time that you care about what men think, or what non-robots would think.
  5. Creating something out of nothing. Surprise, surprise, the business world is indeed a publish-or-perish world. If you don’t publish surprising and interesting results in your research report, clients may be less likely to consider you as a vendor, as you clearly don’t have the skills to find the surprising and interesting results. Alas, this philosophy should never lead you to make a mountain out of a molehill simply so you have something cool to show your client. This is just another form of falsifying data. You will get caught. You will be horribly embarrassed.
  6. Focusing on entertaining, not educating. I’ll say it. Storytelling is a huge fad right now. If you don’t turn your research results into a story and delight and amaze your audience, your client may be hugely disappointed. But if your focus is on telling a pretty story instead of discovering whether there actually is a story, you are once again succumbing to falsifying data.
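Points 2 and 3 above are easy to see with a little arithmetic. Here is a quick Python sketch (purely illustrative; the `margin_of_error` helper and the simulated subgroup test are my own, not anything from a research toolkit) showing how wide the error margin is at n=100, and how chance alone "discovers" significant differences when you test enough subgroups:

```python
import math
import random

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion estimated from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# With only 100 responses, a reported 50% could plausibly sit anywhere
# from roughly 40% to 60%.
print(round(margin_of_error(100) * 100, 1))  # → 9.8 (percentage points)

# Type 1 errors in action: run 20 subgroup comparisons where NO real
# difference exists. At the 5% level, some will look "significant"
# purely by chance.
random.seed(42)

def spurious_test(n=50):
    # Two groups of n respondents drawn from the SAME 50/50 population.
    a = sum(random.random() < 0.5 for _ in range(n)) / n
    b = sum(random.random() < 0.5 for _ in range(n)) / n
    se = math.sqrt(2 * 0.25 / n)   # standard error of the difference under p = 0.5
    return abs(a - b) > 1.96 * se  # does the gap look "significant"?

false_positives = sum(spurious_test() for _ in range(20))
print(false_positives)  # typically a "discovery" or two out of pure noise
```

The exact count of false positives varies from run to run, which is precisely the point: chance alone manufactures cool-looking "findings" when you fish through enough subgroups of a small sample.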

Remember, the research must come first. Decide on your research objective, build a great research methodology using the right sample sizes, the right scales, the right instruments, the right techniques. Analyze the data properly, thoughtfully, logically. If indeed there is a story worth telling, it will be done with integrity and validity. That’s the kind of story I want to hear.

11 responses

  1. […] listed as one of the six worst mistakes of marketing research according to one marketing research methodologist’s blog, that is getting bound by statistical significance. It is just so simple, agreeable, and […]

  2. Love your point on statistical significance. Seems like a big one.

  3. A good point to add here: researchers (or people who think they are researchers) trying to ram square pegs into round holes. A particular issue we come across every day is the belief (or just pure lack of thought) that people like sitting at home filling in online surveys that run 30 minutes long – and will answer considerately and accurately for all of that time. Online surveys have a huge part to play in data collection, but they should be used with consideration for 1) the accuracy of the data you want to receive and 2) the people completing the surveys.

  4. You missed out no. 7 – not proofreading your work properly!! What were you thinking? And have you heard of the Law of Large Numbers and error margins?

  5. I am constantly amazed at the number of times I see market research agency reports with graphs showing results from a sample of 20 or even fewer. It’s no good putting a little footer to say “Caution – low sample size” – your clients will ignore it and happily quote percentages based on ridiculously low samples.

    1. Totally agree. It’s like saying “we know it’s small, but it’s the only interesting result we had” (because random chance produces all kinds of cool, invalid results).

  6. I love this post, especially the “Being bound by statistical significance, or lack thereof”. No one seems to make a big deal of “type 2 errors”.

  7. Woot woot! I like the part about publish or perish. I’ll send you the link to my new study where I definitively prove that 67.49% of all statistics are made up on the spot, with the remaining 46.1% being taken out of context. And the last 3% are chocolate covered 🙂

    1. You got an LOL out of me. Make that a 3D pie chart that blinks and I’ll likely pass out.

  8. Great post. Many fieldwork companies have very little knowledge of the research objective, and in many cases lack proper sampling knowledge as well. If their clients don’t fill in the gaps and hand-hold every step of the way, there is a huge risk that the data they get in the end is of little value.
