Tag Archives: Methodology

Training for survey research: who, where, how #AAPOR #MRX 

moderated by Frauke Kreuter

Prezzie #1: training needs in survey methods

  • started a program in the 1970s with 4 courses, 2 in statistics and 2 in sampling; that was pretty good at the time and covered the basics well
  • in 1993: 3 courses in data collection, 3 in sampling, 2 practicums, 4 in statistics, 3 design classes, 1 on the federal statistical system
  • many journals have started since then – Survey Methodology, Journal of Official Statistics, POQ, and AAPOR’s Journal of Survey Statistics and Methodology – plus international conferences, and now an entire conference on total survey error
  • statisticians need to know basic theory, sampling theory for complex designs, weighting and imputation, small area estimation, disclosure control, record linkage, paradata, responsive design, panel survey methods, survey management, ethics; it’s impossible to know about and train for everything now
  • in the early days, treating humans nicely was never mentioned, it wasn’t important; now we realize it’s important [yet we still don’t treat people as nicely as we ought to. isn’t a long, poorly designed, poorly written survey disrespectful?]
  • a master’s degree can no longer cover everything we need to know as survey researchers; we can run summer training programs, online training, and advanced certificate programs
  • the world is changing so fast, so how can training keep up with everything? punch cards are history and whatever we’re doing now will be history soon enough
  • we need to train more people but undergrads don’t know about our field

Prezzie #2: training for modern survey statisticians

  • Survey Practice journal special issue – February 2015
  • might be 147 federal positions per year advertising for statisticians; we are training only about a quarter of what’s needed
  • we need core statistical skills but also communication and presentation skills
  • the training gap right now is that most grad programs have only one course in sampling
  • most courses use R (55%)
  • only 40% of courses are taught by faculty who work specifically in statistics
  • weighting is a major gap; courses don’t talk about non-response adjustments (a toy sketch of one such adjustment follows this list)
  • big training gap in design trade-offs – discrete parameters, continuous parameters, split sample randomization
  • most training happens on the job
  • [this session is so popular I can’t put my feet up on the chair in front of me! The room is full!]
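[Since the weighting/non-response adjustment gap comes up above, here is a minimal sketch of the kind of weighting-class non-response adjustment those missing courses would cover. This is my own toy illustration, not anything shown by the presenter, and the column names (weight_class, base_weight, responded) are made up.]

```python
# Minimal sketch of a weighting-class non-response adjustment (hypothetical column names).
import pandas as pd

def nonresponse_adjust(frame: pd.DataFrame) -> pd.DataFrame:
    """Inflate respondents' base weights by the inverse of the weighted
    response rate in their weighting class; zero out nonrespondents."""
    out = frame.copy()
    resp_w = out.loc[out["responded"]].groupby("weight_class")["base_weight"].sum()
    all_w = out.groupby("weight_class")["base_weight"].sum()
    response_rate = resp_w / all_w                       # weighted response rate per class
    out["final_weight"] = out["base_weight"] * out["weight_class"].map(1.0 / response_rate)
    out.loc[~out["responded"], "final_weight"] = 0.0     # nonrespondents drop out of estimation
    return out

# Toy example: class A has a 50% response rate, class B has 2/3
sample = pd.DataFrame({
    "weight_class": ["A", "A", "B", "B", "B"],
    "base_weight":  [100, 100, 150, 150, 150],
    "responded":    [True, False, True, True, False],
})
print(nonresponse_adjust(sample))
```

The point of the adjustment is simply that respondents inherit the weight of the nonrespondents in their class, so each class still sums to its original weight total.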

Prezzie #3: assessing the present

  • our science draws on many other disciplines
  • trained in the methods and how to evaluate those methods, trained in qual and quant like ethnography and psychometric analysis
  • there are five university-based programs, mostly at the graduate level, plus professional conferences, short courses and seminars
  • current programs do the core well, increasing focus on hybrid training, trainers are also practitioners which is invaluable
  • training gap on full survey life cycle experience, not enough practical experience in the training, not enough multi-cultural training – and the population has a large and growing non-native English-speaking base
  • quant dominates most survey programs [well of course, a survey program is surveys, why not develop a RESEARCH program]
  • you can have a productive career with little statistical knowledge, you can be a qual researcher [well that’s just offensive! why shouldn’t qual researchers also know statistics?]
  • ideal program still needs the core classes but it also needs more qual and user experience, more specialized courses, more practicums, more internships, more major projects like a thesis

Prezzie #4: on-the-job training

  • she did interviews with people for her talk – she’s qualitative 🙂
  • the workplace is interdisciplinary with many skill sets and various roles
  • know your role – are you a jack of all trades or filling a niche
  • in private business, everyone knows a bit about everything
  • at the census bureau, it’s massively specialized – she works on pre-testing of non-english surveys
  • you need to create opportunities for yourself – request stretch tasks, seek mentors, volunteer to help with new projects, shadow experienced people – screen sharing is a wonderful thing
  • take short courses, pursue graduate degrees, read and present – you are responsible for your future growth
  • as management you can – promote learning by doing, share the big picture, encourage networking, establish a culture of ongoing learning
  • you can learn on the job without spending money

Prezzie #5: future of training

  • we are in the midst of a paradigmatic shift in our industry
  • survey informatics – computer science, math and stats, and cognitive and social psychology – this is the new reality
  • resistance to online surveys mirrors the emergence of the telephone survey – people were skeptical and resistant then too
  • the speaker was a heretic when he first started talking about online surveys
  • we need technology and computers for higher quality data and for more quantitative data collection
  • we now have paradata and metadata and auxiliary data – page-level data, question-level data, person-level data, day-level data
  • data is no longer just the answers to the questions; methodologists need to understand all these types of data (a toy illustration follows this list)
  • concerned we’re not keeping up and not preparing the next generation
  • [discussion of how panels can be good, like people have never heard of panels, sadly some people do need to hear this]
  • computer science must be our new partner [go home and learn R and Python right now]
  • we won’t have to ask “are you watching TV” – the TV will know who’s in the room, who’s looking at the TV, who left to get a snack
  • the low-level professors who know the new tech are the least powerful – they have no power to do anything about it and no funding
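[To make the paradata bullet concrete, here is a toy illustration of what page-level paradata might look like next to the answers themselves. It is my own sketch, not any speaker’s system, and every field name is hypothetical.]

```python
# Toy illustration of page-level paradata captured alongside survey answers.
# Every field name here is hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PageParadata:
    respondent_id: str
    page_id: str
    entered_at: datetime        # when the page loaded
    submitted_at: datetime      # when the respondent clicked Next
    device: str                 # e.g. "mobile" or "desktop"
    backtracked: bool = False   # did the respondent return to this page later?

    @property
    def seconds_on_page(self) -> float:
        return (self.submitted_at - self.entered_at).total_seconds()

def speeders(records: list[PageParadata], threshold: float = 2.0) -> set[str]:
    """Classic paradata use: flag respondents who flew through any page."""
    return {r.respondent_id for r in records if r.seconds_on_page < threshold}
```

Even this tiny structure supports the usual paradata questions – flagging speeders, comparing mobile versus desktop completion times, or spotting pages that get a lot of backtracking.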

Transformation of Market Research by Jeffrey Henning #NetGain8 #MRX

Live blogging from MRIA’s #NetGain8 conference in Toronto. Any errors or stupid jokes are my own.

The Transformation of Market Research: Where to next? by Jeffrey Henning, President, Researchscape International, Boston

  • Innovation in 1994 – a bike lane on Davenport
  • 1923 – it was a railway
  • 1904 – it was horses and buggies, beginning of electrification
  • Previously, the road was a First Nations pathway and that is why it doesn’t follow the usual street grid – it used to be the shoreline!  [Cool!  I live here and didn’t know that!]
  • the profession of building canoes probably lasted thousands of years; building buggies lasted hundreds of years before going obsolete
  • So will market research become obsolete? [and here, Jeffrey yells it out 🙂 ]
  • Only methods become obsolete, not professions.
  • Currency will always exist but it will look and work differently
  • Market research is eternal, or nearly
  • Pressure points on survey research – victim of its own success – overloading inboxes, bad questionnaire design, long surveys, complex designs
  • Drivers of survey research – mobile surveys, micro surveys, DIY, Enterprise Feedback Management, Voice of Consumer systems, emotion capture, feature wars, automation, very crowded space
  • In 1987 – questionnaire design was in WordPerfect, surveys were by phone, mail or face to face, results analyzed in SPSS and Lotus, results presented in WordPerfect — the human provided most of the value
  • In 1997 – finally a product to help with survey design (SurveySolutions or EZSurvey) and fielding and analysis, results were now presented in PowerPoint – more automation now but still a lot of human work
  • In 2013 – survey templates and question libraries, automated fielding, partially automated analysis, partially automated presentations
  • More companies are doing proprietary analytics with traditional questions and choosing pictures to represent feelings
  • Weighting used to be done using statistical systems under the direction of an analyst, open ends were rarely analyzed, and crosstabs showed everything, not just the meaningful differences
  • Google applies automatic weighting to its analysis though it’s not perfect, they guess who you are based on your browser data
  • WYWIWYG – what you want is what you get – the system knows exactly which statistic should be applied – you just say which variables you want to test, you don’t need to understand how to use SPSS or SAS or R – StatWing (a toy sketch of this idea follows this list)
  • Automation to come – crowd shaped surveys, better weighting, text analytics, proprietary analytics
  • 2018 – human designs the questionnaire and not much else of the process – self driving car… self driving survey research not so far fetched – maybe we should become qualitative researchers 🙂
  • We need to develop proprietary indices, pioneer new techniques, invent unique interfaces, build proprietary panels, custom programming, more qual, benchmark databases, text analytics, be creative, good infographics, internalize research
  • “The best way to predict the future is to invent it” – Theodore Hook
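[The WYWIWYG bullet is easy to make concrete. Here is a toy sketch of the idea – pick a significance test purely from the two variables’ types – using pandas and SciPy. It is only an illustration of the concept, not how StatWing or anyone else actually does it, and a real system would also check assumptions like group counts, sample sizes, and distributions before choosing a test.]

```python
# Toy sketch of "pick the statistic from the variable types" (not any product's real logic).
import pandas as pd
from pandas.api.types import is_numeric_dtype
from scipy import stats

def auto_test(df: pd.DataFrame, x: str, y: str):
    """Choose and run a significance test using only the two columns' types."""
    x_num, y_num = is_numeric_dtype(df[x]), is_numeric_dtype(df[y])

    if not x_num and not y_num:
        # two categorical variables -> chi-square test on the crosstab
        chi2, p, dof, _ = stats.chi2_contingency(pd.crosstab(df[x], df[y]))
        return "chi-square", p
    if x_num and y_num:
        # two numeric variables -> Pearson correlation
        sub = df[[x, y]].dropna()
        r, p = stats.pearsonr(sub[x], sub[y])
        return "pearson correlation", p
    # one categorical, one numeric -> two-sample t-test (assumes exactly two groups)
    cat, num = (y, x) if x_num else (x, y)
    groups = [g[num].dropna() for _, g in df.groupby(cat)]
    t, p = stats.ttest_ind(groups[0], groups[1])
    return "t-test", p
```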

The Audience Question I Fear the Most #MRX

I’ve done my fair share of webinar, workshop, and conference presentations. I’ve talked to groups of ten, a hundred, and three hundred people, and there is usually an audience member with a point to prove. They’re skeptical of the topic, eager to prove they’re smarter than everyone else, or just keen to listen to their own voice.

They ask questions designed to identify flaws in methodology or logic, and force me to respond with answers like “You’re right, that is a problem” or “True, we should have done a better job with that,” or “Right, sentiment analysis will never be perfect.” But you know, those kinds of questions don’t scare me. Every research project that was ever conducted was rife with logic and methodology errors and I’m glad that researchers are picking up on them. That’s how the cycle of research works. Find a problem this time, fix the problem the next time.

Then there are questions from people who are keen about the work that has been presented and want to learn more about it. They’ve seen the tables and graphs that I’ve prepared. They’ve listened to the pros and cons of the methodology. They’ve seen the real life examples of misleading and poorly done research. With all of that information, they understand the good and bad of what they’re potentially getting into and they’re intrigued. Intrigued enough to ask that dreaded question.

Now first, you have to recognize that I am a researcher to the core. I deliberately chose to take every single research methods and statistics and psychometrics class in school, even going so far as to get special permission to take classes that were outside of my curriculum. Research methods was and is all I care about.

My problem is this. Every conference submission form and every follow-up email from conference organizers says you must educate and not promote, and I wholeheartedly agree with those rules. I don’t want a sales pitch at a conference. I want to learn something new.

But then that horrible question gets asked. It comes in one of two forms. 1) “What software are you using to do this?” and 2) “How much is your software?” Anyone who’s seen me present can attest to my reaction to those questions. They immediately catch me off guard and I stumble through answers like an idiot. Answering these questions makes me feel like I am promoting and selling. Indeed, I feel that simply speaking in front of a crowd is a sales pitch.

Even worse, as soon as I’m asked to talk brands and money, I fear that people will distrust my research objectivity. I fear that people will disregard my passion for quality and honesty and see not the sense, but the cents. That, my research friends, is truly terrifying. But perhaps I’m overthinking things. Perhaps those questions are just nice compliments, confirmations that I’ve done the job I was supposed to do. Perhaps it really was a sales pitch, a sales pitch done the right way. But I still hate getting that question.

Would you prefer to kill someone over helping families? #MRX

So what’s your answer? I’m going to take a wild stab in the dark and assume that 99.9% of people would say no to that question.  There is only one way to answer it without making yourself feel like a horrible person. This is what makes it a leading question.

Now have a look at this survey that I received in the mail from Mr Bob Rae, my member of parliament.


Twitterversity: It’s University in Pajamas! #MRX


Remember the good old days in university? Skipping class because it’s raining outside… Sitting at the back of class so you could doze off a little bit…

Well, next Tuesday, January 11th, is your opportunity to make up for all of that! Join Kathryn Korostoff, aka @ResearchRocks, for an entire day of market research education via Twitter. It’s the first-ever University on Twitter and you can do it all without leaving the comfort of your pajamas!

The Lost Art of Qualitative Research

Is it lost or did it barely exist to begin with?

As I think back through my academic career, I realize that qualitative research was the one major missing piece. I took innumerable courses on statistics and design, but the focus without exception was always quantitative.

As part of my undergraduate studies, I did contribute to an ethnographic study of small companies, and also to a content analysis study of babies who failed to thrive. Both led to fascinating discoveries about the respective topics simply through the analysis of words.

But, these studies were not part of the curriculum. They were simply some of my after school activities. They were just things I volunteered to do because they were interesting and I felt they enhanced my course work.

I don’t know why curriculums are set up like this, set up so that you only need to know one side of the coin. With social media research just over the horizon and ready to pounce with force never before seen, perhaps it’s time for a change.

Image: ‘IMG_3353.JPG’ by trekbody via Flickr, licensed under a Creative Commons Attribution licence

Read these too

 

  • I don’t give a rat’s ass about probability sampling
  • Building a bad reputation before we even start: Privacy in social media research
  • 2011 Market Research Unpredictions #MRX
  • Six things that terrify me in market research #mrx
  • To Mid-Point or Not To Mid-Point, That is the Question