Mobile Survey Data Quality: Jamie Baker-Prewitt #CASRO #MRX


… Live blogging from beautiful San Francisco…

 

“Mobile Research Risk: What Happens to Data Quality When Respondents Use a Mobile Device for a Survey Designed for a PC” by Jamie Baker-Prewitt, Senior Vice President, Director of Decision Sciences, Burke, Inc.

  • Prior data shows that opt-in profiles of mobile completes skewed younger, more employed, and more ethnically diverse; response rates were slightly lower than web, and some survey results differed
  • In traditional surveys and mobile web, most dropoff is extremely early; in SMS surveys the trend is steep but far more extended over time
  • Responders are choosing to use a mobile device to complete a survey when we have designed it for a PC
  • Two recent studies showed 7% to 17% of completes coming from mobile devices, and 4% to 8% coming from tablets
  • The research required people to have all three devices, but they actually answered the 10-minute survey on only one device
  • Tablet- and smart-phone-adapted versions had shorter survey lengths
  • The bad smart phone survey took a couple of minutes longer than the tablet- and smart-phone-adapted versions
  • The bad smart phone survey had much higher dropout rates – 18% vs. 5% to 11%
  • [WHY do we keep using this “mark a 2”.  There is no more ridiculous measure of data quality than this!]
  • Straightlining is less of an issue on bad smart phone surveys, maybe because people can only see a couple of items at a time anyway
  • Multiple errors were higher on bad smart phone surveys
  • Responders generally prefer to take surveys on a PC, but responders answering a good mobile survey prefer that method
  • Results of the four methods were generally the same
  • Non-response bias introduced by requiring responders to have all three devices [but it was a necessary evil]
  • Mental cheating is not rampant in a 10-minute survey, but on bad smart phone surveys respondents have higher dropout rates, are more likely to straightline, and are more likely not to comply with instructions
  • Good smart phone surveys need to use fewer words where possible, make sure response alternatives are visible and not clunky to navigate, and ensure respondents can tell when their response has been registered
  • Test the survey on a variety of devices and operating systems
  • We must accommodate responder preferences; we can no longer request that people answer only on a PC

One response

  1. Interesting study. However, I believe there’s a need to distinguish “mobile web” from a pure “native app” experience. Each method generally brings different results, with native apps usually bringing much better response rates and lower dropout.

    Antoine @datafieldapp
