“Innovation in Web Data Collection: How ‘Smart’ Can I Make My Web Survey?” by Melanie Courtright, Ted Saunders, Jonathan Tice #CASRO #MRX

Live blogging from the #CASRO tech conference in Chicago. Any errors or bad jokes are my own.

“Innovation in Web Data Collection: How ‘Smart’ Can I Make My Web Survey?”

Melanie Courtright, Senior Vice President, Client Services, Americas, Research Now; Ted Saunders, Manager, Digital Solutions, Maritz Research Inc.; Jonathan Tice, Senior Vice President, Decipher

  • The proportion of respondents taking surveys on tablets and mobile phones continues to increase
  • Researchers are exploring ways to improve data accuracy and the respondent experience on mobile surveys
  • Mobile or touch-screen devices enable different ways of interacting with the respondent to capture responses
  • Programmers used to developing mobile applications may naturally want to extend such features to surveys, and researchers may see these features as new and inventive
  • Used a randomized experimental design to see how data quality and respondent experience are affected
  • Web-based survey fielded April 2014; auto insurance satisfaction and attitudes; 3,600 respondents, 10-minute survey, 60 questions; respondents self-selected into PC, mobile phone and tablet cells; optimized for tablet and mobile
  • Melanie Courtright

    Slider start position influences “passive” responses

    • Respondents instructed to click on the slider button
    • Sliders were programmed to record a lack of use as missing
    • People are generally satisfied with their auto insurance company, so there was a large amount of non-use when the slider started at the right
  • Slider start position matters more on touch devices
    • The right starting position tended to bias mean scores upward
    • PC users with a mouse were less affected by the slider start position and showed much less passive use of the slider on both 5- and 11-point scales
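The coding decision behind those bullets can be sketched numerically. This is my own illustrative example, not data from the study: it just shows how treating untouched (“passive”) sliders as missing, versus recording a right-anchored start position as the answer, moves the mean score.

```python
# Hypothetical sketch: how the coding of untouched ("passive") sliders
# affects mean scores on an 11-point (0-10) scale. The responses and the
# function name are illustrative, not from the study.

def mean_score(responses, passive_rule, start_value=10):
    """responses: list where None marks a respondent who never moved the slider.
    passive_rule: 'missing' drops passives; 'start' records the start position."""
    if passive_rule == "missing":
        used = [r for r in responses if r is not None]
    else:  # record the slider's starting position as the answer
        used = [start_value if r is None else r for r in responses]
    return sum(used) / len(used)

# Five satisfied respondents plus three who never touched a right-anchored slider.
responses = [8, 9, 7, None, None, None, 6, 8]

drop = mean_score(responses, "missing")                  # passives treated as missing
impute = mean_score(responses, "start", start_value=10)  # right start recorded as 10

print(drop, impute)
```

Recording the start position pulls the mean upward, which is the direction of bias the panel reported for right-anchored sliders on touch devices.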
  • Touch device users who used sliders liked them
    • Respondents generally preferred the type of scale they used throughout the survey
    • PC users preferred the standard scale and more often reported no preference.
    • Touch device users had slight preference for Sliders when they used them
  • Sliders may be suitable for continuous objective measures
    • Tests so far have been on attitudinal response scales.
    • Sliders may be more appropriate for entering an objective value
    • Prevents touch users from having to type
    • Responses similar to those entered into a text box
  • Length of the list matters more than style of list
    • Respondents were asked to give a time from one of 3 randomly assigned list lengths (of different granularities), using either a radio-button list or a drop-down list.
    • The longer the list, the more likely respondents chose a time earlier on the list, regardless of how the list was presented.
  • The display of Drop-down lists varies by browser. The Safari browser is dominant on iPhones, but browsers vary more on Android phones.
  • Jonathan Tice

    Android browser differences result in primacy effects. Because the default Android browser shows only the first three choices on the list and doesn’t easily scroll, those choices were selected much more often when shown in a drop-down.
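A simple satisficing model (my own sketch, not the presenters’ analysis) illustrates why longer lists push choices relatively earlier: if each respondent scans the list top-down and keeps reading with some fixed probability per option, the expected chosen position stays near the top no matter how long the list gets, so as a fraction of the list it lands earlier and earlier.

```python
# Hypothetical geometric-scanning model of primacy (illustrative only):
# after reading an option, a respondent continues to the next one with
# probability p, otherwise picks the current option.

def expected_position(n_options, p=0.8):
    """Expected 0-based index of the chosen option under geometric scanning."""
    weights = [p ** i for i in range(n_options)]
    return sum(i * w for i, w in enumerate(weights)) / sum(weights)

short_list, long_list = 12, 48  # e.g. coarse vs fine time granularity

rel_short = expected_position(short_list) / short_list
rel_long = expected_position(long_list) / long_list

print(rel_short, rel_long)  # the longer list yields a relatively earlier pick
```

The absolute chosen position barely grows with list length, so the relative position shrinks, matching the finding that longer lists drew answers from earlier in the list regardless of presentation style.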

  • Unprompted use of Voice-To-Text is very low. Even when asked, most respondents didn’t use it. About 90% of tablet users opted to type in their response, either because they didn’t have the functionality or didn’t want to use it. Mobile users were more evenly split between using it and not wanting to.
  • Reasons for not using it included heavy accents (mentioned by some Hispanic and Asian respondents), needing to be quiet, doubts that it would be accurate, and environments that were too loud
  • It’s premature to recommend it on a wide basis, but people are becoming more familiar with it. Respondents using Voice-To-Text gave slightly longer answers
  • Using Image or Video Capture on mobile devices – there are many differences by OS, browser, and screen size which will affect results
  • A request for a generic image of “where you are” led to lots of feet, selfies, and friends – a literal interpretation of “where you are” 🙂
  • environments ranged from home, school, church, beach, office, bars, labs, gyms, cars, hospital, kitchens, bathrooms, airports, malls
  • It’s in-experience data without any delay
  • Picture-quality instructions are recommended: photos came back blurry, badly lit, and more
  • These are not professional photographers. There was some, but very limited, importing of existing pictures. Question wording is critical. Photo size is a double-edged sword. Internet connection speed and latency are worth considering
  • The main reason for not participating was that it seemed intrusive, even though everything was done with full permission
  • There were fewer “easy” ratings for tablets vs phones
  • It is candid, personal, and open; it is in the moment and in context; there are no geographic limitations and no real tech issues
  • But, the data needs to be reviewed individually, and it doesn’t work on non-smartphones
  • We still need to test a lot and educate people on why and how to do it. Yet consumers remain well ahead of us when it comes to tech
  • Consider rating every survey on its mobile-friendliness: open ends, length of interview (LOI), scale lengths, grid lengths, use of Flash, rich media, and audio or video streaming (which add bandwidth), plus responsive design, all contribute to whether a study will work well on a phone. Consider incenting CLIENTS for mobile-friendly surveys
  • Also consider designing every survey from stage 1 for mobile phones, as opposed to adapting a web survey to the phone
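One way to operationalize the “rate every survey on its mobile-friendliness” suggestion is a simple scoring rubric. The criteria come from the panel’s list, but the penalty weights and field names below are entirely my own invention, just to show the shape such a checklist could take.

```python
# Hypothetical mobile-friendliness rubric (my illustration, not a CASRO
# standard): start at 100 and subtract penalties for features the panel
# called out as hurting phone respondents.

def mobile_friendliness(spec):
    score = 100
    score -= max(0, spec.get("loi_minutes", 0) - 10) * 3   # each minute past 10
    score -= spec.get("grid_questions", 0) * 5             # grids are hard on phones
    score -= spec.get("open_ends", 0) * 2                  # typing burden
    if spec.get("uses_flash", False):                      # Flash won't run on phones
        score -= 40
    if spec.get("streams_media", False):                   # bandwidth cost
        score -= 10
    if not spec.get("responsive_design", True):
        score -= 20
    return max(0, score)

survey = {"loi_minutes": 15, "grid_questions": 4, "open_ends": 3,
          "uses_flash": False, "streams_media": True, "responsive_design": True}
print(mobile_friendliness(survey))
```

A score like this could also back the panel’s idea of incenting clients: tie the incentive to clearing some agreed threshold.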

