Evaluating response rates for web surveys #AAPOR #MRX 


prezzie 1: do response rates matter?

  • response rates used to be an indicator of data quality, but are participation rates actually meaningful?
  • higher completion rates were related to higher error [makes sense to me, making everyone answer a survey includes people who don’t want to answer the survey]
  • if you only survey people who answer surveys, your response rates will be high
  • emerging indicator is the cumulative response rate for probability panels = recruit rate * profile rate * participation rate (see the sketch after this list)
  • largest drop in rate is immediately after recruitment; by the fifth survey the decline really slows down [this is fairly standard, by this point people who know they don’t like participating have quit]
  • by this measurement, the cumulative response rate had dropped to less than 3%, and across all the groups the rate was less than 10% [tell me again that a probability panel is representative. maybe if the units were mitochondria, not humans who self-determine. hello non-sampling error!]
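to make the multiplication concrete, here’s a quick python sketch of that cumulative calculation. the stage rates below are made-up numbers for illustration, not the presenter’s data:

```python
# cumulative response rate for a probability panel, as defined in the talk:
# recruit rate * profile rate * participation rate
# the inputs below are hypothetical, chosen only to show how fast the product shrinks

def cumulative_response_rate(recruit_rate, profile_rate, participation_rate):
    """each argument is a proportion between 0 and 1"""
    return recruit_rate * profile_rate * participation_rate

# even three stages that each look respectable collapse quickly:
# 30% recruit x 50% profile x 60% participation = 9% cumulative
print(cumulative_response_rate(0.30, 0.50, 0.60))  # 0.09
```

multiplying the stages together is why the cumulative number ends up so far below any single stage’s rate.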

prezzie 2: boosting response rates with follow-ups

  • 30% response rate with a follow-up compared to 10% with no follow-up, using the aapor response rate (a sketch of the aapor calculation follows this list)
  • follow-ups helped a little with hispanic response rates
  • helped a lot for cell-phone-only households
  • helped a lot for the lowest and highest income households, adults under 50 years old, and high-school-only education
  • [hey presenters, slides full of text might as well be printed and handed out, and since i’m on this topic, yellow on white does not work, fonts under 25 point don’t work, equations don’t work. use your slides wisely! and please don’t read your slides 😦]
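the bullets above don’t say which aapor rate was used, so as a reference point, here’s a sketch of RR1, the most conservative response rate in the aapor standard definitions. the counts are hypothetical:

```python
# AAPOR response rate 1 (RR1): complete interviews divided by all cases that
# are eligible or of unknown eligibility. counts below are hypothetical.

def aapor_rr1(completes, partials, refusals, non_contacts, other,
              unknown_household, unknown_other):
    denominator = (completes + partials
                   + refusals + non_contacts + other
                   + unknown_household + unknown_other)
    return completes / denominator

# a follow-up wave converts some refusals and non-contacts into completes,
# which is how a 10% rate can climb toward 30%
print(aapor_rr1(300, 20, 400, 200, 30, 40, 10))  # 0.3
```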

prezzie 3: testing email invitations in a nonprobability panel

  • using mobile-optimized forms 🙂
  • used short invites sent from the census bureau with the census logo
  • best subject line as chosen by responders was “help us make the us census better, answer our survey”
  • but the real response data showed this subject line performed worse than the others tested (see the test sketch after this list)
  • best message was the one focusing on confidentiality, and possibly even better if you specify it takes 10 minutes
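stated preference and actual behaviour disagreeing is the interesting bit, so here’s a sketch of how you could compare two subject lines on real response counts with a simple two-proportion z-test. the counts are hypothetical, not the presenters’ data:

```python
# two-proportion z-test: did subject line A really outperform subject line B?
# counts below are hypothetical, for illustration only
from math import sqrt

def two_proportion_z(responses_a, sent_a, responses_b, sent_b):
    p_a, p_b = responses_a / sent_a, responses_b / sent_b
    pooled = (responses_a + responses_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# A = the line responders said they preferred, B = a plainer alternative
z = two_proportion_z(220, 2000, 260, 2000)
print(round(z, 2))  # about -1.95; negative means the “preferred” line did worse
```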

prezzie 4: does asking for an email address predict future participation?

  • response rates were 2 to 3 times higher for people who gave an email address
  • but it’s not exactly the email that’s the indicator, it’s being open to participating in further research
  • no effects by gender or ethnicity; people with graduate degrees are less likely to provide their email address (a modeling sketch follows this list)
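here’s a sketch of how you might test the email-address effect while controlling for demographics. this is a generic logistic regression on simulated data, not the presenter’s method or data, and the variable names are hypothetical:

```python
# logistic regression: does giving an email address predict later participation?
# all data here is simulated; the true effect is baked in at the simulation step
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
gave_email = rng.integers(0, 2, n)
grad_degree = rng.integers(0, 2, n)
# simulate participation that is more likely when an email address was given
participated = rng.binomial(1, 0.15 + 0.25 * gave_email)

df = pd.DataFrame({"participated": participated,
                   "gave_email": gave_email,
                   "grad_degree": grad_degree})

# a positive gave_email coefficient echoes the finding; grad_degree tests
# whether the education effect on providing an email carries through
model = smf.logit("participated ~ gave_email + grad_degree", data=df).fit()
print(model.params)
```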

prezzie 5: predictors of completion rates

  • first selected only studies with completion rates over 60% [don’t know why you would do this, the worst surveys are important indicators]
  • completion rates are higher if you start with a simple multiple choice, lower if you start with an open end
  • introductory text and images don’t help completion rates
  • completion rates decrease as the number of questions increases
  • higher completion rates if you put it all on one page instead of going page page page page
  • a title increases the rate, and numbering the questions increases the rate for shorter surveys
  • open ends are the worst offenders for completion rates; question length and answer length are the next worst [so stop making people read! you know they aren’t actually reading it anyways]
  • respondents don’t want to use their keyboards
  • avoid blocks of text (a sketch of this kind of predictor model follows this list)
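for anyone wanting to replicate the idea, here’s a sketch of the study-level model these findings imply: regress each survey’s completion rate on its design features. the feature names and data are hypothetical, invented only to mirror the direction of the findings above:

```python
# one row per survey: completion rate regressed on design features
# the data is invented for illustration, not taken from the presentation
import pandas as pd
import statsmodels.formula.api as smf

surveys = pd.DataFrame({
    "completion_rate": [0.82, 0.74, 0.66, 0.91, 0.58, 0.77, 0.88, 0.61],
    "n_questions":     [10,   25,   40,   8,    55,   20,   12,   45],
    "n_open_ends":     [0,    2,    4,    0,    6,    1,    0,    5],
    "single_page":     [1,    0,    0,    1,    0,    1,    1,    0],
})

# expect negative coefficients on n_questions and n_open_ends, positive on
# single_page, if the toy data mirrors the talk's findings
model = smf.ols("completion_rate ~ n_questions + n_open_ends + single_page",
                data=surveys).fit()
print(model.params)
```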

[my personal opinion… response rates of online panels have no meaning. every panel creates the response rate that is suited to their needs. and they adjust these rates depending on the amount and type of work coming in. response rates can be increased by only sending surveys to guaranteed responders or lowered by sending surveys to people who rarely respond. and by adjusting incentives and recruitment strategies you can also raise or lower the rates. instead, focus all your inquisitions on data quality. and make sure the surveys YOU launch are good quality. and i don’t mean they meet your needs. i mean good quality, easy to read, engaging surveys.]

by the way, this was a really popular session!

  
