Surveying sensitive issues – challenges and solutions #ESRA15 #MRX  

Live blogged at #ESRA15 in Reykjavik. Any errors or bad jokes are my own. Break time brought some delightful donuts. I personally only ate one; however, on behalf of my friend Seda, I ate several more just for her. By the way, since donuts are in each area, you can just breeze from one area to the next, grabbing another donut each time. Just saying…

surveying sensitive questions – prevalence estimates of self-reported delinquency using the crosswise model

  • crime rates differ by country, but rates of individuals reporting their own criminal behaviour run opposite to expectations: countries with high crime rates have lower rates of self-report. Social desirability seems to be the cause. Is this true?
  • need to add random noise to the model so respondents can hide themselves. Needs no randomization device.
  • ask a non-sensitive question and a sensitive question, and have the respondent indicate whether the answers to both are the same or different. You only need to know the probability of a "yes" on the first question (e.g., is your mom's birthday in January? Well, 1/12 of birthdays are).
  • crosswise model generates much higher self-reported crime rates in the countries where you'd expect them
  • also asked people in the survey whether they answered carefully – 15% admitted they did not
  • crosswise results in much higher prevalence rates; causal models of delinquent behaviour could be very different
  • satisficing respondents introduce less bias than expected
  • estimates of the crosswise model are conservative
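A quick sketch of the estimator behind these results (my own illustration, not the speaker's code): with a non-sensitive question whose "yes" probability p is known (e.g. p = 1/12 for a January birthday) and lambda_hat the observed share answering "both the same", the crosswise prevalence estimate is pi_hat = (lambda_hat + p − 1)/(2p − 1).

```python
def crosswise_estimate(lambda_hat, p):
    """Crosswise-model prevalence estimate.

    lambda_hat: observed proportion answering "both the same"
    p: known "yes" probability of the non-sensitive question
       (must differ from 0.5, or the model is unidentified)
    """
    if abs(2 * p - 1) < 1e-9:
        raise ValueError("p = 0.5 makes the model unidentified")
    return (lambda_hat + p - 1) / (2 * p - 1)

def crosswise_se(lambda_hat, p, n):
    """Approximate standard error, treating lambda_hat as a binomial proportion."""
    return (lambda_hat * (1 - lambda_hat) / n) ** 0.5 / abs(2 * p - 1)

# Illustration: if about 76.7% answer "both the same" with p = 1/12,
# the implied prevalence is about 18%.
pi_hat = crosswise_estimate(0.7667, 1 / 12)
print(round(pi_hat, 3))  # prints 0.18
```

Note the trade-off in the standard error: the further p is from 0.5, the smaller the variance inflation, but the weaker the privacy protection.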

pouring water into the wine – the advantages of the crosswise model for asking sensitive questions revisited

  • it's easier to implement in self-administered surveys, no extra randomization device necessary, cognitive burden is lower, no self-protective answering strategies
  • blood donation rates – the direct question says 12% but crosswise says 18%
  • crosswise model had a much higher answering time, even after dropping extraordinarily slow people
  • the model has some weaknesses; the less-is-better approach is good for determining whether the crosswise model works
  • do people understand the instructions and do they specifically follow those instructions
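A small Monte Carlo sketch (my own, with assumed numbers echoing the blood-donation figures above) shows how the design recovers a true prevalence even though no individual respondent reveals their own answer:

```python
import random

def simulate_crosswise(n, true_prev, p, seed=42):
    """Simulate n crosswise respondents and recover the prevalence.

    Each respondent truthfully reports only whether their answers to the
    sensitive question (prevalence true_prev) and the non-sensitive
    question (yes-probability p) are the same or different.
    """
    rng = random.Random(seed)
    same = 0
    for _ in range(n):
        sensitive = rng.random() < true_prev   # e.g. donated blood
        nonsensitive = rng.random() < p        # e.g. birthday in January
        if sensitive == nonsensitive:
            same += 1
    lambda_hat = same / n
    return (lambda_hat + p - 1) / (2 * p - 1)

# Assumed true prevalence of 0.18 (the crosswise blood-donation figure);
# the estimate comes back close to 0.18 for large n.
est = simulate_crosswise(100_000, true_prev=0.18, p=1 / 12)
print(round(est, 2))
```

With honest, careful respondents the recovery is clean; the talks' worry about instruction comprehension is exactly the assumption this simulation bakes in.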

effects of survey sponsorship and mode of administration on respondents' answers about their racial attitudes

  • used a number of prejudice scales both blatant and subtle
  • no difference in racial attitude measures by interviewer-administration condition
  • blatant prejudice scale showed a significant interaction for type of sponsor
  • matters more when there is an interviewer and therefore insufficient privacy
  • sponsor effect is likely the result of social desirability
  • response bias is in opposite direction for academic and market research groups
  • does it depend on which department does the study – the law department, the sociology department?

impact of survey mode (mail vs telephone) and asking about future intentions 

  • evidence suggests that asking about intent to get screened before asking about screening may minimize over-reporting of cancer screening; it removes the social pressure to over-report
  • people report behaviors more truthfully in self-administered forms than in interviews
  • purchased space ("real estate") on an omnibus survey
  • no main effect for mode
  • in mail mode, asking about intent first was more reflective of reality of screening rates
  • 30% false positive said they had a test but it wasn’t in their medical record
  • little evidence that the intention item affected screening accuracy
  • mailed surveys may have positively affected accuracy – but the mail survey covered one topic whereas the telephone survey was an omnibus
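The record-check logic behind that 30% figure is easy to sketch (illustrative data only, not the study's): compare each self-report against the medical record and count false positives among those claiming a test.

```python
def false_positive_rate(self_reports, records):
    """Share of respondents who claimed a screening test that does not
    appear in their medical record (false positives among claimers)."""
    claimed = [rec for sr, rec in zip(self_reports, records) if sr]
    if not claimed:
        return 0.0
    return sum(1 for rec in claimed if not rec) / len(claimed)

# Hypothetical data: 10 respondents claim a test; 3 of those claims are
# absent from the record, i.e. a 30% false-positive rate.
self_reports = [True] * 10 + [False] * 5
records      = [True] * 7 + [False] * 3 + [False] * 5
print(false_positive_rate(self_reports, records))  # prints 0.3
```

The same comparison run the other way (tests in the record but not claimed) would give the false-negative side, which the talk did not break out.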

effect of socio-demographic (mis)match between interviewers and respondents on the data quality of answers to sensitive questions

  • theory of liking, some say matching improves chances of participation, may also improve disclosure and reporting, especially gender matching
  • the current study matched within about five years of age as opposed to arbitrary cut-off points
  • also matched on education
  • male interviewer to female interviewee had lowest response rate
  • older interviewer had lower response rate
  • no effects for education
  • income had the most missing data; parents' education was the next highest, likely because education from 50 years ago was different and you'd have to translate it; political party also had a high missing rate
  • if female subject refuses a male interviewer, send a female to try to convince them
  • it’s easier to refuse a person who is the same age as you [maybe it’s a feeling of superiority/inferiority – you’re no better than me, i don’t have to answer to you]
  • men together generate the least item non-response
  • women together might get too comfortable together, too chatty, more non-response, role-boundary issue
  • age matching is less item non-response
  • same education means less item non-response; why do interviewers allow more item non-response when their respondent has a lower education?
