
Eye Tracking in Survey Research #AAPOR 

Moderator: Aaron Maitland, Westat; Discussant: Jennifer Romano Bergstrom, Facebook 

Evaluating Grid Questions for 4th Graders; Aaron Maitland, Westat

  • Used to study cognitive processing
  • Does the processing of questions change over the course of the survey?
  • 15 items, 5 content areas about learning, school, tech in school, self-esteem
  • 15 single items and 9 grid items
  • Grid questions are not more difficult; only the first grid takes extra consideration/fixation
  • Double negatives had much longer fixations, as did difficult words
  • No preference was expressed for either type of question


Use of Eye-tracking to Measure Response Burden; Ting Yan, Westat; Douglas Williams, Westat

  • Burden is normally measured by the length of the interview or the number of questions or pages, but these aren’t really burden
  • Attitudes (interest, importance) are a second option, but that’s also not burden; could ask people whether they are tired or bored
  • Pupil dilation is a potential measure: pupils dilate when people recall from memory, pay close attention, or think hard; these changes are small and involuntary and related to memory load
  • 20 participants, 8 minute survey, 34 target questions, attitude and behavioural questions, some hard or easy
  • Asked self-reported burden on 4 items – how hard is this item, how much effort did it take to answer
  • Measured pupil diameter at each fixation; baseline diameters differ by person, so they used dilation instead – percentage over a baseline measure – looking at both average and peak dilation
  • Dilation was greater for hard questions; peak dilation was 50% larger for hard questions, statistically significant though the raw numbers seem very small
  • Breakoffs were visible on the questions with more dilation
  • Though dilation was sometimes not consistent with breakoffs
  • Self report did correlate with dilation 
  • Can see people fixate on question many times and go back and forth from question to answer
  • Question stems caused more fixation for hard questions 
  • Eye tracking removes the bias of self-report; more robust
  • Can we use this to identify people who are experiencing too much burden? [imagine using this during an interview; you could find out which candidates were having difficulty answering questions]



The Effects of Pictorial vs. Verbal Examples on Survey Responses; Hanyu Sun, Westat; Jonas Bertling, Educational Testing Service; Debby Almonte, Educational Testing Service

  • Survey about food, asked people how much they eat of each item
  • Shows visual or verbal examples 
  • Measured mean fixation
  • Mean fixation was higher for pictorial examples in all cases, with more time spent on the pictures than on the task; they think it’s harder when people see the pictures [I’d suggest the picture created a limiting view of the product rather than a general view of ‘butter’, which makes interpretation more difficult]
  • No differences in the answers
  • Fixation times suggest a learning curve for the questions; they got easier over time
  • Pictorial requires more effort to respond 


Respondent Processing of Rating Scales and the Scale Direction Effect; Andrew Caporaso, Westat

  • Some people suggest never using a vertical scale
  • Fixation – a pause of the eye on one spot
  • Saccade – the rapid movement between fixations
  • Respondents don’t always want to tell you they’re having trouble
  • 34 questions, random assignment of scale direction
  • Scale directions didn’t matter much at all
  • There may be a small primacy effect with a longer scale; respondents with lower education may be more susceptible
  • Fixations decreased over time
  • The top of the scale gets the most attention and the bottom gets the least [so people are figuring out what the scale is; you don’t need to read all five options once you know the first one, particularly for an agreement scale, where people can guess all the answers from the first answer]
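The fixation/saccade distinction in these notes is usually operationalized with a velocity threshold: slow gaze samples belong to a fixation (pausing), fast ones to a saccade (rapid movement between pauses). Here is a minimal sketch of that idea; the threshold, sampling interval, and gaze points are illustrative assumptions, not values from the talk.

```python
def classify_samples(points, dt, threshold=100.0):
    """Label each gaze sample 'fixation' or 'saccade' by velocity.

    points: list of (x, y) gaze positions in degrees of visual angle
    dt: seconds between consecutive samples
    threshold: velocity cutoff in degrees per second (assumed value)
    """
    labels = ["fixation"]  # first sample has no velocity; assume fixation
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # Euclidean distance travelled divided by elapsed time
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > threshold else "fixation")
    return labels

# Two clusters of nearby points with one large jump between them:
# the jump is classified as a saccade, the rest as fixations.
labels = classify_samples([(0, 0), (0.1, 0), (5, 0), (5.1, 0)], dt=0.01)
```

Consecutive "fixation" samples would then be merged into single fixations, whose durations feed the fixation-time comparisons described above.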

Radical Market Research Idea #7: Participate in an untrusted methodology #MRX

If you get right down to it, I’m a quant. My history is with surveys and quantitative social media research. I have little experience with focus groups or neuroscience or eye tracking or many other respected and mistrusted methodologies. I can criticize the heck out of any of them but then, it really wouldn’t be fair.

But market researchers love to criticize. That’s what the test-control design is set up to do: prove and disprove based on logic and facts. So we criticize methodologies we aren’t familiar with even when we don’t have the facts. Are you one of the people who’s lambasted focus groups for their lack of generalizability? Have you laughed neuroscience studies off the stage for their hocus pocus?

If you’ve never participated in a focus group, commission one now. Participate in the sampling, help write the discussion guide, help lead a group, help write up the results. See for yourself the good and bad that can come from it. Compare the results with those that come from the good and bad of the method you’re most familiar with. Learn something new. Try something new for once. Radical?

 

My Tobii demo, I FINALLY get to try eyetracking! #TMRE #MRX

If you follow my tweets at all, you know I’ve been itching to try out some eyetracking software. Tobii finally gave me the chance to do that at the TMRE. Below are two videos. The first is me watching the commercial, kindly filmed by the Tobii folks. The second is the result of my viewing where you can see what my eyeballs focused on as I watched the commercial. Very neat stuff!

What do people look at on store shelves? On magazine ads? On car dashboards? In the candy store? On the shelf of 18 pies…. Wait… It’s lunch time:)
