Saturday, March 21, 2009

What People See -- in Surveys

An interesting Public Opinion Quarterly article examines how people "see" a questionnaire. The authors use eye-tracking hardware and software to examine, among other things, how respondents view a survey question and its response alternatives (our fancy way of saying the various choices you're given online when asked to answer a survey question).

According to the authors, when all choices are visible, "respondents looked at the options near the top of the list longer than they looked at those at the bottom. In addition, they were more likely to choose the options at the top."

This neat study doesn't look specifically at political knowledge, so I'm going to extrapolate and ponder here just what this tendency might mean for respondents facing questions designed to tap their knowledge of politics and public affairs.

Could this introduce random or even systematic error into responses? Yes. Could this lower our estimates of people's actual knowledge? Probably. Should we put the "right" answer near the top so they can find it? I dunno, but my hunch is no.

I'd love to see this approach applied specifically to political knowledge questions, with the "correct" answer placed in different locations to see whether position matters. All students know that on multiple choice tests, if all else fails, you choose "C." So in survey items that have a "correct" response (versus items just asking your attitude about some policy or issue), would respondents be more likely to seek out that right answer? I think so, but it needs testing.
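For what it's worth, here's the kind of back-of-the-envelope simulation I have in mind. All the numbers below are made up purely for illustration (how many people actually know the answer, how strongly guessers lean toward the top of the list), but it shows how a primacy bias among guessers could inflate or deflate measured knowledge depending on where the correct option sits:

```python
import random

def simulate(n_respondents=10_000, n_options=4, p_know=0.5,
             correct_position=0, primacy_weights=(4, 3, 2, 1)):
    """Toy simulation: respondents who know the answer pick it;
    guessers lean toward options near the top (primacy bias).
    Every parameter here is a made-up illustration, not an estimate."""
    correct = 0
    for _ in range(n_respondents):
        if random.random() < p_know:
            correct += 1                      # knows it, picks it
        else:
            # Guessers favor earlier options in proportion to the weights
            pick = random.choices(range(n_options),
                                  weights=primacy_weights)[0]
            if pick == correct_position:      # a lucky (biased) guess
                correct += 1
    return correct / n_respondents

# Same true knowledge (50%) in both runs; only the answer's position changes
print(simulate(correct_position=0))  # correct answer listed first
print(simulate(correct_position=3))  # correct answer listed last
```

With these made-up numbers, putting the correct answer first yields an apparent knowledge level of about 70 percent, while putting it last yields about 55 percent, even though true knowledge is fixed at 50 percent in both runs. That's the kind of gap a real experiment could look for.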

Unfortunately I don't own any eye-tracking gear. Sigh.
