Pollsters' perplexing problem
We in the US will be awash in polling numbers for the next nine months in the run-up to the presidential election. An article in the New Yorker casts doubt on how representative these polls really are and asks whether they are harming our democracy.
In the 1930s, polling was typically conducted door-to-door and response rates were around 90%. People considered it a civic duty to respond to polls. By the 1980s, response rates were down to 60%. Today they are less than 10%. Polling firms, of course, use statistical weighting to make the sample look representative, but this adds assumptions and noise to the analysis. Although modern general election polling in the US has remained quite accurate so far, pollsters in Israel, Greece and the UK have been dramatically off-target in recent elections.
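To make the weighting idea concrete, here is a minimal sketch of one common approach, post-stratification: each respondent is weighted by how over- or under-represented their demographic group is in the sample relative to the population. The groups and numbers below are invented purely for illustration, not taken from any actual poll.

```python
# Hypothetical share of each age group in the population (e.g., from census data).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Hypothetical share of each age group among the people who actually responded.
respondent_share = {"18-34": 0.10, "35-54": 0.35, "55+": 0.55}

# Weight = population share / sample share: under-represented groups count for
# more, over-represented groups for less.
weights = {
    group: population_share[group] / respondent_share[group]
    for group in population_share
}

print(weights)  # e.g., a young respondent counts ~3x, an older one ~0.64x
```

The catch, and the source of the "noise" above, is that the weights can get very large when a group barely responds at all, so a handful of unusual respondents can swing the weighted estimate.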
What do these kinds of low response rates mean, if anything, for qualitative research? How do we (and our recruiters) make sure that the people we are interviewing are representative in some way? Does this issue even matter as much in qualitative research?
(By the way, if you are interested in a deeper discussion of the topic, the author of the article, Jill Lepore, and statistician and polling analyst Nate Silver of fivethirtyeight.com were guests on NPR’s “Fresh Air” last week.)