It seems that every morning after a primary, the talking heads on television try to parse out the "surprises" that occurred during the previous day's election. So what gives? Do the pollsters consistently make bad predictions? Or is there a reason we sometimes see discrepancies?
The Urban Edge interviewed David Dutwin about the issue. Dutwin is executive vice president at SSRS, a market and survey research firm (which also happens to administer the Kinder Houston Area Survey). Earlier this year, he penned a piece for the Washington Post rebutting some of the recent criticism of political polling, arguing the field isn't just alive and well -- it's vital to democracy. This interview has been edited and condensed for clarity.
What inspired you to write that piece?
There have been lots of articles and op-eds about the failures of political polling, here and in Great Britain. I felt it was a one-sided argument, and nobody was talking about it empirically. It just so happened I was doing a pretty large-scale analysis of bias in public opinion polling over the last 20 years.
What did you find?
For web surveys, sometimes a measure is very unbiased. Sometimes it’s extremely biased. With telephone, there’s always a little bias — but it’s consistent. So there’s this “roll the dice” syndrome with Internet panels. On the web, you have no idea if you’re being accurate or not. And if you’re not, you can be very, very off.
Part of it’s the demographics. For example, web panelists are about 2 percent uninsured, while the U.S. overall is about 17 percent. That’s largely because very few people below the federal poverty level have Internet access. I don’t want to disparage web panels. I use them. But for policy research, it’s just something I’d stay away from.
So phone surveys work, even though they seem dated?
More than half of America no longer has a landline in their household. Cell phone penetration is in the 90 percent range. Samples from cell phone surveys look amazingly accurate. They almost don’t need to be weighted. You get random numbers and dial them, but you have to do it manually for cell phones. You can’t use an autodialer.
Aren’t the response rates low? That's bad, right?
Typically, the media’s public opinion polls have a 7 to 10 percent response rate. But as long as the people who refuse to be interviewed look the same as the people who are interviewed, it doesn’t matter whether the response rate is 60 percent or 2 percent. By and large, there’s no significant difference between those who participate and those who don’t.
It’s not true there are no differences between the groups. But if you envision it as a spectrum, they’re much closer to being identical than not.
Research on this was done 10 years ago, and I’m finding the same things today. Response rates are low, but the people who participate largely look like those who don’t.
What are the biggest misconceptions about this year’s election?
I try to make a strong point that pollsters are going to miss elections. It’s happened in the past, and it will happen in the future. That doesn’t mean we have to declare that polling has no utility or that we can’t do it well. The question is why a poll gets it wrong. There are a few possibilities.
An election poll can come out two or three days before the election, and there’s significant movement. Pollsters thought Trump would win Iowa, then there was serious movement in the last two or three days. The fact is that nobody polled in the last two to three days.
Also, you’re essentially surveying a population that doesn’t exist yet: people who actually show up to vote.
Pollsters have to use likely voter models to make an educated guess, basically, about who will show up and who won’t. Only about 7 percent of registered voters caucus. You’re trying to figure out which 7 percent will show up, which has nothing to do with how well you field the survey.
Elections have been more volatile than ever, in terms of who shows up. In the last two presidential elections, for example, you had unprecedented numbers of non-whites show up. Then the midterm was unprecedented in terms of the number of elderly voters. So what assumptions do you make about what the electorate will look like this time? Nobody is sure until it happens.
How do you feel about news channels using polls to determine who gets to be on the main stage of the debates?
I’m perfectly comfortable with that. We’ve never had a field so large that we had to do that. What could possibly be more democratic? We have 40 years of scientific evidence that polls can obtain quite accurate representations of the American public.
That’s the science of polling. It can be done amazingly accurately. I think it’s one of the most democratic things that happened in this country… polls are this great voice of the common person.
You think polls contribute to democracy?
The early founders of the industry felt strongly about that. How else do Americans voice their opinions? You can write to Congress or volunteer in a campaign. Many of us don’t have the time or energy to do that. So if you get called, don’t hang up. This is a unique opportunity to participate in democracy. And you don’t even have to leave your home.
It’s unfortunate that when politicians don’t like the results of polls, they bash them. But then, the second a poll gives them an answer they like, they think it’s the greatest thing since sliced bread.
What should people know about polls to be more educated consumers of them?
There are good polls and bad polls. Internet polls are definitely getting better. They’re shown to do pretty well with political horserace measurement. But it’s still one of those things to be careful of. That would be the first thing I’d look at — is it an Internet or telephone poll?
And you need to understand that primary polling inherently has a much greater degree of inaccuracy than general election polling. Primary polls tend to be wrong. The estimates they get are often 7 to 8 percentage points off from the final election results. General election polls — for senator, governor, and president — tend to be a little above 2 percentage points off from the exact margin for both candidates.