Here’s Why All the Polling Models Are Probably Right


Paul Waldman describes the Bizarro-world war on polling guru Nate Silver that’s gained steam over the past week:

In the last few days, we’ve seen a couple of different Silver narratives emerge as attention to him has increased. First, you have stories about how liberals are obsessing over Silver, “clinging” to him like a raft in a roiling sea of ambiguous poll data. Then you have the backlash, with conservatives criticizing him not because they have a specific critique of the techniques he uses, but basically because they disagree with his conclusions….Then you’ve got the reporter backlash. At Politico, Dylan Byers raised the possibility that Silver would be completely discredited if Mitt Romney won, because “it’s difficult to see how people can continue to put faith in the predictions of someone who has never given that candidate anything higher than a 41 percent chance of winning.”

This whole thing is deeply weird. Could Nate be wrong? Sure. Ditto for Sam Wang and Drew Linzer and all the rest of the poll modelers out there. But if they’re wrong, it probably won’t be because of their models. After all, with minor differences they all do the same thing: average the publicly available state polls, figure out who’s ahead in each state, and then add up the electoral votes they get for each candidate. Sure, they all toss in a little bit of mathematical secret sauce, but not really all that much. You could do the same thing if you felt like it. Want to know who’s ahead in Ohio? Go add up the five latest polls and then divide by five. Voila. You are now your own Nate Silver.
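If you want to see just how little secret sauce that takes, here's a minimal Python sketch of the average-and-tally approach. The poll numbers are made up purely for illustration; only the electoral vote counts are real 2012 values.

# A toy version of the poll-averaging approach described above.
# The polls below are invented for illustration, not real data.

# Hypothetical recent polls for each state: (Obama %, Romney %)
polls = {
    "Ohio": [(50, 47), (49, 46), (48, 48), (51, 46), (49, 47)],
    "Florida": [(47, 49), (48, 48), (46, 49), (48, 47), (47, 48)],
}

# Electoral votes per state (actual 2012 apportionment)
electoral_votes = {"Ohio": 18, "Florida": 29}

totals = {"Obama": 0, "Romney": 0}
for state, results in polls.items():
    # Average the latest polls: add them up, divide by the count.
    obama_avg = sum(o for o, r in results) / len(results)
    romney_avg = sum(r for o, r in results) / len(results)
    # Whoever leads the average gets all the state's electoral votes
    # (a real model would need a tiebreaker and an uncertainty estimate).
    leader = "Obama" if obama_avg > romney_avg else "Romney"
    totals[leader] += electoral_votes[state]
    print(f"{state}: Obama {obama_avg:.1f}, Romney {romney_avg:.1f} -> {leader}")

print(totals)

That's the whole skeleton. The modelers' refinements (pollster house effects, trend lines, simulated error) sit on top of this, but the core really is averaging and adding.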

Needless to say, though, the poll modelers are only as good as the polls they use. If the pollsters are systematically wrong, then the models will be wrong. And while there are a few small sources of potentially systematic bias (not calling cell phones, demographic weighting, etc.), by far the biggest is the pollsters’ likely voter screens. But even here, with one or two exceptions, this is pretty simple stuff. Most pollsters just ask a question or two that go something like this:

  • Are you planning to vote?
  • How sure are you that you’ll vote?
  • Really? Honest and truly?

That’s about it. If you tell them you’re highly likely to vote, they mark you down as a likely voter. If not, they don’t. There’s no rocket science here.
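In code, the screen is about as simple as it sounds. Here's a toy sketch; the 0-to-10 likelihood scale and the cutoff are hypothetical stand-ins for whatever wording a given pollster actually uses.

# A toy likely voter screen: keep only respondents who say they're
# highly likely to vote. The scale and threshold are hypothetical.
respondents = [
    {"candidate": "Obama", "likelihood": 9},    # self-reported, 0-10
    {"candidate": "Romney", "likelihood": 10},
    {"candidate": "Obama", "likelihood": 4},    # screened out below
]

likely_voters = [r for r in respondents if r["likelihood"] >= 8]
print(likely_voters)  # the third respondent doesn't make the cut

The point is that the screen hinges entirely on what respondents say about themselves, which is why systematic lying is the one failure mode that would break everything at once.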

So if the modelers are wrong, it will probably be because the pollsters were systematically wrong. And if the pollsters are systematically wrong, it will probably be because this year, for some reason, people started lying about their likelihood of voting. And while anything’s possible, I sure can’t think of any reason why this year there would be a sudden change in how truthful people are about their intention to vote.

That’s what this whole controversy comes down to. Conservatives seem to be convinced that Democrats simply won’t turn out in high enough numbers to reelect Obama. A fair number of liberals fear the same thing. But there’s no analytic reason to believe this. The Obama campaign’s ground game seems to be as good as any in the business, and Obama voters are telling pollsters that they’re likely to vote in big enough numbers to give him the key swing states he needs to win. That’s the current state of our knowledge. It might be wrong, but if it is, the question isn’t going to be why Nate Silver went astray. The question is going to be, why was 2012 the year when people suddenly started lying to telephone pollsters?

UPDATE: Asawin Suebsaeng has a roundup of all the prognosticators here. It’s a nice CliffsNotes version of who’s who and what they’re saying.
 
