The time to analyze the state pollsters is now. Here's how we did it. For each pollster that ran at least five state presidential polls released after Oct. 1, each state polled by that pollster was examined separately. For each state-pollster combination, only the last poll was used. That poll was compared to the final result in the state. If, for example, a pollster said Obama was +3 in some state and the final result was Obama +1, the error was recorded as +2 (biased toward Obama). If the election result was Obama +5, then the pollster's score was -2 (biased toward Romney).
For each pollster, the scores across all the states it polled were averaged to compute a bias score and an error score. Bias is how much the average across states was skewed toward Obama or Romney. Error is the mean absolute value of how far off the pollster was. For example, a pollster who overrated Obama by 10 points in half the states and underrated him by 10 points in the other half would have a bias of 0. The mean error would be 10, however, since every result was off by 10 points. Such a pollster was not very accurate, but didn't favor either candidate.
In contrast, a pollster who overrated Obama by 1 point in 5 states and who was spot on in 5 states would have a bias of +0.5 and also a mean error of 0.5. Although this pollster has a slight bias, it is clearly better than the first one, which has no bias but is wildly inaccurate.
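The scoring described above is easy to sketch in a few lines of Python. This is our own illustration, not anyone's official code; the function name and data layout are invented for the example, and the two sample pollsters are the hypothetical ones from the text:

```python
from statistics import mean

def pollster_scores(polls):
    """Score one pollster from its final state polls.

    `polls` is a list of (predicted, actual) Obama margins in
    percentage points, one pair per state; positive numbers favor
    Obama, negative numbers favor Romney.
    """
    errors = [predicted - actual for predicted, actual in polls]
    bias = mean(errors)                        # signed: + leans Obama, - leans Romney
    mean_error = mean(abs(e) for e in errors)  # unsigned: how far off, on average
    return bias, mean_error

# The two hypothetical pollsters from the text:
wild = [(10.0, 0.0)] * 5 + [(-10.0, 0.0)] * 5   # off by 10 points both ways
close = [(1.0, 0.0)] * 5 + [(0.0, 0.0)] * 5     # +1 in five states, exact in five
print(pollster_scores(wild))    # (0.0, 10.0): no bias, but wildly inaccurate
print(pollster_scores(close))   # (0.5, 0.5): slight bias, far more accurate
```

Note that the signed errors in `wild` cancel out, which is exactly why bias alone can't measure accuracy; the absolute values in `mean_error` can't cancel.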
Here are the numbers. Remember that negative numbers mean a bias toward Romney and positive numbers mean a bias toward Obama.
| Pollster | States polled | Bias | Mean error |
| --- | --- | --- | --- |
| U. of New Hampshire | 2 | -4.5% | 4.5% |
| Pulse Opinion Research | 5 | -2.6% | 2.6% |
| Pharos Research Group | 7 | +1.3% | 3.0% |
Let's go over this to make it clearer. The University of New Hampshire polled only two states after Oct. 1: New Hampshire and Massachusetts. In New Hampshire, it predicted a 3-point Obama win. He won by 6. In Massachusetts, it predicted a 17-point Obama win. He won by 23. So the mean bias is (-3 + -6)/2 = -4.5. The mean error is the average of 3 and 6, or 4.5. Now consider ORC International (used by CNN). Its Colorado, Florida, and Ohio final polls were biased by -2, -2, and +1 respectively, for an average bias of -1.0, a small pro-Romney bias. The mean error, however, is (2 + 2 + 1)/3 = 1.7.
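Both worked examples can be checked directly. The margins below are the ones quoted above (for ORC International the text gives the per-state errors rather than the raw margins, so we use those as-is):

```python
from statistics import mean

# University of New Hampshire: predicted minus actual Obama margins.
unh_errors = [3.0 - 6.0, 17.0 - 23.0]        # New Hampshire, Massachusetts
print(mean(unh_errors))                      # -4.5 (bias, leaning Romney)
print(mean(abs(e) for e in unh_errors))      # 4.5 (mean error)

# ORC International: per-state errors quoted in the text (CO, FL, OH).
orc_errors = [-2.0, -2.0, 1.0]
print(mean(orc_errors))                      # -1.0 (bias)
print(round(mean(abs(e) for e in orc_errors), 1))  # 1.7 (mean error)
```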
The bottom line is that the least biased pollster was ORC International, which overestimated Romney by 1%. ORC International was also the most accurate pollster, with a mean error of only 1.7%. Good job ORC International. (Note to CNN executives: hire them next time.)
Also noteworthy is that it is hard to find an article about the North Carolina firm PPP that doesn't describe it as "Democratic leaning." Actually, it had a 1.2% bias toward Romney: it overestimated his performance. Rasmussen also had a Republican bias, and it was twice as large as PPP's. Only three pollsters, Angus Reid, Pharos, and Zogby, were biased toward Obama, and then not by very much. The margin of error in most state polls is 3-4%, so a pollster with a mean error below 4% was pretty much on target. Only the University of New Hampshire and ARG fell off the boat.
Finally, all the comments from right-wing pundits that the pollsters were biased toward Obama were nonsense. Of the 14 major pollsters in our study, 11 had a Republican bias. Only three small pollsters had a Democratic bias.
This is pretty close to the end for 2012. Maybe we'll make up a page with the 2014 Senate races and maybe one on the 2016 candidates, but other than that, we've about covered 2012 now. If you want to see if there has been a posting, the easiest way is to subscribe to the Twitter or RSS feeds. Every posting is announced there. Thanks for visiting. Maybe we'll be back in 2014, maybe not. No promises.