Some people think that certain pollsters cook the books to make their party look better. This page provides some data to help you form an opinion on the subject.

Maps per Pollster

One way to examine this issue is to make a different map for each pollster to see if there are aberrations. The links below list the major multistate pollsters. Clicking on a link takes you to a map showing the most recent poll in each state that pollster works in. A white state indicates that the pollster has not conducted an election survey there in 2004. Before condemning a pollster for having data that differ from the current polls, check the date of the poll by mousing over the state. It may be old.

American Research Group
Quinnipiac University
Research 2000
Strategic Vision
Survey USA

Data per pollster in .csv format collected in a .zip file.

State-by-State Comparison of Pollsters

For some of the battleground states, multiple pollsters have been conducting surveys. It is instructive to examine charts of different pollsters for the same state to see how they compare. The graphs below cover only states in which at least two pollsters conducted at least three polls each.

Florida (American Research Group, Gallup, Rasmussen, Zogby)
Iowa (Strategic Vision, Zogby)
Michigan (Rasmussen, Strategic Vision, Survey USA, Zogby)
Minnesota (American Research Group, Rasmussen, Strategic Vision, Zogby)
North Carolina (Rasmussen, Research 2000)
Ohio (American Research Group, Gallup, Rasmussen, Zogby)
Washington (Survey USA, Zogby)
West Virginia (American Research Group, Zogby)

Predicted Versus Actual Results

A spreadsheet comparing the most recent poll from each of eight pollsters with the final tallies is available in Excel and .csv format. There are various ways to interpret the data. The most technical is to see whether the actual results fell within the margin of error. For the state polls, the MoE is generally about 3.5%. Thus, in a purely technical sense, if a pollster predicts that candidate A will beat candidate B in a state 51% to 49% and the reverse happens, the pollster is actually correct, since the final result is within the margin of error. Anyone wanting to do this analysis should download the spreadsheet, as the data is there. Just assume the MoE was 3.5%.
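The technical check described above is easy to automate. Here is a minimal sketch; the helper name and the poll numbers are illustrative, not from the spreadsheet, and it assumes a flat 3.5-point MoE as suggested.

```python
MOE = 3.5  # assumed margin of error, in percentage points, for all state polls

def within_moe(predicted, actual, moe=MOE):
    """True if each candidate's actual share is within the MoE of the prediction."""
    return all(abs(p - a) <= moe for p, a in zip(predicted, actual))

# Example: pollster says A beats B 51-49, but B wins 51-49.
print(within_moe((51, 49), (49, 51)))  # True: technically within the MoE
```

By this standard a poll can "call the wrong winner" and still count as correct, which is exactly the objection raised below.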

However, many people find it unsatisfactory to say that a pollster got it right when the predicted winner lost. To satisfy these people, the spreadsheet presents an alternative analysis, counting the number of states each pollster called correctly. Columns B and C give the actual Kerry and Bush percentages for each state. Column D indicates who won the state. Then come 24 columns, three for each of the eight pollsters, giving their final prediction and the predicted winner. States that the pollster got wrong are marked in gray. The percentage of states the pollster got right is given below the states.
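The alternative scoring above amounts to comparing predicted winners against actual winners and reporting a hit rate. A small sketch, using made-up illustrative data rather than the real spreadsheet contents:

```python
def grade(predictions, actual_winners):
    """Both arguments map state -> winning candidate; score only states polled."""
    states = [s for s in predictions if s in actual_winners]
    right = sum(predictions[s] == actual_winners[s] for s in states)
    return right, len(states), 100.0 * right / len(states)

# Hypothetical example data
actual = {"FL": "Bush", "OH": "Bush", "MI": "Kerry", "MN": "Kerry"}
preds  = {"FL": "Bush", "OH": "Kerry", "MI": "Kerry", "MN": "Kerry"}
right, total, pct = grade(preds, actual)
print(f"{right}/{total} states right ({pct:.0f}%)")  # 3/4 states right (75%)
```

Note that each pollster is graded only on the states it actually polled, which is why the denominators differ from pollster to pollster.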

Using this analysis, the best pollster was Rasmussen, who correctly called all 32 states he polled. What is even more amazing is that Rasmussen uses a robodial system that is completely automated: a computer announces itself and then says (in effect) "Press 1 for Bush, press 2 for Kerry." Despite the danger of noncitizens, children, and other nonvoters answering the poll, Rasmussen got every state right. One factor to consider is that most Rasmussen polls spanned 7 or 14 days, versus a typical 3 days for the other pollsters. Rasmussen gets an A+.

Next in line were Survey USA with 97% right and Mason-Dixon with 96% right. They get an A. American Research Group (92% right) and Zogby (91% right) get an A-. Zogby made some predictions on television that were off, but his polls (as opposed to his personal predictions) were pretty good.

Gallup, once the dean of American pollsters, got only 80% right, a B-. Finally, Strategic Vision (R) got only 64% right, an F. Strategic Vision should not be taken seriously in the future.

Back to the main page.