Presidential primary 2008 polls: What went wrong

Mar 30, 2009

University of Michigan survey experts working with the American Association for Public Opinion Research have identified several reasons polls picked the wrong winners in the 2008 presidential primaries.

The study is believed to be the most comprehensive analysis ever conducted of presidential primary polls.

"The most jarring element of the presidential primary polling was that polls picked the wrong winner in New Hampshire," said U-M polling expert Michael Traugott, who chaired the AAPOR committee composed of leading academic and private sector experts in public opinion and survey research. "We wanted to find out why."

The results of the committee's analysis show that a handful of methodological missteps and miscalculations combined to undermine the accuracy of predictions about presidential primary winners in New Hampshire and three other states.

One source of error the researchers were able to eliminate was the so-called 'Bradley Effect,' in which people say they support a Black candidate in order to appear unbiased, but then cast their ballots for a white candidate in the privacy of the voting booth.

"Many New Hampshire polls predicted would beat Hillary Clinton in that state," said Traugott. "So when Clinton won, some people pointed to latent racism as the reason. But in the data we have from a wide variety of New Hampshire pre-election and exit polls, we found no evidence that whites over-represented their support for Obama."

For the report, supported in part by a grant from the U-M Institute for Social Research (ISR), Traugott and colleagues analyzed individual, household-level response data provided by seven polling organizations. They also compared information on question wording, weighting, interviewer characteristics, sampling frames, and other methodological issues from up to 19 other firms, in many cases relying on publicly available information gleaned from the Internet.

"This analysis suggests some important explanations for the errors in the 2008 New Hampshire Presidential Primary and raises significant questions about pre-election polling methods," said Richard Kulka, AAPOR president.

"The materials we received from polling organizations showed that there was much more variation in the methodology of pre-election polls than I ever imagined there would be," Traugott said.

The committee analyzed performance in four primary states: Wisconsin, South Carolina, California, and New Hampshire. Although the limited data they received made it impossible to conduct definitive tests of all likely sources of different poll performance, the following factors were identified as the most likely reasons the polls got it wrong:

• The New Hampshire primaries occurred only five days after the Iowa caucuses, truncating the polling period in New Hampshire.

• Most commercial polling firms conducted interviews on the first or second call attempt, but respondents who required more effort to contact were more likely to support Clinton. Instead of reworking their initial samples to reach these hard-to-contact people, pollsters typically added new households to the sample, skewing the results toward the opinions of those who were easy to reach by phone and who typically supported Obama (a simple simulation of this bias follows the list).

• Non-response patterns, identified by comparing characteristics of the pre-election samples with the exit poll samples, suggest that some groups that supported Clinton, such as union members and those with less education, were under-represented in pre-election polls, possibly because they were more difficult to reach.

• The order of candidate names on state primary ballots likely contributed to increased support for Clinton in New Hampshire, where her name appeared near the top of a long list of names and Obama's appeared near the bottom.
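As a rough illustration of the call-back issue, the sketch below simulates a hypothetical electorate in which hard-to-reach voters lean toward one candidate. A poll that stops after one or two call attempts and tops up its sample with fresh, easy-to-reach households overstates the other candidate's support. The population shares, contact probabilities, and candidate preferences are invented for illustration and are not figures from the AAPOR report.

```python
import random

random.seed(1)

# Hypothetical electorate: "easy" voters answer early call attempts,
# while "hard" voters usually need several call-backs. All shares and
# contact probabilities below are assumptions made for this sketch.
P_HARD = 0.4           # assumed share of hard-to-reach voters
SUPPORT_A_EASY = 0.55  # easy-to-reach voters lean toward candidate A
SUPPORT_A_HARD = 0.45  # hard-to-reach voters lean toward candidate B

def draw_voter():
    """Return (is_hard_to_reach, supports_A) for one simulated voter."""
    hard = random.random() < P_HARD
    p_a = SUPPORT_A_HARD if hard else SUPPORT_A_EASY
    return hard, random.random() < p_a

def poll(sample_size, max_attempts):
    """Keep dialing new households until sample_size interviews are
    completed, giving each household at most max_attempts call attempts.
    Households never reached are simply replaced by fresh ones, mirroring
    the practice the committee flagged."""
    completes = []
    while len(completes) < sample_size:
        hard, supports_a = draw_voter()
        p_contact = 0.15 if hard else 0.7   # per-call answer probability
        if any(random.random() < p_contact for _ in range(max_attempts)):
            completes.append(supports_a)
    return sum(completes) / len(completes)

true_share_a = (1 - P_HARD) * SUPPORT_A_EASY + P_HARD * SUPPORT_A_HARD
print(f"true support for candidate A:  {true_share_a:.3f}")
print(f"poll with 2 call attempts:     {poll(5000, 2):.3f}")
print(f"poll with 8 call attempts:     {poll(5000, 8):.3f}")
```

In this toy setup the two-attempt poll overstates candidate A's support by a couple of points relative to the true split, and the gap shrinks as the number of call-backs grows, which is the pattern the committee described.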

Traugott noted that the analysis also revealed wide variation in the primary question respondents were asked: the so-called trial heat question about which candidate they prefer in the coming election. In New Hampshire, 11 different question wordings were used in the Democratic primary and 10 in the Republican primary. In some versions, the candidates' names were not mentioned at all. In others, only the "major" candidates were named. Some polls randomized the order of the candidates' names; a brief sketch of that practice appears below.
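The report does not describe any particular implementation of name-order randomization; the sketch below simply shows one common way a firm might rotate the order in which candidate names are read to each respondent so no candidate systematically benefits from being heard first. The candidate list, question wording, and function name are placeholders, not material from the study.

```python
import random

# Placeholder candidate list and question wording; the 2008 New Hampshire
# polls used many different wordings, as the committee documented.
CANDIDATES = ["Clinton", "Edwards", "Obama", "Richardson"]
QUESTION = ("If the Democratic primary were held today, "
            "would you vote for {names}?")

def trial_heat_question(candidates, rng=random):
    """Build the trial-heat question with the candidate names read in a
    freshly randomized order for each respondent."""
    order = candidates[:]   # copy so the master list stays in ballot order
    rng.shuffle(order)
    names = ", ".join(order[:-1]) + ", or " + order[-1]
    return QUESTION.format(names=names), order

# Each simulated respondent hears the names in an independent random order.
for respondent_id in range(3):
    text, order = trial_heat_question(CANDIDATES)
    print(respondent_id, order)
```

Randomizing the order for each respondent averages out any first-name advantage within the poll itself; the printed ballot, by contrast, lists names in one fixed order, which is why a ballot-order effect like the one noted above can still appear on election day.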

"We also learned that some polling firms are buying lists of registered voters with phone numbers, and then they are contacting people with interactive voice response technology---basically computerized calls---and that they're taking information from the person who answers the phone which may or may not be the person identified in the sample," Traugott said. "This should be a focus of further research." Another firm interviewed any registered voter in the household.

Source: University of Michigan
