Are Election Poll Results Accurate?

5 Tips to Understanding Public Opinion Surveys


There's a popular saying on the campaign trail: The only poll that matters is on Election Day. You'll typically hear that sort of dismissal of election poll results from candidates who appear to be losing.

Do they have a point? How much stock should you put in election poll results?

Polls are a staple of every election year. Dozens of private firms, media outlets, and academic institutions publish election poll results every campaign cycle. But reading election poll results can sometimes be confusing, especially if you're not familiar with the terminology and methodology.

While they can seem like just a jumble of unintelligible numbers, polls are very useful in gauging public opinion at a specific point in time. But before you try to read too much into a particular poll, keep in mind these important questions.

Who Conducted the Election Poll?

This is perhaps the most important question to ask before delving into any election poll results. Was it a university? A media outlet? A private polling firm? Whoever conducted the poll should have a reliable track record.

Some of the most prominent and reliable firms that publish election poll results are Gallup, Ipsos, Rasmussen, Public Policy Polling, Quinnipiac University, and media outlets including CNN, ABC News and The Washington Post.

Be extremely skeptical of polls paid for by political parties or campaigns. They can be easily skewed to favor their candidates. Some "polls" are actually nothing more than political ads bought and created by campaigns.

Did the Pollster Disclose the Methodology?

First off: What's a methodology? It's a fancy term for the specific procedures used in conducting an election poll. Don't trust election poll results from an outfit that hasn't disclosed its methodology. Learning how pollsters arrived at their numbers is just as important as the numbers themselves.

The methodology will explain, for example, whether the pollster sampled just land line telephone users or called cell phone numbers as well. The methodology should also disclose how many people were questioned in the poll, their party affiliations, the dates they were contacted and whether a real interviewer was on the line with the respondent.

Here is what a thorough methodology disclosure looks like:

"Interviews are conducted with respondents on landline telephones and cellular phones, with interviews conducted in Spanish for respondents who are primarily Spanish-speaking. Each sample includes a minimum quota of 400 cell phone respondents and 600 landline respondents per 1,000 national adults, with additional minimum quotas among landline respondents by region. Landline telephone numbers are chosen at random among listed telephone numbers. Cell phone numbers are selected using random-digit-dial methods. Landline respondents are chosen at random within each household on the basis of which member had the most recent birthday."

Margin of Error

The term margin of error seems pretty self-explanatory. Election polls survey only a small portion, a statistical sample, of the population. The margin of error describes how confident a pollster is that the survey of the smaller sample reflects the sentiment of the entire population.

The margin of error is expressed in percentage points.

For example, a 2012 Gallup poll measuring support for President Barack Obama and Republican Mitt Romney sampled 2,265 registered voters and had a margin of error of +/- 3 percentage points. The poll found that Romney had support from 47 percent, and Obama had support from 45 percent.

When the margin of error is factored in, the election poll results showed a dead heat between the two candidates. The 3-point margin of error means that Romney could have had support from as much as 50 percent or as little as 44 percent of the population, and that Obama could have had support from as much as 48 percent or as little as 42 percent of the population.

The more people who are polled, the smaller the margin of error will be.
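That inverse relationship can be sketched with the standard textbook formula for a proportion. This is a simplification: it assumes a simple random sample, while published margins (like Gallup's ±3 points above) are typically a bit larger because they also account for weighting and design effects.

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case margin of error for a proportion at roughly 95%
    confidence (z = 1.96), assuming a simple random sample and the
    most conservative case, p = 0.5."""
    return z * math.sqrt(0.25 / n)

# Doubling the precision requires roughly quadrupling the sample size.
for n in (500, 1000, 2000, 4000):
    print(n, round(100 * margin_of_error(n), 1))
# 500  -> about 4.4 points
# 1000 -> about 3.1 points
# 4000 -> about 1.5 points
```

Note the diminishing returns: going from 1,000 to 4,000 respondents only shaves the margin from about 3 points to about 1.5, which is why most national polls settle on samples of around 1,000 to 2,000 people.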

Are the Questions Fair?

Most reputable polling firms will disclose the exact wording of the questions they ask. Be skeptical of election poll results that are published without disclosing the questions. Question wording can easily skew results or introduce bias into polls.

If the wording of a poll seems to paint a particular political candidate in a harsh or negative light, it is likely a "push poll." Push polls are designed not to measure public opinion but to influence voter opinion.

Pay close attention to the order in which the questions were asked, too. Be cautious about election poll results that come from a survey asking respondents about controversial issues just before asking them their opinion about a particular candidate.

Registered Voters or Likely Voters?

Pay attention to whether the survey screened respondents for being registered to vote and, if so, how likely they are to vote. Election poll results based on a sample of all adults are less trustworthy than those based on registered or likely voters.


While polls based on responses from people who say they're likely to vote are believed to be more accurate, pay attention to how close to the election they are conducted. Many voters can't say with much certainty whether they will vote in an election six months from now. But if they're asked two weeks before an election, that's a different story.

Explains the Pew Research Center:

"One of the most difficult aspects of conducting election polls is determining whether a respondent will actually vote in the election. More respondents say they intend to vote than actually will cast a ballot. As a consequence, pollsters do not rely solely upon a respondent’s stated intention when classifying a person as likely to vote or not. Most pollsters use a combination of questions that measure intention to vote, interest in the campaign and past voting behavior."