Why should I believe your poll?

It’s a challenge pollsters and news organizations get all the time. And why not — the profusion of polls can be confusing. This week alone, one USA Today-Gallup Poll showed Barack Obama leading John McCain in the presidential race by 3 percentage points and a separate Gallup Poll showed Obama up by 8 points.

This year’s presidential race may produce more skeptics than usual, considering Obama’s bid to become the first African-American president of a nation where racial attitudes can be complex and concealed.

Some questions and answers about attempts to read public sentiment:

Q: Don’t people lie to pollsters?

A: Some do, but pollsters believe outright lying is rare and say people who dislike polls are likelier simply to refuse to answer them. Historically, more people tell pollsters they engage in socially desirable activities like voting than actually do so, and fewer admit to undesirable acts like using illegal drugs than actually commit them. These answers may not be lies; they may reflect people's self-images or selective memories. In an Associated Press-Yahoo News poll in June, 4 percent said they always answer untruthfully when a survey asks an uncomfortable question, and 7 percent said they sometimes do. Twenty-six percent said they sometimes or always skip such questions. Yet 69 percent also said it's important to express an opinion by answering polls.

Q: Won’t some people oppose Obama but not tell pollsters because they worry they may appear racist?

A: That's a question pollsters are spending a lot of time on this year, and one they have studied in the past. In high-profile races pitting white candidates against black candidates in the 1980s and early 1990s, the white candidates often got significantly more support in the actual voting than in pre-election polls. That gap has been far less evident recently, and pollsters debate whether the phenomenon has faded. CBS News examined its own polling and found that people, especially blacks, were likelier to express support for Obama to black interviewers than to white ones, but that was true only in 2007, not this year. People are often reluctant to discuss how racial attitudes affect their voting. To probe those views, pollsters use indirect techniques, such as asking whether interracial dating is acceptable or comparing how people respond to descriptions of fictional candidates when the candidates' races are varied.

Q: How can I believe polls when I never get called?

A: Pollsters conduct plenty of telephone interviews, but don't hold your breath. A typical AP-Ipsos poll interviews about 1,000 adults. The Census Bureau says there are about 228 million adults in the U.S., so the odds of being called for any one such poll are about one in 228,000. That makes it roughly twice as likely as the one-in-500,000 chance the National Safety Council says a person has of dying in a given year by accidentally falling from a building. Also, remember that political polls are not predictions of who will be elected. They are snapshots of sentiment at one moment in a long campaign.
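The arithmetic behind those odds is simple enough to check. This sketch uses only the figures quoted above (1,000 interviews per poll, about 228 million U.S. adults); nothing here comes from any pollster's actual methodology.

```python
# Back-of-the-envelope check of the odds quoted above, using the
# article's own figures; both numbers are rounded approximations.
adults = 228_000_000
interviews_per_poll = 1_000

# Chance of being a respondent in any single poll: 1 in this many.
odds_denominator = adults // interviews_per_poll
print(odds_denominator)  # 228000

# Compare with the 1-in-500,000 annual odds of a fatal fall from a building.
print(round(500_000 / odds_denominator, 1))  # about 2.2 times as likely
```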

Q: Many pollsters don’t interview the growing number of people with cell phones. How can those polls be accurate?

A: It's a growing concern, but not yet a serious problem. According to federal figures, about one in six households has only cell phones, and nearly as many have landlines but seldom answer calls on them, often because the lines are connected to computers. Many pollsters skip cell phones because calling them costs more, in part because federal law bars unsolicited callers like pollsters from dialing cell phones with the automated equipment they usually favor. Cell-only users are likelier to be male, younger, lower income and members of minority groups. So far, studies show their views do not differ significantly from those of similar people with landlines.

Q: How do we know the opinions of 1,000 people reflect what the entire country is thinking?

A: Just as your doctor judges your health by testing only a tiny sample of your blood, pollsters are confident that surveying a relatively small number of people is sufficient, provided it is done right. That means making sure respondents are selected randomly, questions are worded objectively and other criteria are met. The results are "weighted," or adjusted, so they accurately reflect the country's makeup by gender, race and other factors. Done that way, only about one time in 20 should a sample of 1,000 people produce answers that differ by more than 3 percentage points from what polling the entire country would have found.
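The "one in 20" and "3 percentage points" figures come from the standard 95 percent confidence calculation for a sample of 1,000. A minimal sketch of that arithmetic, using the textbook worst-case formula rather than any one pollster's published method:

```python
import math

def margin_of_error(sample_size, z=1.96):
    """95 percent margin of error for a poll proportion, worst case p = 0.5."""
    p = 0.5  # p = 0.5 maximizes the error; this is the convention behind "plus or minus 3"
    return z * math.sqrt(p * (1 - p) / sample_size)

print(round(100 * margin_of_error(1000), 1))  # 3.1 points
```

Doubling the sample to 2,000 only shrinks the margin to about 2.2 points, which is why most national polls stop near 1,000 interviews.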

Q: Why do polls released at the same time sometimes differ?

A: Question wording, the dates of the interviews and whether a survey measures adults, registered voters or likely voters all affect the answers. Pollsters each use their own standards to determine who counts as a likely voter and who is undecided, variations that can have a big impact. The two Gallup polls that gave Obama different leads both covered July 25-27 and both measured the attitudes of registered voters, but they interviewed different numbers of people and had other methodological differences. The contrast in their results, while seemingly substantial, was not statistically significant.
