Thursday, February 11, 2010

Polls Want to Eat Your Opinions, Spit Them Out

Polls, by themselves, are almost completely unreliable. They can be manipulated in so many ways, it's disgusting. Not only can polls make it appear as if a majority feels a certain way when that isn't the reality, but the results of polls can actually sway public opinion itself. Let me point out some examples.

An article I recently stumbled upon in the Albany Times perfectly illustrates an idea I’ve always held: The words you choose make a difference. Two pollsters recently asked New Yorkers whether they support a proposed state sales tax on non-diet sodas. Though the goals of the two polls were similar - to test public opinion of the proposal - the responses they received were very different. That shouldn't come as a surprise, though, given the wording of their questions:
From the Quinnipiac University Polling Institute: "There is a proposal for an 'obesity tax' or a 'fat tax' on non-diet sugary soft drinks. Do you support or oppose such a measure?"

From Kiley & Company: "Please tell me whether you feel the state should take that step in order to help balance the budget, should seriously consider it, should consider it only as a last resort, or should definitely not consider taking that step: 'Imposing a new 18 percent tax on sodas and other soft drinks containing sugar, which would also reduce childhood obesity.' "
It doesn't take a rocket scientist to realize that these questions wouldn't elicit the same answers. No one wants to be taxed for being overweight. And most people would do what they could to help reduce childhood obesity and all of the health risks that come with it.

Asking questions in a manner that purposely tries to elicit an emotional response is just one way that pollsters can sway the results of their polls. As this TPM article points out, simply supplying more or fewer possible options for a respondent to choose from can skew a poll:
Respondents were asked their approval of Obama using Rasmussen's usual format: Do they strongly approve, somewhat approve, somewhat disapprove, or strongly disapprove? The answer here is 47% approval, with 28% strongly approving, to 52% disapproval, including 41% who strongly disapprove.

However, Rasmussen got a different result when they asked the question as a simple "approve" or "disapprove." Obama then enters positive territory at 50% approval, 46% disapproval -- in line with a lot of other polls, such as the Gallup survey.
When given four options versus two, of course there is going to be a difference in the results. Very few people have such black-and-white opinions; there is usually a sliding scale of preference on just about any issue. But when pressed to give an either/or answer for a poll, they're going to say something, and the answer often won't be a very good snapshot of how they really feel.
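
Just to make the mismatch concrete, here's a toy Python sketch using the figures quoted above. (The "somewhat" percentages are inferred by subtracting the "strong" numbers from the totals; this is back-of-the-envelope arithmetic, not Rasmussen's actual data.)

    four_point = {
        "strongly approve": 28,
        "somewhat approve": 19,     # 47% total approval minus 28% strong
        "somewhat disapprove": 11,  # 52% total disapproval minus 41% strong
        "strongly disapprove": 41,
    }

    # Collapse the four-point scale into two buckets.
    collapsed = {
        "approve": four_point["strongly approve"] + four_point["somewhat approve"],
        "disapprove": four_point["somewhat disapprove"] + four_point["strongly disapprove"],
    }

    two_option = {"approve": 50, "disapprove": 46}  # the simple two-option poll

    print(collapsed)    # {'approve': 47, 'disapprove': 52}
    print(two_option)   # {'approve': 50, 'disapprove': 46}
    # Same pollster, same question topic - but the format moved the numbers.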

Interviewer bias and a limited set of available answers are just the tip of the iceberg when it comes to the unreliability of poll results. Even if the questions were perfectly unbiased, the respondents might not understand what they're being asked, tainting their answers. Additionally, the people who analyze the poll data can bring their own biases to the interpretation. There can also be errors in collecting the data, which may further skew the results. Probably most obvious is the fact that polls are usually conducted on small samples, often around 1,000 people - a group that can hardly speak for an entire population.
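
As an aside on that last point, the textbook margin-of-error formula is why pollsters settle on samples of roughly 1,000. Here's a quick sketch - and keep in mind it assumes a perfectly random sample, which real polls rarely achieve, and covers only sampling error, none of the biases above:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """95%-confidence margin of error for a proportion p in a random sample of n."""
        return z * math.sqrt(p * (1 - p) / n)

    # Sampling error ONLY - wording, interviewer, and analysis biases
    # like the ones discussed above are invisible to this number.
    print(f"{margin_of_error(1000):.1%}")  # 3.1% - the typical 1,000-person poll
    print(f"{margin_of_error(100):.1%}")   # 9.8% - why tiny samples tell you little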

But an article by Mark Blumenthal at Pollster.com illustrates a good way to take the temperature of a given population on complex subjects - or really any subject, as long as it's popularly discussed. Speaking about polling on the public option being debated in the national health care overhaul, he says:
When it comes to testing reactions to complex policy proposals, I would rather have 10 pollsters asking slightly different questions and allowing us to compare and contrast their results than trying to settle on a single "perfect question" that somehow captures the "truth" of public opinion. On an issue as complicated and poorly understood as "public option," that sort of polling perfection is neither attainable nor desirable. In this case, public opinion does not boil down to a single number.
Basically, by comparing many polls, including maybe some that are biased or contain bad information, we could come up with an "average" of how people feel about a given subject. There's promise in that concept.
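
If you wanted to roll your own crude poll average, the simplest version just weights each poll's result by its sample size. A minimal sketch - the poll names and numbers below are invented for illustration, not real survey data:

    # Hypothetical polls on the same question; "support" is the share in favor.
    polls = [
        {"name": "Poll A", "support": 0.47, "n": 1000},
        {"name": "Poll B", "support": 0.52, "n": 800},
        {"name": "Poll C", "support": 0.44, "n": 1200},
    ]

    # Weight each result by its sample size, then average.
    total_n = sum(p["n"] for p in polls)
    weighted = sum(p["support"] * p["n"] for p in polls) / total_n
    print(f"Weighted average support: {weighted:.1%}")  # 47.1%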

The idea that polls aren't very reliable isn't a new one. While doing some research for this post, I came across a newspaper article from 1982 that talks about how different polls can come up with different answers. The article also illustrates another problem with poll reliability.

It discusses two public opinion polls about an unpopular Reagan-era tax increase. They seemed to indicate that the tax measure actually became even more unpopular after Reagan spoke out in support of it. Did the results mean his speech had the paradoxical effect of turning the public against the bill he was supporting? Not necessarily. Evans Witt explains that the two polls used completely different questions that emphasized different aspects of the complex tax bill:
The AP-NBC News poll mentioned the bill's role in cutting the federal deficit and outlined briefly its major provisions. A series of questions in the Post-ABC Poll emphasized Reagan's support for the bill, raised the question of the fairness of the measure and probed whether the bill would cut the deficit.
With that sort of digging and innuendo, it's not surprising more people were against it after Reagan was for it. In addition, the AP-NBC News poll determined first whether people had heard of the tax bill before asking their opinion on the issue, a technique called "screening." Wow. I bet simply asking whether people are aware of the issue they're being asked about could make a big difference in today's polls, too.
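
For the curious, screening is trivial to express in code: throw out the answers of respondents who admit they haven't heard of the issue, then tally what's left. A toy sketch with invented data:

    # Hypothetical respondents, made up purely to show the technique.
    respondents = [
        {"heard_of_bill": True,  "opinion": "support"},
        {"heard_of_bill": False, "opinion": "support"},
        {"heard_of_bill": True,  "opinion": "oppose"},
        {"heard_of_bill": False, "opinion": "oppose"},
        {"heard_of_bill": True,  "opinion": "oppose"},
    ]

    # Screened: only people who have actually heard of the bill.
    screened = [r["opinion"] for r in respondents if r["heard_of_bill"]]
    unscreened = [r["opinion"] for r in respondents]

    print("Screened support:  ", screened.count("support") / len(screened))      # 0.33
    print("Unscreened support:", unscreened.count("support") / len(unscreened))  # 0.40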

When it gets right down to it, unless a poll is about solid facts - such as the U.S. Census population count that's about to get underway - it is completely unreliable. Individual opinion polls probably shouldn't be trusted by themselves. Even the results of a few of them lumped together are suspect, though they're probably a closer representation of opinion. The next time you hear that 60% of Americans are for or against anything, be sure to look at the results with a healthy dose of skepticism. And as always, look at the source. In all likelihood, the pollsters have something to gain by swaying or muddling your opinion.

1 comment:

  1. Would like to hear what you had to say about designing a poll constructively. Are there roles or functions that polls may be used for? How would you design them?

    For instance, using a poll to indicate support for selecting study books, or indicating what people might want to contribute to a project, etc.

    Sincerely interested to hear your observations.
