Depending on how you look at it, we're roughly six months or 60 years into the debate over whether and how the government should ensure universal health care for all Americans. And yet if there's one thing polling on the public's opinions about health care makes clear, it's that people are confused, holding a disparate mix of often contradictory views and frequently clinging to incorrect beliefs.
For reporters, there is a clear lesson in this: Put the polls down. Just walk away. Pay them no attention. Pretend they don't exist.
For one thing, whatever you think they mean, there's plenty of evidence to support the opposite interpretation.
For another, there just isn't anything particularly noteworthy in the results. People favor significant reform, think all Americans should have coverage, are concerned about how much it will cost, worry that change could make their own situation worse -- is any of this really surprising? Does anyone actually need a poll to tell them these things?
Put another way: When is the last time you saw a truly surprising poll result? When is the last time you saw poll data that showed that people don't care whether others have health insurance and don't think the government should have any role in health care whatsoever? Or the last time you saw a poll that found people were willing to pay higher taxes and lose the ability to choose their own doctors and cede health care decisions to the government if that's what it takes to get coverage for their neighbors?
Not only is the overwhelming majority of health care polling unsurprising, much of it is essentially meaningless. Take the oft-asked question of whether people approve of President Obama's handling of health care. That's a question the media love to tout -- but what does it mean? Basically nothing. If 60 percent disapprove, what does that tell us? Without knowing how many disapprove because they don't think the government should be involved in health care and how many disapprove because they think Obama should have won passage of a public option by now, the result doesn't tell us anything. Likewise, if 60 percent approve, we don't know why. Is it his emphasis on bipartisanship? His deference to Congress? His advocacy for universal coverage and a public option?
Then there's the truly meaningless. Take a look at this question from a new CNN poll:
If your member of Congress came to your community and held a town hall meeting or some other public forum where voters got a chance to speak, how likely is it that you would attend that event to tell your member of Congress what you think about health care? Would you be very likely, somewhat likely, not very likely, or not likely at all to do that?
Forty-one percent said "very likely" and 30 percent said "somewhat likely" -- but that doesn't tell us anything. Why not? Because we have nothing to compare it to. CNN has apparently never asked a question like this before -- about health care or any other issue. So we don't know whether those numbers are high or low; we don't know what the baseline is.
My suspicion is that if you asked people five questions in a row about, say, education -- an issue that hasn't gotten much attention in quite a while -- and then asked them if they would take the opportunity to tell their member of Congress what they think about education, a large number of respondents would answer affirmatively.
Here's an illustration of the importance of having points of comparison for poll data like this: In February, a CNN poll asked respondents how important it was for the president and Congress to deal with several issues. Eighty-one percent said it was "extremely" or "very" important that they deal with education. Wow, 81 percent! That's huge, right? Well, no. The economy came in at 95 percent, terrorism at 82 percent, health care at 77 percent, Social Security and Medicare at 83 percent, taxes at 76 percent, Iraq at 75 percent, Afghanistan at 76 percent, energy policy at 73 percent ... you get the point.
So when a CNN poll finds that 71 percent of Americans say they're likely to attend a town hall meeting to tell their members of Congress what they think about health care but provides absolutely no other data to measure that result against, it doesn't really have much value at all. It tells us next to nothing.
Now add in the fact that it doesn't tell us how many of those 71 percent want to tell their member of Congress to stop screwing around and pass a public plan, and how many want to tell their member of Congress to keep the government's hands off their health care. It's pretty clear now that that 71 percent figure means much less than it seems, isn't it?
In fact, it means so little that I have a hard time believing it was actually intended to measure anything important. I suspect the sole reason it was included in the poll was so that CNN could include the result in its news reports about angry town hall attendees -- not because anyone thought it would actually be illuminating. It isn't compelling information; it's a prop.
Speaking of angry town hall attendees: Ignore them, too. A dozen people shouting at a town hall meeting -- even a dozen people shouting at each of a hundred town hall meetings -- just doesn't tell us anything meaningful about public opinion. It tells us that there are at least a few thousand angry people, and that they're organized. We already knew that.
Look: Sarah Palin drew big crowds last year -- and a lot of those people were angry. They yelled, they held up nasty signs, and they convinced a lot of the media there was some huge groundswell of opposition to Barack Obama. Then he went out and won North Carolina and Indiana.