While critics have admonished pollsters and the press for disseminating inaccurate portrayals of public opinion on important national issues, they have generally ignored perhaps the most serious and intractable problem in this endeavour — a poorly informed American public.
Polling ill-informed Americans
Poll after poll has shown that on complex issues like health care reform and raising the debt ceiling, many Americans freely admit that they have an insufficient understanding of the issues. Estimates of the proportion of the ill-informed range from about 40 per cent to over 70 per cent of the population.
Their opinions, because they are unformed, are easily manipulated. Whatever views they do hold are ephemeral and subject to change with new information. The evidence suggests the magnitude of this problem dwarfs the statistical uncertainty of polls as characterized by confidence intervals (e.g., this percentage is accurate to within ±2.5 per cent, 19 times out of 20).
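For a sense of scale: the “19 times out of 20” caveat describes only sampling error. A minimal sketch (illustrative figures only, not drawn from any particular poll) shows how a ±2.5 per cent margin follows from the standard formula for a simple random sample at 95 per cent confidence:

```python
import math

# Margin of error for a simple random sample at 95% confidence:
#   moe = z * sqrt(p * (1 - p) / n)
Z_95 = 1.96   # the "19 times out of 20" confidence level
P = 0.5       # most conservative assumption about the true proportion

def margin_of_error(n: int) -> float:
    """Half-width of the 95% confidence interval, in percentage points."""
    return 100 * Z_95 * math.sqrt(P * (1 - P) / n)

# A typical national poll of ~1,500 respondents yields roughly the
# +/- 2.5-point margin quoted in news reports.
print(f"n=1500: +/-{margin_of_error(1500):.1f} points")
```

Sampling error, in other words, is small and precisely quantifiable. The error introduced by polling respondents who concede they do not understand the issue is neither.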
And yet pollsters persist in asking for their opinions.
Why are pollsters so persistent? Why don’t they simply classify these respondents as having ‘no opinion’?
Pollsters need answers
Because when pollsters are commissioned to poll public opinion (often by press organizations), they need to deliver the goods. Reporting that half the public has no idea about the particulars of health care reform or the debt ceiling is not the headline the press has paid for.
This leaves pollsters with a dilemma. How do they get respondents to proffer an opinion on some topic when they really haven’t thought much about it?
Use of leading questions
They do what good litigators do: they ask leading questions, in which the answer is embedded in the wording of the question itself.
How do they manage that?
The leading or suggestive wording comes from intensive and broad-based media coverage of the political debate in Washington on the topic in question. Media reports, especially on TV, still carry considerable weight with Americans, and the political messages they relay from the partisan debate in Congress tend to find their mark among the public.
This media bombardment conditions ill-informed Americans to absorb the ideological associations that flow out of the partisan debate.
So, for example, a Republican-leaning respondent associates “raising the debt ceiling” with “oppose”, with “increased debt”, and with “spending cuts” if the legislation is to pass. A Democratic-leaning respondent associates “raising the debt ceiling” with “support”, and with “government default” and “financial chaos” if it does not happen. For respondents who know next to nothing about this complex topic, that is all they need to know to answer the poll questions.
Leading questions mislead
But clearly these simplistic associations trivialize the complexity of the issue. Spending cuts would push up an already high unemployment rate, and most Americans, Democrats and Republicans alike, regard high unemployment as far more serious and immediate than debt reduction. How would that knowledge influence their support for, or opposition to, raising the debt ceiling?
Yet another complication stems from the historical record. There have been many previous debt ceiling increases, including seven during the preceding Bush presidency, and none was linked to legislated spending cuts. So why now?
For respondents who know little about the subject matter, finding out about the economic impact or historical pattern of debt ceiling legislation could have significantly influenced them in favor of raising the ceiling.
Explanatory preambles
When polling a complex national issue, it is essential that pollsters include questions with a preamble explaining some aspect of that complexity, and then ask respondents whether they favor or oppose the issue being polled. These preambles can significantly alter the level of public support for a complex issue that for many is opaque.
During the health care debate, some polls found that this type of preamble question increased public support for Mr. Obama’s health care reforms. According to an NBC/Wall Street Journal poll, without a preamble explaining the reforms, only 36 per cent of Americans felt Mr. Obama’s plan was a good idea. With a preamble, 56 per cent said they were in favor of it.
In spite of these merits, such questions are not popular with pollsters because the extra verbiage makes them more demanding for respondents and can affect survey response rates. Pollsters also have to word the preamble carefully so as not to bias responses toward favoring or opposing the issue being polled.
Open-ended questions
Another strategy to consider when polling complex issues is to abandon the traditional structure of leading questions and pre-formatted responses altogether, and rely on open-ended questions that do not use leading phraseology to prompt answers. In effect, let respondents answer in their own words. Gallup took this approach in one of its surveys, and it yielded results that differed markedly from polls using pre-formatted responses.
For example, on the provision in the health care reform legislation requiring all Americans without health insurance to obtain it, a CNN poll using pre-formatted responses showed that 53 per cent of the public opposed it. Without prompting, the Gallup survey found that only 5 per cent opposed this provision.
Pollsters and the press do not favor these open-ended questions, for a number of reasons. The first has to do with the logistics of converting polling data into news stories: each response has to be manually categorized and coded, and for today’s instantaneous news cycle that process is simply too slow and cumbersome.
Second, the coding scheme is subject to some degree of uncertainty because the coding decisions are made by humans. These errors can, however, be kept to a minimum through independent coding verification.
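To make “independent coding verification” concrete, here is a minimal sketch (the categories and labels are hypothetical) of one standard check: two coders label the same open-ended answers independently, and a chance-corrected agreement statistic such as Cohen’s kappa flags unreliable coding for review.

```python
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Chance-corrected agreement between two coders of the same answers."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (observed - expected) / (1 - expected)

# Hypothetical categories assigned independently to ten open-ended answers.
coder_1 = ["cost", "cost", "access", "mandate", "cost",
           "access", "other", "mandate", "cost", "access"]
coder_2 = ["cost", "access", "access", "mandate", "cost",
           "access", "other", "mandate", "cost", "cost"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # ~0.71 here
```

Low agreement is a signal to refine the category definitions, or retrain the coders, before the results are reported.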
Third, and perhaps most importantly, it robs the press of its headlines. Without the prompting effect of pre-formatted responses, open-ended answers tend to be scattered across a number of distinct categories, making them difficult to summarize in a simple, catchy headline.
Focusing on informed respondents
Yet another approach might be to compare poll findings for those who admit they are ill-informed with findings for those who claim they are sufficiently informed. The problem is that, whether the two groups turn out to differ or not, the results may be inconclusive because of the influence of other variables, and a more complex statistical analysis would need to be undertaken. There is also the problem of respondents’ self-assessment: what one person considers well-informed, another might consider ill-informed.
A more relevant and accurate comparison might be one based on behaviour — whether a respondent votes in elections. Comparisons between voters and nonvoters can test the hypothesis that those who vote have a greater familiarity with important national issues. Comparisons can also reveal if support for an issue is substantially different between voters and nonvoters. The rationale behind this approach is similar to that of polls predicting election results by trying to identify those most likely to vote on election day.
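As a sketch of what such a comparison could look like in practice (all figures hypothetical), a standard two-proportion z-test indicates whether a voter/nonvoter gap in support is larger than sampling error alone would produce:

```python
import math

def two_proportion_z(yes_a: int, n_a: int, yes_b: int, n_b: int) -> float:
    """z statistic for the difference in support between two groups."""
    p_a, p_b = yes_a / n_a, yes_b / n_b
    pooled = (yes_a + yes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical split: 55% of 600 voters vs. 45% of 400 nonvoters in favor.
z = two_proportion_z(330, 600, 180, 400)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a real voter/nonvoter gap
```

A clear gap of this kind would support reporting the two groups separately rather than folding them into a single headline number.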
To put it another way, if a segment of the population has such a low sense of civic responsibility that it doesn’t vote, does it really matter what its opinion is on some complex national issue? Ultimately, the political establishment will be judged on its performance by those who vote, not by those who don’t. It therefore makes sense for the press to focus on public opinion that is politically meaningful, rather than confound it with the possibly very different opinions of those who don’t vote and are not political players.
Sacrificing accuracy for excitement and the bottom line
All these strategies place a greater responsibility on the press and pollsters to do a much better job of identifying legitimate public opinion, particularly when a substantial segment of the population professes ignorance of a complex national issue. Rather than cherry-pick questions that create an exciting narrative and, coincidentally, help sell papers, the press needs to make sure it has its public opinion facts right. The same should be the focus for pollsters. Instead, their focus is on technological innovations like robocalling and flash internet surveys drawn from respondent pools, the primary purpose of which seems to be to improve their bottom line.
The current state of affairs in polling public opinion in America is a disservice to the country. Ill-informed Americans are being influenced by self-serving media stories and simplistic, suggestive polling questions to proffer answers that do not reflect their true opinion on the issues. The proliferation of inaccurate public opinion is destructive to national dialogue. Ultimately, it isolates Americans from each other, advancing social division over common purpose.
This article was originally published by iPolitics on October 26, 2011.