Much of what is wrong with political polling occurs at the nexus between the firms that produce the surveys and the media industry. It is worth bearing that in mind as we approach the October 2019 federal election campaign, in which opinion polling will be more central than ever to media coverage.
In Manitoba, we have a recent lesson in what can go wrong. Election day is September 10 in the province. With just over three weeks to go before the final vote, a company called Converso sent a press release announcing a poll to Winnipeg’s major media outlets. The poll had a startling finding: the NDP, it said, was running neck-and-neck with the province’s incumbent Progressive Conservatives, a political phenomenon that hadn’t been seen for many years.
Besides coming from an unfamiliar company based in Toronto, the press release contained none of the usual regional breakdowns and no full explanation of its sampling process or weighting. It declared that the race had “tightened” to a “statistical tie” even though Converso had never before published a public poll on the Manitoba parties. Implicitly, Converso was comparing its results with those of the respected Manitoba firm, Probe, which had last released a poll in the spring showing the PCs well ahead.
Most major Manitoba media outlets, including the city’s leading newspaper, the Winnipeg Free Press, took a pass on the poll. But it was apparently an irresistible story to the Manitoba CBC, which heralded the startling result with few caveats. Within four days of its initial press release, however, Converso acknowledged “weighting issues” and announced the results of a recalculation: the Tories were well ahead of the NDP after all.
For its part, the CBC, which had walked back the story only slowly and incompletely as the doubts mounted, published a piece saying the polling firm had recanted. It barely mentioned its own role in propagating the erroneous information. Phone calls to CBC Manitoba seeking an explanation of its editorial process were not returned.
What happened in Manitoba is not an isolated incident. New digital and automated methodologies have radically reduced the marginal costs of polling. Survey results are often supplied by polling companies for little or nothing. That has proven irresistible to news organizations with dwindling budgets and newsrooms, many of which fail to exercise the routine journalistic scrutiny they would apply to any other information coming their way.
The tragedy is that these journalistic failures undermine the public’s understanding of the genuinely useful information the polls can provide.
Are political polls accurate?
Unlike Manitoba’s elections, which don’t get much polling attention, there is an avalanche of polling in advance of October’s federal election. The National Newswatch portal, which aggregates Canadian political news, almost daily offers a poll charting the latest see-saw of the political parties.
Yet political polling faces a crisis of credibility. I sometimes hear friends, even journalist friends, say, “I never pay any attention to the polls.” This is absurd, or at best, naïve. The narrative created by the polls—who’s rising, who’s falling, who has stalled; who’s liked, who’s disliked—gets embedded in the journalism that constitutes what all of us know about the election in invisible ways.
If Jagmeet Singh’s NDP manages to surge five or six points in the polls, assignment editors will commission personal profiles and make sure reporters follow his campaign tour daily. If his party sinks as much, and the Greens overtake the NDP, the media will leave him for dead. News stories don’t need an explicit reference to the polls to be shaped by them. Moreover, the parties are engaged in sophisticated polling which informs their decisions. In other words, even if the polls were not generally accurate, their place in the media and political ecosystem requires attention.
However, evidence suggests polls usually provide a broadly accurate picture of the electorate, even if it’s not always quite as sharply focused as we would like. In fact, despite all the changes in methodology over the last two decades and the difficulties of reaching some parts of the population—the landline-less young and the Internet-challenged old, for example—polls are about as accurate now as they ever were. (And yes, pollsters do contact cellphones!)
Polls get the winner of elections right about 80 percent of the time in the United States, and although the record is less well studied in Canada, it seems about the same here. Unfortunately, that considerable level of success leads the media and many news consumers to over-rely on them. If you bet your life on the polls, you will die about 20 percent of the time. (By comparison, if you play Russian roulette with a six-shooter, you will die just 17 percent of the time.)
Media and mistakes
Public opinion research companies rely on the income they get from business clients, governments, and political entities, including parties, whose primary interest is understanding political opinion to manage or manipulate it. Horse-race polling, which dominates modern news, is essentially an industrial by-product, which pollsters provide to the media in exchange for publicity.
Even 20 years ago, media organizations paid substantial sums for polling and could command a say in its content and interpretation. (The old Southam newspaper chain actually bought and ran its own polling firm for a while.) But shrinking newsroom budgets and new polling technologies have changed all that. Today, the standard approach to publishing these polls is often barely journalistic. In the jargon of the business, it is “rip and read”: that appears to be what happened at CBC Manitoba.
Last year, I was part of a panel with two other academics, David Zussman, now of the University of Victoria, and Christopher Adams of the University of Manitoba. Commissioned by the Marketing Research and Intelligence Association (MRIA), then the principal organization representing the industry, we were tasked with examining a polling fiasco in the 2017 Calgary municipal election. What we found in Calgary were, in many cases, instances of general pathologies in the conduct and reporting of polls in Canada.
Our observations boiled down to three things: a lack of transparency by polling firms about their methods; a lack of accountability by individual firms, as well as the industry as a whole, for the performance of the polls; and a lack of capacity, and in some cases, a lack of interest, by media to understand, interpret and accurately convey the information polls contain. That included the uncertainty they inherently bear. All of these also played a role in the Manitoba incident and need attention as we approach the federal election.
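That inherent uncertainty has a concrete form. The “margin of error” polls routinely claim is, for an idealized simple random sample, a textbook calculation. A minimal sketch in Python (the function name and sample sizes here are illustrative, not drawn from the report):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Theoretical 95% margin of error for a simple random sample of n
    respondents reporting a proportion p. Real polls, with quotas,
    weighting and non-response, carry more error than this formula admits."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of 1,000 respondents:
print(round(100 * margin_of_error(1000), 1))  # roughly 3.1 points
```

The textbook figure understates real-world error: it captures sampling noise only, not house effects, weighting choices or late swings, which is why tracking firms’ actual records over time matters more than the claimed margin.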
Our report provided an extensive checklist of information that we believe should be released by polling firms when a poll is published. It recommended that individual firms as well as the industry as a whole be tracked for accuracy over time, an exercise that would capture much more than the theoretical “margin of error” polls now routinely claim. And the report made a remarkably simple recommendation to the media:
News organizations should adopt as part of their journalistic codes a requirement that reporting on polls be subject to the routine critical practices of journalists. These should include an assessment of the track record of the polling company, the funding of the poll, potential conflicts of interest, a consideration of the methodology, the existence of conflicting or contrary evidence available either in other polls or through other forms of reporting. It is important that no exception to these critical practices be made in the case of exclusively obtained polls.
In other words, do your job.
As the report suggested, the issues are often the most acute when a news outlet gets exclusive access to a poll, which it can treat as a kind of “scoop.” When was the last time you saw a media story including exclusively obtained polling data which raised any questions about the reliability of the polling firm or its methods, or its consistency with other available information? I don’t believe I have ever read a reference in the media to a polling firm’s client list, for example, which might speak to potential conflicts of interest. Often polls are attractive to media organizations precisely because they seem to offer a tantalizingly effortless story that obviates the normally laborious processes of critical journalism.
The media need to help voters make sense of polling, sift through the different methodologies and come to terms with the meaning of inherently uncertain statistical information that most voters—and, truth be told, most journalists—struggle to understand and digest.
Polls should never be published without context. Polling firms deeply resent the work of polling aggregators such as Nate Silver’s Fivethirtyeight.com in the United States and CBC’s Poll Tracker here in Canada because they are, in a sense, parasitical, relying on the fruits of polling firms’ work while bearing none of their costs. But polling aggregates are more reliable prognosticators than the work of individual firms. While an individual poll may prove to be closer to the mark than any aggregate once ballots are cast, it is difficult to predict in advance which poll that might be. A polling firm that is extremely accurate in one election may be less so in the next.
An additional problem in reporting polls during elections arises from a timing issue. Shifts in sentiment during election campaigns can be swift, while media reaction is slower. Polls are, by their nature, backward-looking. The poll you see online this morning is likely at least 36 hours old, and some of its respondents may have been answering the questions four or five days earlier. If the results are interesting, it will take at least another day or two for them to be digested and regurgitated by columnists, and only then do they start seeping into the stories commissioned by assignment editors.
This time lag is significant because politically, Canada is less like the US, where partisanship is extremely high, and more like Europe, where voter volatility has dramatically risen. Emmanuel Macron swept the French presidency and won a parliamentary majority in 2017 with a party that had not existed in the previous election. Something similar happened with the Brexit Party in the United Kingdom at the recent European elections.
In the last 30 years, no fewer than eight political parties have been represented in federal leaders’ debates in Canada. In the 2011 election, Jack Layton’s NDP campaign was viewed as moribund in the early weeks, and the so-called Orange Wave surfaced in the polls with just two weeks left. In 2015, Justin Trudeau’s Liberals began the campaign in third place. In fact, when the polls are collectively wrong, it is often because, as in Alberta in 2012 when the polls predicted a win by the Wildrose Party, there is a late shift in sentiment.
But this very volatility is paradoxically what can make the polls interesting and useful. Nowadays, the parties have much more granular data about voters than published polls provide. Nonetheless, seeing through the polls that a party is sagging in Quebec or British Columbia, or breaking through among women or rural voters, helps us make sense of their campaigns, their advertising and their policies. Polls are an analytical tool that enables those of us in the media, academia, or simply the interested electorate to make sense of party strategies. More significantly, in a universe of multiple parties, voters use the polls as guides to achieve the ends they seek through strategic (or more properly, tactical) voting.
The polls are obviously an imperfect vessel for informing such important decisions. But the alternative, in an era when the parties are micro-targeting voters, is to leave those voters unarmed in making their most powerful democratic choice. Journalists, however, should stop treating horse-race polls as “easy copy”—stories that can be thrown up on a website without much time, effort or thought. Reporting polls well—thinking critically about them and communicating the signals they contain in a language the public can understand—is at least as challenging as any other form of political journalism.
This article is part of The media and Canadian elections special feature.