The threat posed by disinformation to elections is well known. At First Draft, we have monitored false and misleading content that emerged in the lead-up to the 2016 US presidential and 2018 US midterm elections, the 2017 elections in France, the UK and Germany, the Brazilian election in 2018 and the Nigerian, EU Parliamentary and Australian elections in 2019. Each country tends to have slightly different characteristics, with varying platforms hosting the most harmful content. But the content itself is depressingly similar in theme: conspiratorial, divisive, misogynistic, filled with hate, and designed to drive down trust in the integrity of elections.

With our Brazilian election project Comprova, our biggest challenge was monitoring WhatsApp, an encrypted messaging app that is very popular in Brazil and used to share news and information. In the US, our challenge was monitoring anonymous message boards like Reddit, 4chan and Discord to understand how groups coordinated their tactics to manipulate the media as well as social and search platforms. In Nigeria, our most difficult platform to review was Facebook.

Journalists in Canada should prepare for similar tactics to be deployed:

  • Plan for the types of rumours that are likely to spread around particularly polarizing topics like race, religion or immigration.
  • Think about where these rumours might appear: review Facebook groups related to your beat, understand how information circulates on closed messaging apps like WhatsApp and WeChat, and be sure your newsroom establishes policies around how to report on information sourced from anonymous spaces like 4chan and Discord.
  • Think about how this information travels through the information ecosystem: investigate the provenance of content found on anonymous messaging boards, review for signs of manufactured amplification, and understand message coordination by motivated communities like anti-vaxxers.
  • Know that journalists are targets. Agents of disinformation consider it a win when they successfully fool journalists into integrating false content into their reporting, or simply get them to amplify rumours and conspiracies by repeating them “on air.” Make sure your newsroom is properly trained to identify these tactics and knows how to respond effectively.

The “trumpet of amplification” (figure 1) is designed to remind journalists of how disinformation often moves through the information ecosystem. If you see disinformation on Twitter or on the evening broadcast, it is possible that it originated on the anonymous web. The media and politicians are all vulnerable to manipulation as they can provide the reach that agents of disinformation do not have, and a successful campaign will also erode trust in those institutions.

One of the main ways journalists can keep from being manipulated is by ensuring that any rumour or piece of content is thoroughly investigated. Where did it originate? Who created it? When something turns up on Instagram, Facebook or Twitter, journalists need to ask: is that where it started? Is there any evidence of it in conspiracy communities on YouTube or Reddit? Are there conversations about it on the anonymous web? Is there any evidence that this content is being amplified via coordinated networks like closed Twitter or Facebook groups or WhatsApp groups? Journalists probably won’t be able to prove coordination in their reporting, but are they seeing clues that it might be happening?

Agents of disinformation watch techniques that have worked in other countries and mimic the tactics. For example, in the UK, BBC branding was used to advertise the wrong date for the election (figure 2). Next to that image is one from Brazil, where the same technique was employed.

A video from May 2019 showing US House Speaker Nancy Pelosi giving a speech was slowed down and re-shared on Facebook to give the impression that she had been drinking (figure 3).

A month later, the same technique was used against Argentinian Security Minister Patricia Bullrich (figure 4).

And sometimes the very same content is used across borders. A video from Russia (figure 5) has appeared and been debunked in numerous elections across Europe, including during our CrossCheck France election project. In the video, a nurse is assaulted in a hospital. In every country where the video has circulated, it has been presented as local footage and used to stoke fears about immigrants and faltering government healthcare.

While more people are monitoring disinformation now, the task has become much harder as many conversations and coordination have moved into closed spaces, like private groups on Facebook, Facebook Messenger, WhatsApp and other apps like Telegram and WeChat.

Targeted advertising is a significant feature of election campaigning, particularly Facebook ads. I would recommend watching the Netflix documentary The Great Hack, which is a useful explainer of the role of micro-targeted ads and their potential impact on elections. These “dark ads” on Facebook allow campaigns to test messaging on targeted audiences without leaving any trace of the ads on their Facebook page.

One way of knowing what people are seeing is getting the public to install a browser extension that will collect ad information when they scroll through Facebook. The Globe and Mail’s Facebook Ad Collector, (a project first built by US-based news site ProPublica) is doing that in the lead-up to the election.

The other is using the Facebook Ad Library. Canada is the first country where Facebook rolled out its ad transparency tool, and now you can search its advertising database globally for ads about social issues, elections or politics. The database can be useful for tracking how candidates, parties and supporters use Facebook to micro-target voters and to test messaging strategies. Here are two examples of recent ad campaigns by The Conservative Party of Canada and The Liberal Party of Canada.

Unfortunately, it’s not possible to research what groups of people in certain constituencies are seeing in terms of advertisements. You can research by page name (Liberal Party of Canada) or a keyword (guns, for example), which is much better than when there was no Ad Library. Newsrooms should regularly monitor ads from major candidates and supporters and ads around wedge issues. I also recommend that newsrooms use the Facebook Ad Archive API to do keyword searches. Note that attempts by researchers to do this systematically during the EU election largely failed because of bugs with the system.
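For newsrooms that want to run the keyword searches mentioned above programmatically, here is a minimal sketch of how a request to Facebook’s Ad Library API could be constructed. It follows the general shape of the Graph API `ads_archive` endpoint; the access token is a placeholder, and the API version and field names shown are assumptions that should be checked against Facebook’s current documentation before use.

```python
# Build a Facebook Ad Library API request URL for a keyword search.
# The access token is a placeholder; the endpoint version and field
# names are assumptions to verify against the official docs.
from urllib.parse import urlencode


def build_ad_archive_url(search_terms, country="CA",
                         access_token="YOUR_ACCESS_TOKEN",
                         api_version="v4.0"):
    """Return a Graph API ads_archive search URL for the given keyword."""
    params = {
        "search_terms": search_terms,            # e.g. "guns"
        "ad_reached_countries": f"['{country}']",  # ISO country code
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "fields": "page_name,ad_creative_body,ad_delivery_start_time",
        "access_token": access_token,
    }
    return (f"https://graph.facebook.com/{api_version}/ads_archive?"
            + urlencode(params))


url = build_ad_archive_url("guns")
print(url)
```

Fetching the URL (with a valid token) returns a JSON page of matching ads that can be logged daily, which is one way a newsroom could track wedge-issue advertising over the campaign.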

With fewer than six weeks to go before voting, this is the time for newsrooms to prepare for the role disinformation could play in the election. Journalists should be aware of the ways they could be targeted by those trying to push false or misleading content for political gain, or simply by hoaxsters trying to cause trouble for the sake of it. Newsrooms should know how to monitor trending online content related to the election. What rumours and misleading content are audiences seeing in the social spaces where they’re spending their time? They should be trained in effective verification strategies, both in spotting video or image manipulation and in investigating the provenance of content that might have started on anonymous web platforms.

Newsrooms should decide whether they will debunk rumours as a service to their audience or avoid reporting on this type of content. They should have strategies for dealing with unplanned information crises that could impact the election. If there is a last-minute document dump, as there was two days before the French election with #MacronLeaks, newsrooms should know how they would respond.

We have gathered more than enough evidence from elections around the world to know that journalists and newsrooms should prepare for disinformation aimed at disrupting democracy. Disinformation will spread ahead of the Canadian election. It’s impossible to know whether it will impact the final result, but if journalists are unprepared, there’s a much higher chance it will.

This article is part of The media and Canadian elections special feature.

Photo: Shutterstock by Sergey Nivens



Claire Wardle
Claire Wardle leads strategy and research at First Draft. She specializes in misinformation, verification and journalism.

You can republish this Policy Options article, online or in print, under a Creative Commons Attribution licence.
