In 2019, an election year federally and in Alberta, what if we could all resolve to be better consumers of online content? Could we train to become sommeliers of social media news shares, carefully discerning the provenance of a piece and determining whether it is authentic or just swill — or even poison?

The implications of not taking more care about what we absorb and then distribute online can’t be overstated. Information is being used as a weapon not just against parties and politicians but also against our sense of trust in institutions and our social harmony.

A report by the Oxford Internet Institute’s Computational Propaganda Research project, released in mid-December, said Russia used social media posts to suppress the African-American and Hispanic vote during the 2016 American elections. Russia’s Internet Research Agency used the segmentation of advertising markets offered by social media platforms such as Facebook to tell those voters that they should boycott the election. It encouraged right-wing voters to vote for Trump and shared posts with them designed to kick up anger around minorities and immigration. The Russians also sought to drive a wedge among liberal voters, trying to “reduce trust in the political system.”

The Russian attack also spread “sensationalist, conspiratorial, and other forms of junk political news and misinformation.” If that doesn’t make you queasy enough, the Russians didn’t stop their online activity once they had been caught.

If you’re still not convinced that these campaigns are a threat in gentle and polite Canada, consider that during the 2018 Swedish election, 22 percent of news content shared online with political hashtags was “junk news,” defined as deliberately misleading, deceptive or incorrect information.

In Mexico, the team of journalists behind Verificado monitored the misinformation circulating during that country’s recent presidential election on popular social media platforms such as WhatsApp, Facebook and Twitter, and shared as news items. Some of the cases were outright false stories about candidates, others misrepresentations of photographs. In one case, a video was manipulated and then presented on social media as evidence that presidential candidate Andrés Manuel López Obrador (now the President) had refused an interview because he was drunk. Analysts have suggested that the majority of the misinformation in Mexico came from domestic actors, not from Russia.

These acts are not harmless. Beyond potentially swaying the results of an election, and poisoning our democratic process, they can also create dangerous tears in our social fabric. The idea is to polarize us and make us angry and distrustful.

“With all of the attention to ‘Nothing is true, and nothing is real, and everything is biased,’…our worry is that we’re shifting to an ‘I don’t believe anything’ culture,” says Kathryn Ann Hill, executive director of MediaSmarts, a not-for-profit organization that promotes digital and media literacy. “That’s not a good thing because it’s a clear road to apathy, feelings of a lack of ability to have any investment in our political system or our electoral system — it’s a bad thing for democracy.”

Sure, we can look to our leaders and public servants to do something about this. Elections Canada, for example, has said it will be using artificial intelligence to try to stamp out as much disinformation about the electoral process as possible. The agency is also consulting with other countries to find out what they are doing. France passed a law against misinformation this past summer that would allow content to be removed from the Internet after a quick judicial review. The legislation has been criticized as infringing on free speech.

The Public Policy Forum, in an August 2018 report on disinformation, recommended the creation of a “nimble organization outside of government for ongoing and long-term monitoring, research and policy development” around the issue. It also called for a legal requirement that all digital producers and disseminators of content identify themselves and their beneficial owners clearly on their platforms.

But we as citizens also have an important role to play. If only we could regard the triage of online content as something we do as routinely as separating the plastics from the paper for recycling.

It’s not going to be easy. A recent study published in the journal Intelligence linked susceptibility to misinformation to cognitive ability — something that wanes as we get older. In a December 2016 survey by the Pew Research Center, 23 percent of respondents said they had shared misinformation online, either deliberately or unwittingly.

Political scientist Thierry Giasson, the lead researcher at the Research Group on Political Communication at Université Laval, recently convened experts on media education at a conference in Montreal. The goal was to answer some key questions about news literacy, media education and citizenship, and ultimately to produce a white paper for the Quebec government on expanding media literacy into the curriculum as a stand-alone area of instruction. The Canadian experts brought together for the conference hope to create a network that is focused on the issue.

Giasson points to the “30 seconds” campaign by the Fédération Professionnelle des Journalistes du Québec, which urges people to take 30 seconds to read a piece of online content before sharing it. “Look at the source: where is this coming from? Usually a source is clearly identified. Is it a legitimate news organization?” says Giasson.

“You need to check it out before you share it. Where is the link taking you? Is it taking you to the original source or to another website? If you doubt for a single second that it’s not legitimate, don’t share it.”

MediaSmarts has developed a range of resources for the public and for educators on authenticating information online. Says Hill, “Check the original source. Don’t assume it’s true because a lot of people shared it, or it’s going viral on social media…or it’s the first result that came up in your search engine. People assume that’s a ranking, and it’s not.”

Plenty of us feel indignant when we get calls from telephone scam artists claiming to work for the bank or Windows or the Canada Revenue Agency. How dare they try to pull one over on me! But we’re not yet angry or smart enough about the foreign and domestic players who are trying to distort our democratic process and just make everything we trust feel wobbly.

Our New Year’s resolution as citizens should be to declare ourselves the first line of defence against the weaponization of lies.




Jennifer Ditchburn
Jennifer Ditchburn is President and CEO of the Institute for Research on Public Policy. From 2016 to 2021, she was editor-in-chief of Policy Options, the IRPP’s influential digital magazine. Jennifer worked for more than 20 years as a national reporter with The Canadian Press and at SRC/CBC. She co-edited, with Graham Fox, the 2016 volume The Harper Factor: Assessing a Prime Minister’s Policy Legacy (McGill-Queen’s).

You may reproduce this Policy Options article online or in a print periodical under a Creative Commons Attribution licence.
