People are generally bad at understanding what’s a threat to them. We have lots of research and examples to back that up: people are scared of nuclear power even though it’s statistically much, much safer than energy from coal; or there’s the classic fear of a shark attack — something that’s much less likely to kill you than, say, a vending machine tipping over and crushing you.

With the latest coronavirus outbreak, it looks like we’re poised to repeat the same mistakes we made with Ebola, Zika, SARS and pretty much every other high-profile disease outbreak in recent history. We’re going to panic and have counterproductive reactions like pushing for border closures, which don’t actually work or can even backfire. Public anxiety fuelled by misinformation can lead to the implementation of these kinds of harmful policies, and misinformation about diseases spreads rapidly online: during the 2014 Ebola outbreak, we saw that large segments of the public believed the disease could go airborne at any moment, and in 2016, many thought the Zika virus outbreak was caused by genetically modified mosquitoes.

The problem is that we’re really ill-equipped to deal with this kind of misinformation. How can institutions and individuals work toward more productive responses to threats like the new coronavirus outbreak? What does the research on risk perception and communication suggest?

Past research has suggested that just being exposed to the correct information doesn’t always do the trick. Once someone has made their mind up about something, it’s difficult to change it because “making up our minds” alters the way we process subsequent information. Something confirms your pre-existing beliefs? Seems credible. Something contradicts your stance? It’s inaccurate or untrustworthy. We’ve seen this play out in the long struggle to educate anti-vaxxers about the evidence supporting the efficacy and safety of vaccines. And you probably have lots of examples from everyday life: that stubborn uncle you argue with at Thanksgiving dinner, or that YouTube commenter who insists that NASA fabricated the images of the globe from space and we live on a flat disc flying through the universe.

Part of this mindset has an evolutionary basis. We evolved to spot patterns in data: recognizing the rustle in the tall grass as the presence of a lurking predator, or spotting the tracks of wild game. And we know that evolution doesn’t carve out traits precisely, like a surgeon with a scalpel — it’s more like pulling the lever on a genetic slot machine billions of times, and the things that “work” get passed on because that ancestor was perhaps a bit less likely to die. In other words, natural selection is kind of like the million monkeys on a million typewriters that will eventually, given enough time, produce Shakespeare. Evolution isn’t forward-thinking; it’s the result of random mutations that might lead to beneficial or maladaptive traits, which then influence chances of survival, and then in turn those traits get passed on (or they don’t).

So we don’t have a finely tuned Spidey sense, like Peter Parker’s, that warns us of danger. We have a pretty rough system that is prone to false positives, because it tends to err on the side of spotting patterns that aren’t there, rather than failing to recognize a pattern. For our ancestors, it was better to think they saw a predator that wasn’t there (and waste a bit of energy running away) than to not notice one and get killed.

And now here we are, thousands and thousands of years later, with our not-so-finely tuned danger sense constantly leading us astray. Research shows that we make intuitive judgments about risk. But how are people supposed to intuitively understand the risk of a nuclear power plant? Or the risk of a novel coronavirus?

For one thing, we need to grapple with how complicated and fractured our media landscape has become. Some researchers have a tendency to wag their fingers at the “sensationalist news media” that likes to hype things up to sell papers and attract clicks and shares. But there is no all-encompassing news media, no single evil corporation that meets in a secret boardroom and coordinates how to make people panic while increasing its readership through clickbait. Instead, we have a spectrum of platforms and sources that range from credible and informative to intentionally fear-mongering — and some people never even see the credible stuff. They follow alt-right and conspiracy-minded websites and Twitter personalities. Although people have the tools to be more interconnected than ever, we have “filter bubbles” instead, where many of us see a limited slice of content that aligns with our pre-existing beliefs.

And it’s not just the “unhinged” Alex Jones fans who are buying into this stuff. Research has suggested that we’re also not as good at spotting fake news as we’d like to think, and being exposed to fake news can even create false memories. Remember how the Public Health Agency of Canada was caught lying about SARS back in 2003, and the virus was actually more easily spread than it at first admitted? Well, that didn’t really happen. But if someone sees that headline, there’s a good chance they’ll think it did.

This misinformation can have terrible consequences. With past outbreaks like Ebola, health care workers were attacked and killed; here in Canada, we saw a flood of fear-mongering about immigrants and travellers carrying the disease across our borders. We’re seeing this kind of xenophobia spread with the latest coronavirus, an outbreak of fear and racism that’s in some ways more dangerous than the virus itself.

So how do we handle this mess?

First, health authorities need to use risk communication principles like “anticipatory guidance,” which means being as upfront and transparent as possible and warning people about potential risks and bumps in the road. Officials should be honest about what we don’t know and also tell people that new cases of coronavirus are likely to continue cropping up in new countries. If people can brace themselves for things like that before they happen, they’re less likely to panic when they do happen, because this approach reinforces the idea that health authorities are on top of the situation.

As individuals, we need to keep things in perspective. Seasonal flu killed 80,000 people in the US last year. Each year in the US, at least 2.8 million people get an antibiotic-resistant infection, and more than 35,000 people die. In Canada, influenza causes 500 to 1,500 deaths per year, and antimicrobial-resistant infections contributed to over 14,000 deaths in 2018. We tend to focus on high-profile, novel threats and ignore things that seem more mundane — it’s why someone living in North America might be afraid of a terrorist attack and yet hop into a tanning booth without a second thought. Some research has suggested that just knowing about these psychological quirks of ours when it comes to risk perception, and being aware of the persuasion strategies used by sources of misinformation, can be enough to help us see through the lies or sensationalism.

In the longer term, more hospital beds and legislation that ensures paid sick days can help lessen the impact of a contagious disease outbreak. But in the meantime, communication strategies can help mitigate unproductive panic. There should be direct dialogue about why border closures don’t work, pointing to the extensive evidence and examples of past outbreaks where such measures were ineffective or even placed the international community at greater risk. Xenophobic reactions and racist fear-mongering must be addressed head-on, with messages that aim to defuse these kinds of sentiments and tensions.

Since people often assess risk in an emotional, irrational manner, and since they often do not trust health authorities, how can there be any hope of these messages affecting perceptions in a positive way? One way to improve their impact is to tailor them to different groups, addressing particular risk perceptions and information needs that may vary among concerned parents, front-line health care workers and journalists.

Some research has found that short workshops on science and risk communication can improve how journalists relay information about uncertainty and relative risk. Governments and health authorities should be trying to work closely with prominent media voices and platforms to help guide their reporting toward being more informative and less sensational. As we saw with Ebola, even news coverage that wasn’t technically inaccurate could still lead to misplaced public fears if information wasn’t put in context: for example, by noting the extremely low chance of Ebola going airborne.

Perhaps most important, as we’ve learned from the anti-vaxxer movement, we need to move beyond the “information deficit” approach — which assumes that people are just lacking facts and information and that more communication will lead to positive change — and address the underlying trust gap. This means identifying and reaching out to trusted community leaders (pastors, community health workers and so on) and getting them to help spread the message. Online, this strategy can also help punch through filter bubbles. Some of the tweets from the US Centers for Disease Control and Prevention about Ebola were retweeted by various influencers and celebrities; these retweets were actually shared more widely than the original messages.

And don’t forget to wash your hands. Movies and TV shows about disease outbreaks tend to show scientists dramatically racing for a cure to save a world on the brink of destruction. Sure, it wouldn’t make for a very compelling movie to show the development of a public service announcement about the importance of hand hygiene, even if it were to star Brad Pitt. But that’s the kind of thing that stops diseases from spreading and lowers the individual risk of getting sick.

It’s crucial to get plenty of such solid and accurate messages out there — and make sure people are listening.

Photo: Shutterstock by Antonio Rico



Scott Mitchell
Scott Mitchell is a PhD candidate and instructor at Carleton University. His research focuses on risk communication and public understandings of science.

You may reproduce this Policy Options article online or in a print periodical under a Creative Commons Attribution licence.
