"A threat to the entire world." This is how Margaret Chan, Director-General of the World Health Organization, has characterized the MERS coronavirus. Until this year, the virus had been largely confined to Saudi Arabia, where it was first reported in 2012. Recently, however, it travelled to South Korea, where it sparked the largest-ever outbreak outside the Middle East, infecting more than 180 people, killing 33 and sending thousands into quarantine.

Even in the epicentres of current MERS outbreaks, people are less likely to die from the disease than in car accidents. Yet most people feel safe driving a car because it’s something they have normalized and can control. But amidst saturation media coverage and intense social media discussion, fear spread across South Korea, leading to unnecessary school closures across the country and a surge in face mask sales.

The communication challenge facing South Korea's Health Minister Moon Hyungpyo played out during a May 31 press conference in Seoul. As cases continued to climb and fear of the disease threatened to envelop the nation, the Minister explained that the key to managing the threat lay in preventing new infections, and that this would require all hands on deck, with everyone working together. He made a direct plea: "In order to [stop this outbreak], we need cooperation from all health care workers and citizens."

A crucial message, promoting a shared, collective response. But questions from the media and other observers quickly focused on exactly which health facilities were implicated and on the likelihood of further disease spread. In an era of heightened demand for transparency, active listening and two-way communication of disease risk, these are questions Ministry officials would surely have anticipated and should have been prepared to answer. Yet in this case the Health Minister explained that established protocols prevented him from identifying the affected facilities, in order to avoid social stigmatization and undue financial loss.

On the one hand, officials were calling for everyone to pull together to manage the risk; on the other, they were concealing the very information that would help people do so, in the name of protecting the public. This was a stunning inconsistency in messaging. It threatened to undermine already fragile public trust, fuelled so-called MERS ghost stories and generated accusations of a cover-up. The government eventually reversed its decision, but the damage was done. To be fair, however, the stigma dilemma is hardly South Korea's alone.

Stigma is real, and in cases of disease risk it can pose a serious threat, not just to individuals but to public health as well. As the sociologist Erving Goffman argued, the experience of discrimination associated with stigmatizing practices can produce intense psychological distress. The anti-social, mean-spirited and counter-productive behaviours associated with Ebola in West Africa, for example, where family members hid their sick relatives to avoid being ostracized by the community, represent a recent and intense example of stigma during high-risk events. Stigma is endemic to situations serious enough to produce real fear, and it can generate psychological harm and increase risks to population health.

But if a situation is serious enough, the key to managing it lies in achieving the risk management ideal: everyone working together towards a common goal, as the Korean official response suggested. For this to work, however, risk communication depends crucially on the coordination of information, and the key to coordination is trust. This means that the open sharing of information – what is known, what is uncertain, and on what basis decisions are being made – has to be seen as a vital element of public health practice and risk management. The kind of real-time transparency that South Korean officials seemed unable to put into practice is necessary for ownership of the response, for promoting public trust and for positive behavioural change.

So authorities have a duty to try to avoid the threat-related stigma that information disclosure can create, but they also have a response imperative to ensure open and transparent communication. The dilemma lies in determining how to strike the balance, and which of these takes priority. Like so many aspects of the practical challenge of communicating risk, the answer is: it depends.

It depends on the assessed level of risk and on the assessed level of risk perception, but ultimately it depends on the adaptive skill of organizational leaders in knowing when the transparency imperative supersedes the threat of stigma. It is thus both a strategic and an ethical dilemma: when does the need to protect the many override the need to protect the few? When does the public's "right to know" override other institutional needs and imperatives? Like most ethical dilemmas, the answers don't come easily.

When the social, economic and political stakes are high, and circumstances are rapidly evolving, decision makers invariably find making the right risk communication choices incredibly challenging. One simple preparedness step is to adopt a basic risk communication algorithm to help guide decision-making groups.

In high-risk or high-risk-perception scenarios, decision makers should ask themselves the following questions when deciding whether or not to release a given piece of information (a simple sketch of this decision flow follows the list):

  1. Is the information needed to allow people to protect themselves?

If the answer is yes, that information must be released.

If the answer is no, then ask:

  2. Is the information needed to maintain trust, for example, explaining the basis for decision making?

If the answer is yes, that information should be released.

If the answer is no, then ask:

  3. Is there a compelling reason – such as the potential for stigmatization or the need to maintain the integrity of a criminal investigation – to withhold this information?

If the answer is yes, then there may be justification for withholding the information, but if at any point circumstances change, ensuring that people can protect themselves and maintaining trust have to take priority.
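For readers who want to see the checklist's logic laid out end to end, here is a minimal sketch in Python. It is illustrative only: the function name, the answer categories and the default of releasing information when none of the three questions applies are assumptions introduced for this example, not anything prescribed above.

```python
from enum import Enum, auto

class Decision(Enum):
    MUST_RELEASE = auto()        # needed so people can protect themselves
    SHOULD_RELEASE = auto()      # needed to maintain trust
    MAY_WITHHOLD = auto()        # compelling reason to withhold; revisit if circumstances change
    RELEASE_BY_DEFAULT = auto()  # assumption: no reason identified, so default to transparency

def risk_communication_decision(
    needed_for_protection: bool,
    needed_for_trust: bool,
    compelling_reason_to_withhold: bool,
) -> Decision:
    """Apply the three-question checklist to one piece of information.

    The question order mirrors the list above: protection first, trust second,
    and only then any compelling reason (such as stigma or a criminal
    investigation) to withhold. This is an illustrative sketch, not an
    official tool.
    """
    if needed_for_protection:
        return Decision.MUST_RELEASE
    if needed_for_trust:
        return Decision.SHOULD_RELEASE
    if compelling_reason_to_withhold:
        # Withholding may be justified, but the decision must be revisited
        # whenever circumstances change: protection and trust take priority.
        return Decision.MAY_WITHHOLD
    return Decision.RELEASE_BY_DEFAULT

# Example: information that does not help people protect themselves but does
# explain the basis for decision-making is still released, even when a
# reason to withhold exists, because the trust question is asked first.
print(risk_communication_decision(
    needed_for_protection=False,
    needed_for_trust=True,
    compelling_reason_to_withhold=True,
))  # Decision.SHOULD_RELEASE
```

The value of such a sketch is not the code itself but the ordering it enforces: protection first, trust second, and only then any case for withholding.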

In the world of high-risk event preparedness, dominated by a culture searching for ways to minimize indecision, "it depends" could be seen as adding too much analytical noise to decisions that often require immediate and deliberate attention. But if success in managing the stigma dilemma ultimately rests on the judgement of decision makers, adopting a basic risk communication algorithm may not provide the right risk communication answers; it will, however, force consideration of the right risk communication questions.

Photo by Republic of Korea / CC BY-SA 2.0 / modified from original



John Rainford
John Rainford is the Director of The Warning Project. He is the former Director, Emergency and Risk Communications for Health Canada and Global Project Lead, Risk Communication Capacity Building for the World Health Organization.
