One of the best-known Greek myths tells the story of Cassandra, Princess of Troy, who was granted the power to foretell the future by the god Apollo in exchange for her love. Apollo later turned against Cassandra when she refused to comply with his advances, and as punishment for her refusal, he ensured that none of her prophecies would be believed. When she alerted the Trojans that the Greek army was hidden in the wooden horse, nobody listened. Her warning that Troy faced imminent ruin, while accurate, was all but ignored. The outcome is well known.

The Cassandra myth has long been used to highlight the communication challenges facing the emergency response community. In times of disaster, authorities must express the urgency of a situation in clear and effective terms so that citizens and involved organizations can take action to reduce harm. Yet the degree to which such warnings will be heard, believed and acted upon depends in large part on the credibility of the source. If the public does not consider the source to be reliable, the warnings, however accurate, will go unheeded. This ongoing challenge — alerting the public to threats while maintaining credibility — was laid bare during the recent false emergency warning in Hawaii, when the state’s emergency broadcast system mistakenly sent the following alert shortly after 8 a.m. on Saturday, January 13: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

It would take more than 30 minutes for authorities to issue a correction.

The false alarm dominated the international news for days and generated an astounding volume of first-person accounts from citizens contemplating their imminent doom. Some people texted loved ones, others prayed, and many helped children into storm drains to seek shelter. A story published in the Toronto Star recounted how some Canadian tourists responded to the scare:

They had 10 minutes to live.

So they used it well: they dressed, packed up some water and their medications, sent “I love you” texts to bewildered family back home…They hugged each other and drank two shots of vodka.

Then they waited for the ballistic missile to hit.

Security camera footage released later in the day showed images of university students racing for shelter in the seconds following the alert. Other footage provided similar pictures of panic-stricken tourists and surfers running from the beaches. Hawaii State Representative Matt LoPresti would recount his own experience to CNN, his voice still shaking: “I was sitting in the bathtub with my children saying our prayers 
 ‘Do you remember where Daddy put all the emergency supplies?’”

The narrative of imminent disaster recalls 1950s bomb shelters and doomsday preppers in the Tennessee hills, and reminds us that we still rely on our governments to alert us to threats.

Once the false alarm was over, senior officials swiftly organized a series of news conferences, empathizing with citizens about the emotional toll of the error and promising to hold those responsible to account. Some leaders went further and noted that such mistakes undermine the effectiveness of any future warnings.

Indeed, the “cry wolf” phenomenon should not be dismissed. The US Homeland Security terrorism alert system was originally conceived as a powerful tool of emergency risk communication, using a familiar colour-coded system running from green (low threat) to red (severe) and giving all sectors of society constructive things to do at each alert stage. But it was eventually abandoned because barely anyone — among the public or even in the emergency response community — paid attention to it. Why? Because it remained at a high or elevated level of threat continuously for years. If something is always a threat, eventually we adapt and, after a time, it’s not perceived to be a threat anymore.


Risk perception is a cornerstone of preparedness because a person’s emotional response to a perceived hazard often drives positive changes in behaviour. While we may consider ourselves rational, if truth be told, gut reactions to frightening or uncertain situations are a more powerful motivator than any amount of calculation or reasoning we can muster.

With the Hawaiian false alarm, there was a massive emotional impact because of the possibility of an armed attack, particularly from North Korea, which had recently claimed to have tested a new type of intercontinental ballistic missile capable of striking the US. Hawaiians endured what must have felt like, for some, the end of days.

Of course, the state’s authorities must apologize for the emotional trauma this event caused. While fear can be a powerful behavioural motivator, emergency management officials must be cautious to not frighten the public unnecessarily. Fear-based communication can easily backfire, prompting citizens to engage in behaviours that undermine their safety. Further, the system must develop safeguards to prevent similar false alarms from occurring in the future.

It’s easy to imagine that authorities are hoping the entire episode will be forgotten, but that would be a mistake. In fact, even as their credibility has come under fire, now is the time to act.

By chance, rather than design, a golden opportunity to increase emergency preparedness has presented itself. Involved organizations should seize this opportunity to respectfully tap into the intense emotion this warning error has generated. Concern about emergencies is surprisingly rare. When it exists and emerges in the public discourse, officials too often fail to put it to good use. If authorities truly are concerned that the public and its response partners are not ready for a potential attack or other serious emergency, they should identify ways of converting this awkward and embarrassing event into a teachable moment.

Photo: Shutterstock, by Fiona Lin.


John Rainford
John Rainford is the Director of The Warning Project. He is the former Director, Emergency and Risk Communications for Health Canada and Global Project Lead, Risk Communication Capacity Building for the World Health Organization.
Joshua Greenberg
Josh Greenberg, PhD, is a professor of communication and media studies in the School of Journalism and Communication at Carleton University. His research expertise is in the area of crisis and health risk communication. He has collaborated with and provided expert guidance and advice to the World Health Organization, Transportation Research Board (US), Council of Canadian Academies, and Public Health Agency of Canada, among others.

You may reproduce this Policy Options article online or in a print periodical, under a Creative Commons Attribution licence.
