The app was billed as a way for individuals to alert people when they had come into close contact with someone who had been infected with COVID-19, without collecting personal data. But it required users to enter a one-time key, given to them when they received a positive PCR test result – a cumbersome system. It also almost immediately ran into problems with jurisdictional disputes.
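The single-use code mechanism described above can be sketched in a few lines. This is a hypothetical illustration of the one-time-key pattern, not COVID Alert's actual implementation; the `KeyServer` class, the method names and the code length are all assumptions for the sake of the example.

```python
import secrets

class KeyServer:
    """Hypothetical issuer of one-time keys for people with positive tests."""

    def __init__(self):
        self.issued = set()

    def issue_key(self):
        # A short, human-enterable numeric code (length is illustrative).
        key = "".join(secrets.choice("0123456789") for _ in range(10))
        self.issued.add(key)
        return key

    def redeem_key(self, key):
        # Each key is single-use: valid once, then invalidated.
        if key in self.issued:
            self.issued.remove(key)
            return True  # the app may now upload its exposure keys
        return False

server = KeyServer()
key = server.issue_key()           # handed out with a positive PCR result
assert server.redeem_key(key)      # user types it into the app: accepted
assert not server.redeem_key(key)  # a second use is rejected
```

The friction the article points to lives outside this sketch: the key had to travel from a provincial lab result to a person and then be typed into the app by hand, which is where the system broke down in practice.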
Usage also fell short of expectations. While 6.89 million people had downloaded the app as of Feb. 1, 2022, only 57,704 one-time keys were ever activated.
The app was also plagued with issues surrounding the way the technology was developed and implemented. It’s critical now to have a detailed post-mortem on the entire project so that governments can avoid these problems as they prepare for whatever emergency comes next.
It is always difficult to deploy public technology, such as the COVID Alert app, because such an approach is inherently experimental. As such, public technology requires specific governance protections and frameworks. When governments develop, fund and/or approve vaccines, for example, they require testing, resourcing, production and distribution through certified channels. Health authorities proactively work to prevent snake-oil solutions that can cause people harm. Yet when a government deploys technology with the full weight of its political credibility, there is significantly less infrastructure to develop or vet that deployment, or to maintain its integrity.
According to the final report of the advisory council tasked with overseeing the app, “COVID Alert notifications have helped identify at least 2,446 confirmed cases of COVID-19,” from April 1, 2021 to May 31, 2022. But that number doesn’t tell us whether the app met its goals or whether it was worth the $20 million spent, because the government never defined what the app was supposed to achieve.
More specifically, when the government launched COVID Alert, it did so with a vision for how people might use it, but not how that use would translate into public health outcomes. The government didn’t lay out the desired role and performance metrics, or a management and governance framework that described when, why or how the app would be adapted or shut down.
A prime example of this governance failure was jurisdictional. The provinces and territories are the front lines of testing and test-results management. Yet in the fall of 2020 in Ontario, the process of getting test results from labs to people and into the app was failing. The issue was never resolved.
Toward a full life-cycle-based policy framework for public technology
The hardest part of an emergency policy response is the beginning, a time when urgency is high. In this context, the incentives are all focused on “solutions” – often without the kind of scrutiny or diligence on which we rely to protect us from unanticipated harms. Ultimately, building and maintaining tech is the easy part. The hard part is democratic governance.
To try to organize these governance issues somewhat, we propose the use of a five-step policy framework that borrows from a commonly used technology management approach: the product life-cycle.
1. Technology proposal and due diligence
Due diligence requires a hard look at whether, and when, a proposed technology has been used in the past, and what the outcomes were. This is also the time to define key technical requirements. With COVID Alert, the primary technical requirement was privacy. But privacy superseded efficacy and became a goal unto itself, detached from what should have been the actual goal: public health impact.
Using open-source code – another possible technical requirement, and one COVID Alert met – can support greater external oversight. That helps build trust and lets outside communities contribute to ongoing development and maintenance. In the COVID Alert case, however, there was no openness or transparency around the provincial parts of the process and related systems. To be fully effective, an open-source approach has to be broad, and must apply to governance as well.
2. The “go/no-go” decision

Usually by the time a public technology idea has advanced to the point of a “go/no-go” decision, both the government and the vendor – if the tech is externally sourced – have invested significant amounts of time and are contending with path dependency. Both parties have an incentive to say “go,” regardless of whether that’s what’s best for the public. It’s important to have diverse stakeholders engaged at this stage of the work – stakeholders with the procedural space and cultural support to say “no-go,” even if that means doing nothing. Doing nothing is better than investing heavily in an ineffective technology.
Defined benchmarks for what success and failure look like, as well as an established plan for maintenance and decommissioning, should be policy requirements for a “go” decision. Similarly, when governments launch public technologies, they should have to explain the degree of their own authority that they’re going to exert to enforce use of the tech. A positive feature of the COVID Alert app that was a condition for “go” was that its use was voluntary. This minimized threats of over-enforcement and coercive surveillance. By contrast, ArriveCAN, which is still in use, is a mandatory border control mobile and web app that warrants more attention.
3. Launch

Public communication, modes of redress, accountability and oversight sit at the heart of an effective launch plan. With COVID Alert, there was no path for the public to engage with the app when changes were proposed. The data the government published did little to clarify the app’s public health impact. There was no sustained help for people in Ontario struggling with the one-time keys. The advisory council’s reports went quiet for months, then the council itself was disbanded with no public announcement.
A good launch plan includes clear public guidance about benchmarks, a commitment to persistent, scheduled reporting, and processes for ongoing consent and change. It sets expectations that the technology may change or adapt, explains how that might look and how the public will be engaged in those changes, and specifies when and how the tech will be shut down. Each of these elements also plays a role in building people’s trust in – and power over – a technology, which is a critical element of securing adoption.
4. Maintenance, iteration and review
This is generally the longest phase. Internal management processes have to budget for the constant work of assessing efficacy, troubleshooting and maintaining the tech, including identifying which types of changes would, and wouldn’t, warrant external and public consultation. External oversight processes need to be established that map to and extend these internal processes, so that the public is included and informed.
5. Decommissioning and public post-mortem
The final phase focuses on archiving code, managing and destroying data, and sharing lessons learned. This includes public post-mortems for both successful and failed experiments, framed by inputs provided by the public. A formal post-mortem of COVID Alert should be held now to examine the jurisdictional and operational issues that emerged when the federal government was the business owner for an app that was operational primarily at the provincial level, and how it accepted requirements imposed by Google and Apple that did not map to the Canadian health-care system.
COVID Alert was a public technology rushed past the basic diligence and public accountability that normally accompany the exercise of emergency powers. The government’s decision to decommission it deserves praise, not just because it’s good management but because it creates the opportunity for post-mortem analysis. This was neither the government’s first emergency nor its first use of technology, and the current emergency use of the ArriveCAN border-control app displays the same symptoms of poor governance all over again. Without better governance, the public deployment of technologies in response to future emergencies will just be one disaster after another.