As COVID-19 continues to surge, researchers and technologists are mobilizing to apply the best available tools to mitigate the virus’s impact. To date, the most prominent examples of technology-driven solutions during the pandemic have focused on enhancing disease surveillance through contact tracing. These applications present difficult challenges, from potential risks to rights and freedoms to the creation of new powers of state surveillance. Novel approaches to data governance — the way we enforce rules on how information is collected, stored, shared and used — could help manage potential trade-offs in using this technology to help contain COVID-19.

In China, Israel, Singapore and South Korea, governments have taken steps to limit privacy rights and shore up digital surveillance to track COVID-19 infections. These countries are using data from CCTV cameras, cellphones and credit card transactions to implement — and enforce — targeted social distancing measures based on level of risk. Although some question the efficacy of these approaches, real fears about the gravity of the crisis, along with reports highlighting the success of contact tracing in curbing not only the spread of the disease but also the devastating social and economic consequences of lockdowns, have encouraged other countries, including in Europe, the United States, and Canada, to consider similar options.

Recently, Yoshua Bengio, a leading AI scientist (and co-founder of Element AI), outlined a different approach, what he calls “peer-to-peer AI-tracing of COVID-19.” In Bengio’s scenario, no personal information about an individual, such as identity or location data, would be collected. Instead, anonymized information about medical test results or contact with a high-risk individual would be uploaded to a “non-governmental data trust”: an independent third party with a fiduciary obligation to manage the data according to a well-defined charter. Subject to the trust’s approval, the data could be used to build an app that would alert individuals to their risk of being infected based on their proximity to others, enabling them to adjust their social distancing practices in real time.

As apps like Bengio’s prepare for launch, the idea of relying on a data-trust-like structure to govern the development of technology used in COVID-19 responses should be considered more broadly. Entrusting decisions about the collection, sharing and use of data to an intermediary that is accountable to the public could ensure that trade-offs between competing rights, such as balancing privacy and freedom of movement against the right to health, are managed in the public interest. A trust-like structure that is independent from the government could also help lower the risk of creating a host of tracking systems that could be repurposed for population surveillance once the coronavirus crisis ends — a “coronopticon” that would have enduring consequences for our democracies and human rights and freedoms.

Relying on the input of a group of experts, Element AI has previously identified data trusts as a potential tool to reinforce data governance. In the past couple of years, governments, including the United Kingdom’s, and organizations such as the OECD and the G20 have endorsed data-trust-like structures. In Canada, as part of the announcement of its Digital Charter, the federal government recognized data trusts as a way to promote data sharing, including in areas such as health, while enhancing privacy and security.

Chartering a data-trust-like structure for COVID-19 response

As time is of the essence in the current emergency, we should look to existing entities that can play the role of trusted intermediary, rather than establish new institutions. Candidate organizations could include government agencies with expertise in data collection and governance, such as Statistics Canada; Crown corporations, such as the Standards Council of Canada; public health agencies; partner organizations from Indigenous communities; and similar bodies in the provinces. One or several of these entities could be selected to establish the trust.

In terms of governance, the trust itself would be a legal entity managed by a board of trustees that would have fiduciary obligations to the public interest. The data trustees would act as intermediaries who decide the terms of data access and use by both public and private sector actors according to the trust’s charter, a document that sets ground rules for the management of the data held by the trust. The trustees would also have the power to revoke access to the trust’s data and take legal action to safeguard the public interest in the event of noncompliance. At a minimum, the trust’s charter should include respect for the following principles:

Public interest. Data should be accessed, shared and used for a public benefit: namely, to advance COVID-19 emergency response efforts. In doing so, the trust should prioritize the needs of patients, public health authorities, humanitarian groups, medical researchers and practitioners.

Rule of law. Limitations of human rights and freedoms may be justified only if they are temporary, necessary and proportionate to the aim pursued.

Scalability. The trust should be scalable and should facilitate international data sharing. This would enable researchers and other actors across the world to build better models, risk estimators and response strategies.


Representation. The board of trustees should combine relevant expertise and civic representation, including data governance experts, public health authorities, humanitarian and emergency response personnel and civil society groups.

Standardization. In the spirit of the Digital Charter, the trust should apply recognized best practices, including standards on data quality, security, minimization, anonymization and algorithmic fairness, accountability, transparency and explainability.

Transparency. Information about the trust’s activities, including how data are collected, shared and used, should be made publicly available.

Accountability. Accountability could be achieved by publishing audit reports of the efficacy and compliance of digital solutions developed through the trust, with support from third-party assessment bodies, such as the Council of Canadian Academies.

Remedy. The charter should include internal grievance and remediation processes for individuals whose rights have been impacted as a result of an action of the trust, including the operation of an algorithmic system trained on its data. 

Expiration. The charter should include sunset procedures for winding down the operations of the trust at the end of the crisis, which could include deleting some of the data it holds. The trust could live on, however, to ensure that certain information remains available to advance scientific research or to help prepare for future outbreaks.

Emergency situations such as pandemics are delicate times to implement new public policy arrangements. But there should be room to empower existing institutions to apply emerging best practices. Right now, public health and medical professionals are doing just that — whether they are developing new emergency protocols overnight, repurposing used medical equipment for their own and others’ safety or innovating on the fly to expand the number of beds in intensive care units overwhelmed by disease. We can use technology to help; but, in doing so, we should also ensure that all are able to rest easy in a free and democratic society once this crisis is over.

This article is part of The Coronavirus Pandemic: Canada’s Response special feature.

Photo: Shutterstock, by BABAROGA

Philip Dawson
Philip Dawson is lead for public policy at Element AI, a global AI solutions provider headquartered in Montreal, where his policy work focuses on the governance of data and artificial intelligence. He is the co-chair of the Canadian Data Governance Collaborative.

You are welcome to republish this Policy Options article online or in print periodicals, under a Creative Commons/No Derivatives licence.

