The former Trudeau government’s Artificial Intelligence and Data Act, part of the larger Bill C-27, died when Parliament was dissolved earlier this year for the April election. It has not been reintroduced, leaving Canada lagging behind other countries in addressing important issues of data sovereignty, especially in health care.

These issues are critical and urgent. AI tools are already being used in pilot projects in frontline health services across the country. In some Northern and rural clinics, transcription software is being tested in counselling and wellness sessions.

This is not just a tech issue. It is about sovereignty, health and trust. Unless Canada acts, we risk outsourcing not only our data but also our ability to govern wellness itself.

There are four steps Ottawa should take: prohibit the export of patient and community data without explicit consent; embed Indigenous frameworks in data governance; ensure federal procurement contracts contain strict data-protection measures; and reintroduce and pass C-27 or similar legislation.

Promise vs. practice

Across Canada, health providers are experimenting with AI to ease administrative burdens and speed reporting. These pilot projects often focus on transcription tools, patient monitoring or predictive analytics.

On paper, they promise efficiency. In practice, these tools collect sensitive health conversations, treatment notes and cultural-context data that may sit on Canadian servers but remain available to the supplying companies, which often operate under foreign law.

While many vendors claim that data is stored locally and deleted after a set period, there is no independent guarantee or contingency plan to ensure compliance. In the meantime, precious Canadian health data continues to fuel foreign economies.

In Northern settings, where health departments already struggle to meet complex federal reporting requirements, the appeal of these tools is clear. But without explicit legislation, there is little to stop health data collected in Canada from being owned, monetized or even sold by the foreign-based firms that provide these AI tools.

For mental health, the stakes are particularly high. Notes from therapy sessions or addiction treatment programs are not just data points. They are people’s most vulnerable moments. Once exported, they can be used to train models, design treatment algorithms or build products over which Canadians have no say or control.

The same is true for Indigenous communities. For example, in many Yukon First Nations, health and wellness are inseparable from governance. Clan systems, citizen engagement and land-based healing inform how services are delivered. When transcripts or records tied to these practices leave the community, the community’s sovereignty is compromised.

Indigenous data sovereignty: the litmus test

Indigenous nations have long articulated principles for responsible data governance, best captured in the OCAP® principles (ownership, control, access and possession).

Yet AI pilot projects in health care rarely consider these frameworks. In my work with Northern communities, I have seen how digital tools often arrive with little pre-consultation on governance – a gap that echoes colonial decision-making.

When transcripts of counselling sessions are uploaded to a foreign-owned platform, neither the patient nor the community retains meaningful control. This is not just a privacy breach. It undermines reconciliation and the self-determination of Indigenous nations that have fought for decades to reclaim authority over their knowledge.

If Canada cannot uphold Indigenous data sovereignty in the digital era, what credibility does it have in protecting data sovereignty for any citizen? We should govern everyone’s health data in ways that prioritize trust, transparency and consent.

The global race for rules

Other jurisdictions have recognized the urgency. The European Union’s AI Act sets strict rules for how AI interacts with health data. The United States has issued executive orders aimed at safeguarding sensitive datasets.

By contrast, Canada is stuck in legislative limbo. Evan Solomon, Canada’s first minister of AI, said recently that Bill C-27 is not “gone,” but may come back in a different form.


Meanwhile, the lack of federal action leaves health providers to navigate AI adoption without guidance, while foreign platforms continue to expand their footprint. If we do not act soon, Canadian health data will become just another input in global AI economies, governed by rules we did not write, benefiting companies we do not control.

Here’s what Canada can do:

  • Mandate data residency for health AI tools. Patient and community data must remain in Canada, stored on servers subject to Canadian law. No exports should be permitted without explicit, informed consent.
  • Embed Indigenous governance frameworks. OCAP® should be a baseline requirement, not an afterthought. Indigenous-led governance must be central to how AI interacts with community wellness data.
  • Implement procurement safeguards. Every health contract involving AI should include clauses on data residency, intellectual property, auditability and sovereignty. Governments can set the tone by refusing to buy tools that do not meet these standards.
  • Accelerate legislation with a health-first lens. Parliament should treat AI in health and wellness as a priority area for legislation. Delay is not neutral. It actively erodes Canadian sovereignty.

Ready or not, it’s coming

AI is arriving in health care whether Ottawa is ready or not. In Northern regions, clinics are balancing innovation with risk, Indigenous nations are questioning how their citizens’ stories are handled, and governments are moving faster on procurement than policy.

Every week of delay in passing the necessary federal legislation means more Canadian health data leaving the country, more Indigenous governance ignored and more trust eroded in the systems designed to support wellness.

The question is not whether Canada should regulate AI. It is whether we will do so in time to prevent irreversible losses. Our sovereignty cannot be an afterthought. Neither can our health.

If we want Canadians to trust AI in health, we must show them that their data, their stories, their struggles and their wellness belong to them, not to foreign servers. The time to act is now.



Taryn Ellens

Taryn Ellens is a PhD student in neuroscience and mental health at the University of Alberta and founder of AInome, advancing ethical AI-enabled mental health systems grounded in Indigenous data sovereignty and governance.
