Bill C-11 – which brings online platforms such as YouTube, Netflix, Apple TV, TikTok and Instagram within the scope of the Broadcasting Act of 1991 – raises important questions about Canada’s sovereignty over cultural policy. The dissemination of media content through online platforms also raises broader questions of data governance and control over our own personal data.

These issues follow decades of the federal government doing little to keep the Broadcasting Act current while online giants progressively expanded their power. American legal scholar Julie Cohen notes that the size of these platforms means they often share the attributes of the most powerful states in terms of population, territory and network capacity, as well as in the shaping of news, ideas, preferences, laws and norms.

However, these online giants are not democracies. They do not exercise these powers through established legislative and administrative processes including checks and balances. Instead, they operate with little transparency or accountability and take advantage of weak privacy laws to encroach on users’ personal data.

There are many valid concerns about the scope of Bill C-11 to regulate these online “undertakings.” For example, what distinguishes “undertakings” from users generating their own content? When is content a “program” subject to regulation? What is “Canadian content”? How will discoverability requirements serve the interests of Canadian content creators, viewers and our cultural policy?

But at the very least, these questions are up for public debate.

Once the act is in force, its implementation will remain subject to public scrutiny through the Canadian Radio-television and Telecommunications Commission (CRTC), the judiciary and the application of the Canadian Charter of Rights and Freedoms along with other laws, as imperfect as these frameworks might be.

Recently, Jacqueline McLeod Rogers and I analysed the expansion of broadcasting regulation to online undertakings (the main purpose of Bill C-11) as a competition between sovereignties: that of the state, of the online platform and of individuals over their own personal data. We concluded that some form of regulation and oversight of online platforms is needed.

This falls under Canada’s jurisdictional powers and is arguably required by the principle of technological neutrality. Expanding broadcasting regulation to online undertakings is also consistent with national cultural sovereignty, which has underpinned successive iterations of our national broadcasting regulation as a single system for more than 50 years.

We also concluded that, with the rapid convergence of broadcasting and telecommunications, any future regulation must include greater transparency, accountability and protection. This applies to the handling of viewers’ personal data as well as to media content data governance, and it should be treated as a matter of public interest and of national cultural policy and sovereignty. Unfortunately, however, the handling of personal data by media – and big-data governance by broadcasting undertakings more generally – has been overlooked in the debates on Bill C-11.

Protection of viewers’ personal data

Shoshana Zuboff, the Charles Edward Wilson professor emerita at Harvard Business School, depicts the digital world as one marked by a loss of boundaries. This leaves our personal information vulnerable to prying and lets corporate marauders move at will. Online undertakings offer targeted menus of media services, devised to curtail viewer options and maximize corporate profits.


We currently navigate through and select from these targeted offerings with little state protection. To paraphrase Ronald Deibert, director of the Citizen Lab at the Munk School of Global Affairs and Public Policy, we are rendered vulnerable to attack, without fortress, gate or barrier against incursions: anyone can watch us, with all our guards down.

In a 2019 paper, Robert Hunt and Fenwick McKelvey suggest that algorithmic personalization of media content (generated from the handling of our personal data) is a form of cultural policy – in other words, the management of “cultural expression through code.” It is also a form of curation, a core function of traditional broadcasting that is subject to the Broadcasting Act’s framework and policy goals.

Yet personalization diverges from traditional broadcasting curation in important ways. As media researcher Tanya Kant points out, personalization of media content amounts to a form of narrowcasting.

Traditional broadcasting offers the same programming to all viewers. System-initiated personalization, by contrast, changes the discoverability of offerings in ways that are not necessarily transparent to viewers. It may therefore compromise their sovereignty: rather than solely serving viewers’ autonomy and empowerment, it can surreptitiously steer their choices to serve the platform’s interests in ways that are manipulative and deceptive.

Ensuring tight control over the extraction of personal data related to media content consumption is even more critical than for other online e-commerce transactions. Media content regulation therefore needs to attend not only to content and how it is produced and disseminated, but also to what happens behind the scenes while viewers are watching or listening.

Media content data governance

Canada’s data sovereignty needs to be reassessed in light of the intensification of the use of personal data of media content viewers. At the infrastructure level, does it serve Canada’s broadcasting policy to have the CBC/Radio-Canada use YouTube as a default platform for its online programming? What agreement is in place to ensure the interests of Canadian viewers are protected and in line with Canada’s broadcasting law? These concerns have been overlooked in the ongoing debates around Bill C-11.

Incremental changes to a legacy legal framework

The broadcasting legislative reform currently underway is not a complete overhaul of the existing regime. It makes incremental changes to existing legal frameworks without disrupting their base, because traditional media are never completely erased by new media.

Bill C-11 makes incremental changes by explicitly bringing online undertakings under the act’s umbrella. Two fundamental changes are the requirement that online platforms fund the creation of Canadian content in an equitable manner, and discoverability requirements for Canadian content in the services they offer in Canada. These requirements are nowhere near the content regulatory requirements that apply to traditional broadcasting undertakings under the Broadcasting Act.

In their current and proposed form, the regulatory powers of oversight over media content are a soft yet important exercise of cultural sovereignty. These powers are meant to ensure respect for minimal quality standards of news; adequate regional representation; and the specificity and promotion of Indigenous Peoples and languages, the French language, French and English linguistic minorities, racialized communities, people with disabilities, and LGBTQIA+ voices. To the extent that national cultural sovereignty, public interest, technological neutrality, privacy and personal data protection matter, a lot more can – and should – be done.

Do you have something to say about the article you just read? Be part of the Policy Options discussion, and send in your own submission, or a letter to the editor. 
Pascale Chapdelaine
Pascale Chapdelaine is an associate professor in the faculty of law at the University of Windsor.

You are welcome to republish this Policy Options article online or in print periodicals, under a Creative Commons/No Derivatives licence.
