In May, Ofcom, the U.K.’s communications regulator, proposed a package of measures to implement the Online Safety Act aimed at improving children’s safety on social media.

These proposals offer important insights for Canada as Parliament considers the government’s Online Harms Act, which would establish a digital safety commission charged with promulgating regulations related to “privacy settings for children and other age-appropriate design features.”

Ofcom’s proposal includes practical measures that can help create safer digital environments. For instance, there are product design and service rules requiring tech companies operating online services to:

  • reduce the prominence of problematic content on children’s recommendation feeds;
  • ensure prominent display of easy-to-use moderation tools that allow negative feedback;
  • implement customer-service systems designed to swiftly take action against flagged content and respond to user inquiries;
  • appropriately train, staff and provide resources for moderation teams;
  • appoint a person accountable for complying with children’s safety duties.

These are reasonable regulations. However, another measure in the proposed U.K. package goes a step too far – mandating age assurance to access online services that may contain dangerous, pornographic, violent or bullying content, or content that promotes eating disorders.

Age assurance means requiring a government ID or other method, such as facial age estimation or the use of digital identity services, to sign up for a social media account and access a variety of online services.

Beyond concerns about access to information and the subjectivity of deciding what falls into these categories of content, there are two issues here: privacy and user experience.

For these reasons, the Canadian Parliament should amend the Online Harms Act before final approval to bar the proposed digital safety commission from writing regulations that mandate age assurance to access online content.

What’s at stake

There are two big privacy issues related to requiring age assurance to use online services.

First, age-assurance requirements put platforms in the position of processing and possibly retaining sensitive, personally identifiable data about their users.

If this involves using a government-issued ID to verify a user’s age, the often-stated analogy of requiring a retail store cashier to check a customer’s ID when purchasing an adult product fails.

When I worked in retail, I checked hundreds of IDs to ensure compliance with self-regulatory standards around M-rated (mature) video games. However, those were one-offs that never included me entering sensitive data such as a customer’s home address and date of birth into a corporate database.

Other age-assurance methods, such as facial or algorithmic estimation, come with severe trade-offs between privacy and safety from online harm. As one privacy analysis notes, such methods can be based on data points including one’s browsing history, voice, gait or other device signals. That is a lot of data that one must share to simply access online services.

Second, government-mandated age assurance to access online services forecloses the possibility of using digital spaces anonymously for collective action.

While anonymity may grant some abusers a licence to engage in online harassment, it also fosters a culture of free expression where individuals can challenge political power or find a community online.

This is a foundational principle separating democracies from their authoritarian counterparts. China, for example, has rules requiring all players of digital games, regardless of age, to register online accounts under their legal names, verified by government-issued identification.

Online video games are increasingly important avenues for political mobilization because virtual spaces may serve as useful protest tools when authoritarian governments crack down on physical demonstrations.

In democracies, we should abandon anonymity only in exceptional circumstances and remain skeptical of measures that further online surveillance while adding barriers to access information.

The other issue is the friction this would add to consumers’ online experience. Most adults will not want to dig out their ID every time they or their children need to create an account for a new app or online service.

If you think digital cookies are annoying, just wait until you must scan your driver’s licence or link to your preferred digital identity service before accessing a website.

Moving forward

We know that children are engaging with a range of online services, encountering both educational and problematic content. We can work to foster a safer online experience while doing our best to protect personal information.

A potential solution is to not enact the age-assurance requirement while moving forward with the other practical operational and product design measures outlined above. Then, we can research the effect of such measures on creating safer products and services to determine if further action is needed.

If we must go the way of age assurance – a trend seen in other jurisdictions such as California – we should emulate Apple Pay’s encrypted, on-device system for verifying and storing sensitive information.

It makes much more sense in terms of privacy and user experience to have consumers verify their identity once when setting up a new device or creating an app store account. Yes, honeypots would still exist, but they would be less numerous, contain only partial data and would ideally be guarded by robust cybersecurity teams.

As Canada moves forward with the Online Harms Act, the country has the chance to lead on age-appropriate design that does not sacrifice privacy rights. It should seize that chance.

Dakoda Trithara
Dakoda Trithara is an assistant professor of political science at the University of South Carolina Aiken. He earned his PhD in political science from the University of Calgary. His research explores how global actors contest and shape digital politics.
