S-210 is a private member’s bill introduced by Senator Julie Miville-Dechêne. Its central goal seems like a no-brainer: to keep young people from encountering media they are not equipped to see. It aims to achieve this goal by requiring people to use a third-party service to verify their age to access adult content online in Canada.

Part of the motivation for the bill is to end violence against women. The notion is that men who watch pornography from a young age can go on to inflict violence on women as they get older. The reasoning, in short: violence against women is pornography's fault, and to end that violence, we must end access to pornography.

Where the bill goes wrong is in the definition of pornography, which starts with women’s breasts. In doing so, it could reinforce a stigma against women. After all, if women’s bodies on their own are pornographic, and pornography is the cause of violence against women, aren’t women actually to blame for the violence they encounter?

This sounds a lot like upholding a cycle of blaming women for their own abuse instead of making meaningful and inclusive social policy.

But regulation doesn’t have to be that way. The creation and promotion of media literacy programs aimed at viewers of sexually explicit content would go a long way toward raising awareness of unrealistic messaging and depictions in pornography. A push to expand the use of device-based parental-control options must also be part of the way forward.

Less information, more violence

The bill takes the definition of sexually explicit material from section 171.1(5) of the Criminal Code of Canada, which begins at exposed female breasts. In this line-drawing exercise of breasts-as-necessarily-pornographic, women’s experiences will be silenced online.

This reduces access to comprehensive health and safety information, including information for patients and survivors of breast cancer and victims of domestic and/or sexual abuse. All of this increases violence against women. What’s worse: the bill’s roots suggest that censoring women may be what it was designed to do.

Starting as S-203 in 2020, the bill found its base in anti-choice, anti-women's sexuality and anti-2SLGBTQ+ rhetoric. The sponsoring senator quotes the American College of Pediatricians, an American hate group that is not the legitimate American Academy of Pediatrics but capitalizes on its similar name to spread misinformation. With this base, it's not surprising that the worst of the myriad harms of S-210 will be borne by women, 2SLGBTQ+ folks, and other marginalized communities.


Proponents of the bill reason that people have to show identification to enter a strip club or buy liquor or cigarettes. They argue that age verification online is no different.

But we’re not talking about controlling access to a physical space or object. It’s access to ideas, to data. It’s ones and zeroes strung together and transmitted across wires that represent our online lives, which are inseparable from our physical ones. And to intercept specific ideas, you need to control access to all ideas.

S-210 would allow the government to target any platform on which sexually explicit material can be found. It was initially aimed only at major pornography websites – think Pornhub and the like. Discussions at committee then evolved to apply to any platform with 33 per cent or more of its content classified as adult. Now, the sponsoring senator and the Age Verification Providers Association that lobbied for this bill have reset their tone for the worse.

The bill’s near-unlimited scope now includes all social media, search engines, and even messaging services. Under 171.1(5) of the Criminal Code, the ideas being controlled here are sexually explicit ones contained in video, photo, written, and audio materials. If it were to proceed as drafted, content moderation on those platforms would need to go nuclear.

Anything that could be defined as sexually explicit material under the bill would need to be filtered out so platforms could avoid putting age-verification barriers in front of their traffic. After all, most of these ad-based, free-to-use services generate profit through their sheer number of visitors, which would be hampered if they were required to use age verification tools.

Women’s bodies are seen by humans and AI alike to be more sexual than men’s. Because their breasts alone are enough to constitute pornographic content, more of women’s posts on the internet are likely to come down. Breast cancer survivors, women’s reproductive health companies, and sexual health education agencies have all had their content taken down for violating “community standards” when posting images or text about women’s health.


When information on safe abortion is unavailable due to content moderation, maternal mortality increases. When women can’t share information on their breast-cancer journeys, others won’t detect their own cancers earlier. Quite simply, strengthening content moderation against women’s bodies will cost lives.

But I am not saying we should do nothing. Quite the opposite. There could well be harm that flows from watching media that presents social or sexual situations that young people are unequipped to experience. We should do something, but something that actually works and does not infringe on the equality or privacy rights of all Canadians.

Educate the young and old

Australia recently rejected an age verification bill and recommended an approach very similar to what I will propose here: media literacy programs that focus on attacking the perceived realism of media that young people encounter. The higher the perceived realism of a piece of media, the more likely a viewer is to believe that it represents real-world truth.

This can be addressed in age-appropriate ways, including demystifying production sets and explaining how special effects are achieved. Educators and/or guardians can also discuss relationships modelled in all media and encourage young people to think critically about why they’re presented in a certain way.

Studies have shown that simple educational messaging that the content in adult films is not necessarily how everyone likes to have sex can decrease perceived realism and even correct rape-supportive beliefs. We just have to be able to talk about it.

Strategic education programs aimed at caregivers that demystify device-based content filters, also known as parental controls, are also incredibly useful. These filters ensure that users who shouldn’t be encountering sexually explicit content don’t encounter it.

For those who say that pornography truly is the problem, I hear you. As a teen, I hated the unrealistic body image that some porn communicated, and I saw it as cheating when my partners looked at it. I might have even applauded Bill S-210 if it had been tabled at the time.

But I’m not that anti-porn critic anymore.

I’ve come to realize how blanket criticism of porn and sex work creates opportunities to police all women’s bodies and sexualities – just like S-210 would do.

In my late teens, an ex-boyfriend shared intimate images of me online without my consent. I stepped back and looked critically at what was happening. Someone was trying to leverage my sexuality as a young woman to ruin my life.


This guy saw the hateful way we, myself included, treated sex workers. He saw how society treats women who are not chaste. He found power in that parallel and used it. He knew that by “demoting” me to the ranks of an online sex worker, I could lose jobs, be rejected by my family, lose friends and lose opportunities for education. That damage could be inflicted because of anti-women attitudes in society. That’s why that kind of sharing works.

This bill doubles down on those attitudes. Victims of sexual assault will have to fight even harder to establish that their assault was not their fault because this bill implies that sexual violence is the fault of pornography, not of the people who commit violence. And if women’s bodies are pornography, it’s pretty easy to continue the cycle of blaming the victim for her own abuse. That price is simply too high.

Rather than surveilling the salaciousness of any post related to women’s bodies, a two-pronged approach using device-based controls and media literacy must be adopted.

Kate Sinclaire is a law student, sex-worker rights advocate, and award-winning queer feminist filmmaker. Her work centres on the intersection of technology, law and policy with social conceptions of gender and sexuality. X: @mskatesinclaire. Instagram: @mskatesinclaire

You are welcome to republish this Policy Options article online or in print periodicals, under a Creative Commons/No Derivatives licence.
