As Parliament moves ahead with its study of legislation to offer better protection from online harms, Bill C-63 contains some glaring gaps that risk leaving women and girls in Canada unfairly exposed. 

Technology-facilitated violence, which Canadians are experiencing at increasing rates, does not affect everyone in the same ways or to the same degree. Developing effective legislation to prevent and address online harms requires gender-based analysis.

To understand why, look no further than one emerging form of online abuse: non-consensual sexual deepfakes. A sexual deepfake is exactly what it sounds like. Using artificial intelligence, perpetrators manipulate images of survivors without their knowledge to make it appear as though they are consensually appearing in sexual content.

One simple and increasingly common example occurs when an image of a survivor’s face (easily retrieved from any number of sources, including social media pages or publications) is digitally “attached” to someone else’s naked body to make it appear as though the survivor willingly posed nude.

Deepfake sexual abuse primarily targets women and girls, and its consequences fall on them in disproportionate ways. The bottom line is that deepfake sexual abuse needs to be assessed and addressed as part of the broader spectrum of gender-based violence. Until that happens, acknowledgement of the harms caused by this emerging form of sexual violence, and consequences for perpetrators, will remain insufficient.

Use of deepfakes accelerating 

Deepfake sexual abuse is occurring at increasing rates, and access to the technology used to make sexual deepfakes is becoming more widespread and more openly advertised, including on platforms like Instagram. The result is an unprecedented amount of sexual violence against women online that is being widely distributed.

The rapid development of artificial intelligence tools has also empowered perpetrators of deepfake sexual abuse by helping them produce ever more convincing images. Faces are merged with bodies of similar height, weight, skin tone, and hair colour.

Gendered targets  

Deepfake sexual abuse has been gendered since it was first widely shared in a Reddit forum in 2017. At first, targets were primarily women celebrities.  

A 2023 study of 95,820 deepfake videos found that 98 per cent featured sexual content and 99 per cent of individuals targeted in deepfake sexual abuse were women and girls.  

Indeed, many popular apps used for deepfake sexual abuse only work on women’s bodies.

But as the abuse of this technology continues to morph and grow, intersectional data is lacking and in-depth research is needed. We may be able to learn from data on non-consensual intimate image distribution, an analogous form of technology-facilitated sexual violence, in which Black, 2SLGBTQIA+, Indigenous, and disabled communities are disproportionately targeted.

Gendered impacts 

The impacts of deepfake sexual abuse are vast and devastating. Highly publicized examples of survivors, including Noelle Martin, Breeze Liu, and Francesca Mani, have helped shape our understanding of the ways in which it results in physical, mental, relational, and economic harm.

For women and girls specifically, pre-existing sexual double standards position them as more morally reprehensible than men for engaging in sexual expression. As a result, women and girls who have sexual content posted online, whether it is consensual or non-consensual, are blamed, shamed, and punished in a way that men and boys are not.  

In addition to the myriad negative impacts experienced by survivors of deepfake sexual abuse, there are also broader societal impacts, including sexual objectification and silencing.


Sexual objectification of women and girls is rampant in society and perpetuates harmful and false views that women are less competent, moral, and intelligent. Sexually objectifying media is also a causal risk factor for sexual violence as it sets a standard that women and girls be treated as sexual objects. Deepfake sexual abuse also has the effect of silencing women and girls in public and online spaces. This not only affects those who have experienced such abuse but also women and girls more broadly.  

This silencing effect causes people to feel less safe participating online and exercising their freedom of expression. Further, deepfake sexual abuse has been used to target women public figures, including journalists and politicians, as a tactic to hinder their democratic participation.

Legislation that addresses emerging online threats 

New regulations framed in Bill C-63, the Online Harms Act, are a positive step, but they do not go far enough. To truly hold platforms and their (ab)users accountable, gender-specific legal remedies to this form of violence are required.

The solution doesn’t end there. Prevention also requires that educational programs about deepfake sexual abuse as a form of gendered violence be developed. In particular, youth should be engaged in early-intervention programs that teach digital safety and literacy in a way that empowers their expression online while conveying the harms of technology-facilitated violence.

And funding should be allocated toward addressing this new, insidious form of gender-based violence via organizations that are already supporting survivors of sexual violence and promoting gender equity.  

Without these additional steps, new legislation targeting online harms will fail to properly protect Canadians, especially women and girls.  

Dianne Lalonde
Dianne Lalonde is a research and knowledge mobilization specialist at the Learning Network and a PhD candidate in political science at Western University. X: @DianneLalonde_ 

You are welcome to republish this Policy Options article online or in print periodicals, under a Creative Commons/No Derivatives licence.
