Deplatforming misogyny

A report by the Women’s Legal Education and Action Fund says platforms need to be accountable for the proliferation of online violence against women and girls.

Pam Hrick, Executive Director and General Counsel, LEAF

How is it that a teenager can have her YouTube video quickly taken down because it contains a few bars from a Beyoncé song, but when intimate pictures of her are disseminated hither and yon without her consent, nobody does anything about it? Why do pictures of onions get swiftly removed for being “overtly sexual” while targeted, credible and sustained rape threats are not deemed to violate a platform’s terms of service?

A new report by the Women’s Legal Education and Action Fund (LEAF) on deplatforming misogyny investigates these gaps in our legal framework and proposes 14 recommendations to make the internet safer for women and girls in Canada.

The report, funded in part by a grant from the Canadian Bar Association’s Law for the Future Fund, discusses the role of digital platforms in the proliferation of technology-facilitated gender-based violence, abuse and harassment (TFGBV). It calls on the federal government to hold digital platforms accountable through regulation or legal liability.

“We need to recognize that TFGBV is a problem and platforms play enough of a role that justifies imposing some level of accountability or liability on them,” says Cynthia Khoo, a technology and human rights lawyer who authored the report. “But that liability has to be done carefully so that we advance equality and we don’t accidentally end up having the law backfire on us.”

For instance, she says, a law that targets online harms in general – say, misinformation, bullying or cyberterrorism – could overdo it and get struck down on constitutional grounds for infringing on freedom of expression.

If a law regulating cybermisogyny is challenged, courts will have to assess whether its limits on free expression are justified under the Charter, including whether they meet the proportionality test.

“If you’re trying to achieve 10 different things at the same time with one regime, then how do you know if it’s proportionate or not because the proportionality analysis is going to be so different for terrorism compared to the non-consensual distribution of intimate images,” says Khoo.

The LEAF report urges the federal government to create a regime specific to gender-based violence, or at least to limit it to online violence and abuse that is rooted in systemic oppression.

“We need to push back on whose expression we’re talking about,” says Pam Hrick, LEAF’s executive director and general counsel, who was also the recipient of the CBA’s 2021 Douglas Miller Rising Star Award. Cyberviolence discourages women from participating more actively in public life online, she says. Though encouraged by governments tackling the issue of online harms, Hrick insists they need to “ground their responses and their actions in the knowledge of human rights experts, in the experience of those who have been hurt by technology-facilitated violence and those on the front lines who are dealing with this on a day-to-day basis.”

Another pitfall to avoid is creating a system that incentivizes platforms to err on the side of removing too much. There is a risk the government will enact a regime where harmful content is taken down, along with “a lot of legitimate, legal and beneficial expression,” says Khoo, “specifically those from historically marginalized groups.” She points to queer fan fiction or positive sex-education material aimed at trans youth as examples of content that can easily get targeted and taken down when platforms are regulated too tightly.

What we need are clear definitions of what constitutes unacceptable gender-based violence. Only then will content moderators have the right tools to make the appropriate decisions instead of relying on algorithms that can’t tell the difference between an intimate picture and a bag of onions.

The point of deplatforming cybermisogyny, it’s worth noting, isn’t to crack down on individuals who may post a sexist meme on their Facebook wall. It’s to prevent the proliferation of gender-based violence that otherwise gets normalized online. “It becomes a form of bonding with your friends,” Khoo explains. “And the other people who are harmed are almost just collateral damage in that social bonding and social competition.”

The other problem with normalizing cybermisogyny is that it can escalate quickly as people try to outdo one another. “That’s when we see it shift offline and that’s when we get things like incel mass shootings,” says Khoo.

It’s important for people to understand that technology-facilitated violence “causes harm disproportionately to women and even more particularly to those who have intersectional identities,” says Hrick. “It’s not ‘just speech’ or ‘just online.’ It creates serious psychological harm, it impacts the ability of the people who are targeted to participate fully in the world at this point because there is no real division between what is ‘online’ and what is ‘offline’ anymore.”

Khoo and Hrick say they hope the report will move the conversation away from the false debate that pits free expression against objecting to harmful messages online. “What we’re trying to emphasize in this report is no, we’re not holding platforms liable for user speech. We’re holding platforms accountable for their specific role in facilitating what happens with user speech, and for creating these specific environments” where violence and abuse proliferate, Khoo explains. “It’s not holding platforms liable for a user’s speech and actions, it’s holding platforms liable for their own actions.”