
Responding to hate speech online

The CBA comments on a consultation paper from Justice Canada about legal remedies for victims of hate speech.


Back in the spring of 2008, when Canada was going through one of its periodic eruptions over the limits of free speech, David Matas was calling for a measure of perspective.

 

"The mere fact that you've got a legal system that allows for a complaint which is maybe wrong doesn't in itself invalidate the system," the high-profile lawyer and human rights advocate told the National Post. "If somebody tries to hit you with a chair, you don't blame the chair."

 

More than a dozen years later — and more than seven years after Section 13 of the Canadian Human Rights Act, dealing with hate speech, was repealed by Parliament — that debate is ready for an encore. Last year, a House of Commons committee recommended the government hold consultations on a Section 13 replacement. Recently, the Canadian Bar Association — through its Constitutional and Human Rights, Criminal Justice and Sexual Orientation and Gender Identity sections — completed its submission to the consultations.

 

The recommendations in that submission are both more cautious and more aggressive than Section 13 itself was — building guardrails around applying the civil remedy for hate speech even as they target the online platforms that spread those messages of hate.

 

"Everything being suggested in this paper is possible. Nothing in it is legally suspect at all," says Mark Freiman of Rosen & Company, who has argued multiple human rights cases in Canadian courts and in front of the European Court of Human Rights. "It's merely a question of political will."

 

Section 13 was repealed in large part because its critics said it encouraged frivolous complaints. The CBA paper proposes bringing Section 13 back on a leash — with new rules allowing human rights tribunals to screen out weak cases (to "compensate for the lower standard of proof" in civil proceedings) and award costs to defendants.

 

"The point is that people should not be put to great expense to defend themselves from claims that they did something they didn't do," says Matas, who helped to assemble the CBA submission.

 

The submission proposes tasking the Canadian Human Rights Commission with pre-screening complaints and says complainants should be compelled to file their complaints in only one jurisdiction at a time. It says the right to file anonymous complaints should be eliminated for all but the exceptional cases involving personal risks to the complainant.

 

Those changes might mollify Section 13's critics. They won't have the same effect on internet platforms — which would lose their civil immunity from hate speech complaints if the government follows through on the CBA's recommendations. The old Section 13 excluded internet service providers (ISPs) from its scope. "A re-enacted section 13," says the CBA paper, "should expressly say the exact opposite: when an internet provider allows a person to use their services, the provider is communicating what the person posts on the provider's platform.

 

"Internet providers should not have civil immunity for the material on their platforms."

 

The idea of taking on some of the wealthiest corporations on the planet ought to seem less daunting now, says Matas, because the political climate around online hate speech has shifted. "In the U.S., there's been a push recently to amend the law to eliminate immunity," he says. "What we're asking for is no different than what the ISPs themselves commit to in their terms of use.

 

"It's not that the ISPs are against this in principle. It comes down to defining hate speech and dealing with the sheer volume of material. But they all say they will respect local laws, so I can't imagine any ISP deliberately violating a tribunal order."

 

The CBA paper nods to the difficulties involved. Online platforms monetize content and can be unwilling to take even the awful stuff down. The amount of online content is vast, and policing it takes skills these companies don't have and resources they're reluctant to spend. The paper says the Human Rights Commission should set up a network of "trusted flaggers" — third parties charged with identifying hateful content — like the one launched by the European Union to keep an eye on Facebook, Twitter and others.

 

If the civil remedy for hate speech in Section 13 did too little to weed out weak cases, says the CBA paper, the remedy in the Criminal Code "goes too far in the other direction and does not catch enough incitement to hatred." The Code requires the attorney general's assent before the Crown can pursue a case of criminal incitement to hatred. The CBA paper says AGs should be compelled to state reasons for withholding that consent.

 

"There is an absolutist free speech tradition in the States and, to a degree, here in Canada," says Matas. "What we don't want is an AG denying consent on that principle without stating clear reasons for withholding consent. The guidelines are not binding. But if you're not going to follow through, you should be obliged to say why."

 

The paper says platforms should be able to avail themselves of "a defence of innocent dissemination" in criminal proceedings, but "internet providers should be liable for noxious content that is not innocently disseminated." (It also suggests the "innocent dissemination" defence should be reserved for internet providers that have functional complaints systems and respond to hate speech reports promptly.)

 

The Criminal Code section on "incitement to hatred" exempts private conversations from its ambit. The CBA paper says that's an oversight — that a private conversation that incites "grave acts of violence" shouldn't be placed beyond the law's reach simply because it happens in private: "The right to privacy should not trump the right to freedom from incitement to hatred. Like all other rights that may clash, they need to be balanced."

 

In short, the CBA paper calls for an approach to online hate speech that prioritizes its effects over its context. That's a necessary evolution in the law's approach to hate speech online, says Freiman — because the old 'sunlight is the best disinfectant' approach isn't working.

 

"I don't think enough people have woken up yet to how the internet has changed the world. We're seeing the emergence of bubbles of alternate reality, of alternate facts," he says. "People like to say that the cure to bad speech is more good speech. But these conversations aren't taking place in the marketplace of ideas — they're taking place in these alternate reality zones where conflicting facts can't break through.

 

"We've built up a kind of world where the individual's sense of 'common sense' makes it impossible for them to see things as they are. And nowhere is this more apparent than in the toxic universe of Fox News."