Careful when regulating web giants as broadcasters

Bill C-10 is hardly Orwellian, but it is a clumsy overreach into regulating expression.

There has been a lot of rhetoric around Bill C-10, which amends the country's Broadcasting Act. One of the biggest concerns was that the Canadian Radio-television and Telecommunications Commission (CRTC) would assume the role of government censor with respect to user-generated content on the internet.

That might be hyperbole, but lawyers and academics say that there are concerns around freedom of expression in the bill as it stands.

In a written statement to CBA National, Heritage Minister Steven Guilbeault says that the bill aims to level the playing field between Canadian broadcasters and web giants that have so far gone unregulated. The latter, he says, should contribute to the Canadian broadcasting ecosystem as they benefit greatly from it.

Guilbeault says that the original version of C-10 dealt with social media in two ways. First, section 2.1 provided that individuals who upload content to social media platforms, such as Facebook or TikTok, aren't considered broadcasters.

"This means you and I can't be regulated by the CRTC," he says. "We've kept that clause."

The provision that has received more scrutiny is section 4.1, which initially excluded social media platforms themselves from the category of broadcasters.

"Based on the testimonies heard during the committee process, we realized that that exclusion was too broad as it excluded platforms like YouTube when they act as music streamers," says Guilbeault. "That's why we've removed 4.1."

Various cultural groups across the country supported removing section 4.1, including the Alliance of Canadian Cinema, Television and Radio Artists – or ACTRA.

"ACTRA is of the view that individuals who use social media to transmit programs for non-commercial purposes should be excluded from the application of the Broadcasting Act," says David Sparrow, ACTRA National President, in a written response to CBA National. "This is why it supports the new section 2.1 in Bill C-10. Section 4.1, however, provided a blanket exemption to social media platforms – as distinct from their users – from the application of the Broadcasting Act."

The trouble, says Sparrow, is that social media platforms produce professional cultural content, like music. Also, Facebook has broadcast Major League Baseball games, Twitter has introduced live broadcasting, and both YouTube and Facebook have entered the scripted content market.

"While both have now stepped back from creating original programming, there is no reason to assume they may not resume creating scripted content in the future," says Sparrow. "If these or other social media platforms evolve to become primary distributors of professional content, it should be open to the CRTC to capture that activity under the ambit of the Broadcasting Act. "

Sparrow says that ACTRA supports giving the CRTC the broad scope to determine how and when to regulate social media under the Broadcasting Act, subject to transparent and thorough public consultations and hearings.

Indeed, the CRTC is there to regulate content, not what individuals can say, acknowledges Michael Geist, the Canada Research Chair in internet and e-commerce law at the University of Ottawa. But allowing the CRTC to impose conditions on the discoverability of programs, which helps users find content to watch, amounts to regulating speech, he says.

That's because by scrapping section 4.1, the government is effectively creating a situation where the CRTC can establish requirements about user-generated content, says Geist – "how much of it has to be Canadian, what the feed looks like; any number of things that they might try to comply with their broadcast objectives."

And yet, TikTok and Instagram posts are a form of expression for members of an entire generation, says Geist.

As for platforms acting like broadcasters with scripted content, many are effectively hybrids that push a mix of scripted, curated and user-generated content. Geist says that user-generated content ought to be explicitly carved out of the CRTC's ambit. "I'm not sure that it's clear from the legislation that [it is]," says Geist.

Not that we should hope the government restores section 4.1. Its major shortcoming, says Geist, is that it proceeds on the premise that online content is somehow the same as conventional broadcasting.

Cynthia Khoo, a technology and human rights lawyer and researcher with Tekhnos Law in Toronto, adds that we need to remember that the Broadcasting Act came about in an era of limited bandwidth for television and radio. The government had little choice but to regulate a scarce resource, but the internet doesn't work in the same way.

What's more, the internet has done a lot to promote media diversity, equal representation, political mobilization and civil rights, says Khoo. Bill C-10 would harm historically marginalized communities by taking the one channel they have for making their voices heard and closing in on it with regulations that make social media companies responsible for user-generated content.

Khoo would prefer to see some form of content moderation, but not in the way currently contemplated by the bill, which ultimately would cause more harm than good.

Khoo says that while section 2.1 ensures individual users can't be held responsible for the content they upload as if they were a broadcasting channel, YouTube could still be held responsible under Canadian broadcasting law.

"To give you a sense of scale, as of 2019, there were 500 hours of video being uploaded to YouTube every minute around the world," says Khoo. "This is what they would be required to be responsible for as a matter of law, and subject to penalties if they don't meet the right regulatory obligations. It purports to rely on the CRTC to restrain itself, but if that's necessary, then why give them the power in the first place?"

Subjecting social media platforms to broad obligations for user-generated content means they have to regulate that content in some way, says Emily Laidlaw, the Canada Research Chair in cybersecurity law at the University of Calgary. But the bill doesn't tell those social media companies how to do it.

"There is only one option for social media companies if they have to meet all of the broadcasting policy objectives and regulatory requirements for user-generated content, and that is heavy-duty regulation of that content," says Laidlaw. "That's where the free speech issue is."

Expect that to mean an increase in content moderation by those platforms, she adds, including forms of surveillance as the content is created.

"Social media companies would have to probably start using AI systems more actively for this type of content than they would have," says Laidlaw. "It means how the content is ordered and what is pushed to us, and what is fed to users in Canada is different, and it entrenches the really big players at the top because the only companies that can afford to design these regulatory systems is YouTube."

Guilbeault says that the government will be moving more amendments to ensure that content uploaded by people on social media won't be considered programming under the Act and will therefore be sheltered from CRTC regulation.

"This bill isn't about what Canadians do online, it's about what web giants aren't currently doing," says Guilbeault. "It is of utmost importance to the Canadian music community that streamers like YouTube be included in the Act, considering that YouTube is the number one service for music streaming."

Geist counters that if the goal was to get these companies to pay, there were better mechanisms for doing so than creating the "fiction" that this is one large broadcasting system.

"That helps explain why some of this is breaking down the way it is."