Final debate approaches on troubling age verification bill

Critics say what’s proposed is technologically unfeasible and raises considerable privacy concerns

Facial recognition technology on a phone (iStock)

The House of Commons is nearing final debate on an age verification bill that critics say includes provisions that would be far more restrictive than any measures adopted by conservative American states.

The proposed legislation, which originated in the Senate, would mandate age verification for any organization that makes sexually explicit material available on the Internet for commercial purposes.

Civil society and legal groups have warned that Bill S-210 is overbroad and almost certainly technologically unfeasible to implement. Furthermore, it was the victim of filibuster tactics during committee study, where no critics of the bill were called as witnesses.

“It’s appalling that such a potentially consequential and poorly thought-through bill got such brief study—almost nothing substantive at all,” says Aislin Jackson, policy staff counsel at the BC Civil Liberties Association.

She says it’s ironic that the Conservatives have been alive to the privacy concerns around Bill C-27 but were hostile to studying similar concerns in this bill.

Jackson says Sen. Julie Miville-Dechêne, who introduced the bill, appears to have worked with the age verification industry when drafting the legislation and took its assurances around things like age-estimation AI as being workable without privacy infringement.

Jackson says collecting the biometric information needed for this kind of age verification is inherently an invasion of privacy.

“It’s not going to be perfect—it’s just age estimation, so there needs to be some sort of appeal process where you can actually check somebody’s age to ensure that people who are adults are not being cut off for perfectly legal material just because they have a baby-face or the AI hasn’t been adequately trained on images from their ethnic group.”

That would mean users would need to show identification to see not only pornographic content but “any material that could be used to groom a child to be more vulnerable,” because the bill draws on a section of the Criminal Code that provides the broadest possible definition of explicit material.

Matt Hatfield, executive director of OpenMedia in Vancouver, says these definitions are not fit for purpose in this legislation.

“They’re Criminal Code definitions that apply within a different context within the Code,” he says.

Jackson notes that materials that would not be harmful if a child encountered them through their own self-guided exploration could be harmful in the context of an adult trying to inappropriately normalize discussing these matters with children.

This could wind up capturing a great deal of content that is not necessarily explicit or adult. Jackson notes that the BCCLA’s experience with the Little Sisters case showed that, in a customs enforcement environment, innocuous LGBTQ+ materials were deemed “obscene” and seized. She says it’s a real concern that any minoritized sexual expression will be seen as more explicit and concerning than it actually is, and that the same differential application would play out in this age verification context.

“(The legislation is) either going to age-gate large slates of the internet for everyone, or there’s going to be a double standard where the content that relates to certain minoritized sexual identities is going to be age-gated in a way that any normative, heterosexual content is not,” Jackson says.

“That creates an equality issue, and that’s because the definition is so broad.”

Pam Hrick, executive director of the Women’s Legal Education and Action Fund (LEAF), worries that in addition to silencing already marginalized voices through over-moderation, the bill could have unintended consequences for victims whose intimate images are being circulated online.

“The bill’s proposed measures would bar many victims from confirming if their image is posted, gathering evidence for legal recourse, and obtaining the URLs to make takedown requests,” she says.

Raphael Vagliano, a legal officer at the Centre for Law and Democracy, says there are concerns about how unclear the bill is regarding which organizations will be scoped into the law—a concern nearly all critics of the bill share.

“It refers to organizations which make this content available, and it’s not exactly clear in terms of who that would apply to,” he says.

“There need to be safeguards because otherwise you are scoping in things like search engines and internet service providers, which would be quite problematic. The end result would be a requirement to verify age for all of these services, which would be a disproportionate infringement on freedom of expression and access to information.”

Providing website blocking as a remedy without adequate safeguards is also questionable.

Hatfield notes that because there are no penalties for taking down lawful content, the rational thing for most companies to do will be to take down materials pre-emptively to be safe.

Jackson says the bill doesn’t consider the expressive rights of content creators, only the people who host content online and the state. It’s not obvious to her whether an artist who bought server space to set up their own gallery with their own URL counts as the operator, or whether it’s the commercial service provider that will have the procedural rights to defend the expression.

“It’s not clear that it would be the individual artist. That represents a real problem because the people who have the rights to defend the content are not the people who have the greatest stake in that expressive content being accessible to the world or the best evidence about the artistic value of the content,” she says.

There are also no procedural rights for the people whose expressive content will be walled off or denied under this legislation.

“It’s really a dog’s breakfast,” Jackson says.

The age-verification technology the bill points to in future regulation also raises concerns. Among them is the lack of checks to ensure the technology will actually respect privacy.

“No matter what technology is or is not available, the bill comes into effect automatically a year after it passes, which is one of the problems,” Hatfield says.

“When Australia and France’s regulatory agencies looked into this, they determined there wasn’t a technology that would meet their standard, so they have delayed enforcement of their laws until there is. If Canada had that ability to look into it and decide to hold, it would be a less problematic bill.”

In a post last year, Michael Geist, the Canada Research Chair in Internet and E-commerce Law at the University of Ottawa, dubbed S-210 “the most dangerous Canadian bill you’ve never heard of.”

He says it became clear during the brief committee hearings that the bill contains few of the privacy safeguards its proponents might suggest it does. When it comes to technology, the bill relies on a “nerd harder” mentality.

“They reference this idea that the tech people can just fix this,” Geist says.

“It may be in the future that some of the technology that is envisioned will be more effective and will be able to address some of these issues, but at the moment, this kind of technology largely boils down to two mechanisms—requiring users to upload highly sensitive information like government-issued IDs as proof of age verification […] and age estimation, where you engage in facial scanning.”

Given the frequency of privacy breaches, a site where users upload their government IDs would be a tempting target for cybercriminals, and the practice cannot be equated to showing ID in an offline context.

As for age estimation, Geist says while it may be effective in distinguishing between a 24-year-old and a 16-year-old, in this context, it would be asked to distinguish between a 17-year-old and an 18-year-old.

The Supreme Court of Canada’s recent decision in Bykovets underscored the stakes: these measures implicitly become private surveillance, creating “pools of data” that can identify individuals and that cybercriminals will be tempted to access. The most privacy-protective approach is not to collect that data in the first place.

“If we’re going to collect this information, whether it’s people’s identification, their birthdates, or their biometrics, it’s a huge privacy vulnerability, and we need to ensure we have a very good reason to collect that information,” Jackson says.

Because the bill passed committee with no changes, the opportunities to amend it are now virtually exhausted. Geist believes that’s all the more reason for it to be voted down. Vagliano agrees.

“I think it should be killed,” Geist says.

That’s not to say there aren’t real issues or value in addressing them, potentially in other legislation that’s before Parliament.

“It’s a fine conversation, but this bill as currently drafted raises enormous concerns.”

Hatfield suggests Bill C-63, the Online Harms Act, which contains age-appropriate design provisions, could be a more modest and appropriate way of addressing the issue.