A new privacy bill for our age
Bill C-11 aims to give more meaningful information to people about the collection, use and disclosure of their personal data, while giving organizations tools to be more competitive in the digital economy.
The federal government has introduced an ambitious new bill that aims to protect Canadians' privacy while promoting data-driven innovation. It also marks the first meaningful attempt in Canada to regulate the use of data in artificial intelligence.
"[The government is] trying to facilitate a clear path to using data responsibly for Canadian competitiveness," says Carole Piovesan, a partner and co-founder at INQ Data Law in Toronto.
It's the first significant effort to update Canada's privacy law in over 20 years. By introducing Bill C-11, Ottawa is signaling its intention to align its data privacy regime with the values promoted under the EU's General Data Protection Regulation – or GDPR – in force since 2018. It also aims to implement the principles outlined in the Digital Charter unveiled last year. "The big difference is that the CPPA [the proposed Consumer Privacy Protection Act] still maintains a principle-based approach, and it's not as prescriptive as the GDPR," says Piovesan. "So it is giving some flexibility in the interpretation of the legislation in a number of areas."
If passed, Bill C-11 will create the CPPA, hived off from the current federal private sector privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA).
Another major development is that Bill C-11, on paper, gives the Office of the Privacy Commissioner of Canada (OPC) teeth by granting it order-making powers to enforce compliance. But the OPC will have to share power with a new Personal Information and Data Protection Tribunal, which can impose penalties for violations of the new law. "So you have in place an additional layer of accountability," says Piovesan. The hope is that separating order-making authority from the power to impose fines will serve as a meaningful check and balance on OPC findings.
For the more serious offenses, the tribunal can issue fines of up to 5% of an organization's global revenue or $25 million, whichever is greater. For lesser infringements, penalties can reach 3% of global revenue or $10 million, again whichever is greater. Under the GDPR, by comparison, fines for breaches can reach up to 4% of a firm's worldwide annual revenue in a worst-case scenario.
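For readers who want to see how the two tiers compare, here is a minimal arithmetic sketch based on the thresholds described above; the function and the revenue figure are hypothetical illustrations, not anything defined in the bill.

    def max_penalty(global_revenue: float, serious: bool) -> float:
        # Illustrative ceiling on penalties under the proposed CPPA:
        # the greater of a fixed amount or a share of global revenue.
        if serious:
            return max(0.05 * global_revenue, 25_000_000)  # 5% or $25 million
        return max(0.03 * global_revenue, 10_000_000)      # 3% or $10 million

    # Example: a firm with $2 billion in global revenue
    max_penalty(2_000_000_000, serious=True)   # 100,000,000 (5% exceeds $25 million)
    max_penalty(2_000_000_000, serious=False)  # 60,000,000 (3% exceeds $10 million)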
Those in contravention of the law face another risk. The bill sets out a private right of action for individuals, though it is triggered only where the OPC has found a privacy violation and that finding is either not appealed or is upheld by the tribunal. Similarly, the GDPR gives private citizens of the EU an active role in its enforcement.
The bill also contains whistleblower protections, barring organizations from firing or disciplining employees who report violations of the law in good faith.
A bill for the digital age
Among Bill C-11's other features is the introduction of new consumer portability rights that ought to make open banking advocates smile. Under certain circumstances and at a person's request, organizations will have to transfer, in a timely fashion, personal information they have collected about that individual to another organization.
It also addresses the de-identification of personal information from a record or data set. Under the new law, it's clear that consent would not be required to de-identify personal information. However, an organization would have to document its practices, subject to audit by the OPC. The de-identified data could be used for research and development, for instance. "Under PIPEDA, it wasn't clear if you could de-identify data without consent and then use it," Piovesan explains. "Whereas now, we have better guidance that says you can de-identify without consent. You can use de-identified data. You have to justify your use of the de-identified data."
In this regard, the bill aims to promote flexibility in data use while ensuring certain protections remain in place. What's more, Piovesan says it would appear that de-identified data will still be captured under the CPPA, unlike under the current PIPEDA regime. "There's a recognition that there will be technologies that assist with de-identification," she says. "So the government wants some degree of oversight and justification for the use of that data."
One of the most exciting aspects of the bill, says Piovesan, is how it approaches algorithmic decision-making, with a focus on transparency. Again, at a person's request, organizations will have to explain automated decision systems that "make a prediction, recommendation or decision" based on that individual's personal information. On this front, says Piovesan, Bill C-11 appears to go further than the GDPR, which applies only to decisions made solely by automated means.
"If the interpretation of this legislation is that any automated decision system that is making a prediction about an individual has to be explained, that could be really broad if you're in the world of AI, because that's all of AI," says Piovesan. "Once you implicate an individual, then you have to have explainability and transparency requirements in place."
On the whole, Piovesan says Bill C-11 introduces "changes to the consent regime that are quite interesting," namely by requiring organizations to conduct an analysis justifying the collection of personal information as a legitimate commercial interest. "So you need to do that analysis, which could be subject to an audit," she says. "But if you do it in the appropriate context, then you don't have to seek consent" in situations where consent wouldn't meaningfully protect privacy. "Everything we are doing now is digital, and it's becoming impossible to balance the OPC guidance on obtaining meaningful consent with the plethora of requirements that you need to shove into a privacy policy. The government is recognizing that there is consent fatigue and that not everything requires such robust consent. But where you are going to seek it, it has to be meaningful, and you have to use plain language."