
Privacy bill fails on meaningful consent

Proposed changes to PIPEDA are too permissive and fail to address existing harms in the current digital environment, according to Citizen Lab.


The proposed bill to replace Canada's privacy law governing the private sector's collection and use of personal information advances the interests of businesses and government agencies, not individual citizens or consumers, according to a report released by the University of Toronto's Citizen Lab.

Bill C-27, which "would largely codify existing practices" under the Personal Information Protection and Electronic Documents Act "as opposed to addressing existing and potential harms," should be redrafted to embrace a rights-based approach, the report argues.

"Drafting legislation toward an economic purpose rather than a rights-based purpose can and has led to corporate-forward legislation that leaves behind individual and community privacy rights," the report reads. 

The report, authored by Miller Thomson associate Amanda Cutinha and Citizen Lab senior research associate Christopher Parsons, begins by assessing the federal government's use of de-identified and aggregated mobility data obtained from Telus and the data analytics company BlueDot to monitor the spread of COVID-19 during the pandemic.

Though the report concludes that the federal government, Telus and BlueDot likely complied with the law, the authors identify several deficiencies in it.

For starters, PIPEDA doesn't adequately protect long-term privacy interests. People assume that the risk of re-identification is generally remote, the report points out, but as technology evolves, the potential to re-identify data could change significantly over time. As more de-identified data is made available, it could also become easier to cross-reference and analyze data sets to re-identify them. The report points to several studies showing that such data sets are "under constant risk of being re-identified given new statistical methods, data sets, or technical innovations."
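To illustrate the cross-referencing risk the report describes, here is a minimal, purely hypothetical sketch in Python (the names, places and figures are invented, and are not drawn from the report or from the Telus/BlueDot data): joining a de-identified data set to an identified auxiliary source on shared quasi-identifiers such as postal code prefix, birth year and sex can be enough to put names back on supposedly anonymous records.

import pandas as pd

# "De-identified" records: direct identifiers removed, but quasi-identifiers
# (postal prefix, birth year, sex) retained alongside the mobility measure.
deidentified = pd.DataFrame({
    "postal_prefix": ["M5S", "K1A", "V6B"],
    "birth_year": [1987, 1990, 1962],
    "sex": ["F", "M", "F"],
    "trips_per_day": [6, 2, 1],
})

# Identified auxiliary data set, e.g. a public directory (hypothetical).
auxiliary = pd.DataFrame({
    "name": ["A. Singh", "B. Tremblay", "C. Lee"],
    "postal_prefix": ["M5S", "K1A", "M5S"],
    "birth_year": [1987, 1990, 1962],
    "sex": ["F", "M", "F"],
})

# Joining on the shared quasi-identifiers re-attaches names to two of the
# three "anonymous" rows, without any direct identifiers being present.
reidentified = deidentified.merge(
    auxiliary, on=["postal_prefix", "birth_year", "sex"], how="inner"
)
print(reidentified[["name", "trips_per_day"]])

The more auxiliary data sets become available, the more such joins succeed, which is why the report treats re-identification risk as something that grows over time rather than staying fixed.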

Also, under PIPEDA, individuals need not be informed of how their data is de-identified or used for secondary purposes. There is also a need for more transparency surrounding how information is shared between public and private entities.

Another problem the report raises is that de-identified or aggregated data could be used to guide government decisions that adversely affect certain groups, some of them marginalized. "This may include policies that police low-income communities that travel often and work in-person amid lockdown rules or policies that affect individuals who access reproductive health care," the report offers as examples, the latter an apparent allusion to recent promises by politicians in some American states to make interstate travel for abortion illegal.


In addition, the law does not account for Indigenous peoples' sovereignty over their data, which places it in conflict with principles in the United Nations Declaration on the Rights of Indigenous Peoples, now being implemented under federal law.

Finally, PIPEDA generally lacks adequate enforcement mechanisms: the Privacy Commissioner cannot make orders or impose administrative monetary penalties (AMPs) on those who violate privacy legislation, and individuals lack a private right of action to enforce their rights.

Currently at second reading in the House of Commons, Bill C-27 would overhaul PIPEDA to keep pace with advances in digital technology. It would replace PIPEDA's private-sector privacy rules with a new law, the Consumer Privacy Protection Act (CPPA), set up a new Personal Information and Data Protection Tribunal to hear appeals of decisions made by the Privacy Commissioner under the CPPA, and introduce new rules for artificial intelligence (AI) systems. It would also provide penalties for non-compliance: organizations found guilty of an indictable offence would risk fines of up to 5% of global revenue or CA$25 million, whichever is greater, and AMPs of up to 3% of global revenue or CA$10 million could be imposed for specific violations. The CPPA would also grant individuals a private right of action against firms, but only after the Privacy Commissioner has found that an organization contravened the act.
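As a rough illustration of how those ceilings would operate, the short Python sketch below computes the maximum exposure for a hypothetical firm. The revenue figure is invented, the bill's actual provisions contain further conditions, and the greater-of structure for AMPs is an assumption here rather than something the article spells out.

def fine_ceiling(global_revenue_cad):
    # Indictable-offence fine cap: 5% of global revenue or CA$25M, whichever is greater.
    return max(0.05 * global_revenue_cad, 25_000_000)

def amp_ceiling(global_revenue_cad):
    # AMP cap: 3% of global revenue or CA$10M (greater-of structure assumed).
    return max(0.03 * global_revenue_cad, 10_000_000)

# Hypothetical firm with CA$2 billion in global revenue:
revenue = 2_000_000_000
print(fine_ceiling(revenue))  # 100000000.0 -- the 5% figure exceeds the CA$25M floor
print(amp_ceiling(revenue))   # 60000000.0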

The Citizen Lab's report offers 19 recommendations for changes to the law. Among them, it proposes expanding the right to sue to instances where the Privacy Commissioner has yet to make a finding under the CPPA. Another recommendation would do away with the tribunal and empower the Privacy Commissioner to independently investigate complaints, make orders, and impose AMPs, which under the current draft the Commissioner could only recommend. The report also recommends that the Privacy Commissioner be empowered to set out regulations to ensure information is appropriately de-identified and to prevent data collection or use where the Commissioner is not satisfied that the proposed benefit is proportionate to the adverse effect. The authors would also remove exemptions that allow private organizations to treat de-identified data as equivalent to anonymous data.

The report also addresses meaningful consent to the use of data for secondary purposes, including socially beneficial purposes. For those uses to be appropriate, "the social impacts of data sharing must be well-understood," the report reads. That would involve properly assessing the nature of the privacy and equity interests at stake and clearly defining the purpose. Using the data for secondary purposes beyond that understanding would require organizations to renew the knowledge and consent of the people whose data is involved.