Q&A with Sinziana Gutiu

The Vancouver-based lawyer on why Canada’s legal privacy framework still has some growing up to do.

Sinziana Gutiu, TELUS, Vancouver (Photo: Jimmy Jeong)

Sinziana Gutiu is the winner of the 2020 Douglas Miller Rising Star award, which recognizes developing talent and leadership in the profession. CBA National caught up with the TELUS lawyer to talk about whether privacy should be recognized as a human right, how to prepare for the Internet of thought, and some of the other existential challenges facing our digital society.

CBA National: How do you size up privacy law in Canada today?

Sinziana Gutiu: Honestly, it feels like privacy in Canada is in its awkward teenage phase.

N: How so?

SG: Well, we have federal and provincial legislation enacted for private organizations, for public organizations and, in some jurisdictions, for health information. We have oversight institutions, even though privacy commissioners don't have a lot of power. Courts have started weighing in on the dialogue. Mandatory breach notification has been put into place. So, privacy law is growing up, but it hasn't matured enough yet. We're facing, in Canada, some pretty key existential questions. For example, should we make privacy a human right? But then there are a lot of implications. If privacy were a human right, it would probably not fall under the scope of the Privacy Commissioner anymore. It would probably be under the Human Rights Commissioner.

N: Why would that be a problem? 

SG: Because privacy is a pretty specialized area, and it touches on more than just human rights. It touches on consumer protection, for example. And it's interesting because the government said it wants to create a Data Commissioner too, but I'm not sure what they mean by that. I want to emphasize that an approach where jurisdiction over privacy issues is distributed among various regulators (Competition Commissioner, new Data Commissioner, CRTC, Human Rights Commissioner) risks further diluting and fragmenting existing privacy protections. However, better collaboration between regulators on overlapping issues, such as the Privacy Commissioner working with the Competition Commissioner on competition or consumer protection issues, could provide a broader reach for privacy protections and give more weight to the enforcement process.

N: Why is it so difficult for us and governments to acknowledge privacy as a human right? 

SG: What's really fascinating about privacy is that it sits at the pinnacle of all of the rights that people enjoy. Like mobility: If the government is surveilling you and you have a GPS tracker on you at all times, that's going to affect your ability to go places. Freedom of thought and freedom of speech and freedom of association: There are things that you want to keep to yourself. Even the ability to vote. Think of Cambridge Analytica and Facebook, where people were getting misinformation in a way that was affecting their ability to decide for themselves. And even equality: To be equal, you have to have the ability to control information about you, to be left alone to make your own decisions, and to put out there the parts of yourself that you want to share. It doesn't mean hiding your criminal records. Some things are clearly important. But when it comes down to it, privacy sits at the centre of most, if not all, the rights that we have in Canada. And if you just add privacy to the Human Rights Code, what does that achieve?

N: So, what are some other existential questions facing us?

SG: We're looking at whether a principles-based law – like PIPEDA – or a more rules-based law – like the GDPR – is more appropriate for Canada. We're still questioning the very framework of the law. The cross-border data transfer issue is still a big one. Last year, the Privacy Commissioner came out with a reinterpretation that you need consent to transfer personal information outside of Canada, before he backtracked. It was just really impractical for a consumer to be asked every time a company transfers their data. Also, we're still wondering whether the Privacy Commissioner should have order-making powers. And we keep hearing over and over that the consent model is broken. I don't think consent is broken. We want to make sure we're not too paternalistic with our laws, where we say we know what's best for people — like, forcing people to give up their health information to help us discover the cure for cancer. Some people might want to [refuse], and they should have a right to do that, even if it's incredibly beneficial for society. The point is that we need some flexibility. There's just a lot weighing on consent, and we need more tools to take some of the pressure off it.

N: What are some of the other challenges facing the consent model?

SG: I think artificial intelligence presently poses one of the biggest challenges to the consent regime. A key benefit of AI is its ability to discover new and exciting purposes for personal information. However, it's not realistic to expect individuals to provide freely given and meaningful consent prior to their personal information being processed if they don't know what purposes they are consenting to. The GDPR permits the use of AI without consent if one of the enumerated lawful bases is met – for example, when it's necessary for the performance of a contract, necessary to protect vital interests, or necessary for the performance of a task carried out in the public interest or exercise of official authority. We have similar language in some of the provincial public-sector personal information protection laws that include certain "necessity" (and reasonableness) thresholds as a lawful authority. Perhaps in the context of AI, providing legislated options for a lawful basis beyond consent, based on certain thresholds of "necessity," could be one alternative.

N: What would you most like to see in terms of fixing our legal privacy framework?

SG: First and foremost, the commissioners need more resources and better enforcement powers. I know the pushback to that is, "Well, the privacy commissioners are not even using all the powers they have. They don't really do audits, and they already get a lot of resources, and they're not using them right." And the truth is that just having [those enforcement powers] on the books says something. It sends a message as a country that this is something we care about. I think that would be a step towards maturity.

N: Looking ahead, what are some of the other privacy challenges we are going to face?

SG: To me, one of the next big challenges to privacy is the Internet of thought, which combines AI, IoT, and big data with sensors in the body. Essentially, the Internet of thought is having a cloud interface that gives people instant access to devices and to information by thought alone. We already see signs of this with pacemakers, for example. People can control them with sensors. It's still early days, but it's a very futuristic kind of idea where you're thinking something, and then you can turn on your computer with your thought. And so, when we create new laws, it's important to think ahead, and not be so focused on what the problems are now, but on what they are going to be.

N: What was it like being mentored by the late Ian Kerr?

SG: We were very close. He's the reason I got into this field. I actually went to U of O thinking I was going to do international law. And then, I met Ian and his whole team of researchers, and he took us to Google headquarters. We went to a robot lab. It was so cool, and he got us involved. We helped him with all these really cool projects like looking at body scanners and asking ourselves whether or not they do what they're supposed to do. Or are they just theatre, and is it worth the privacy invasion? He helped me get connected to some lawyers in Vancouver who I then worked with through the CBA. It's such a huge loss. He was one of those deep thinkers who had the ears of judges because he used to train them, and he was involved in the government's AI consultation or advisory panel. It gave me hope that these questions aren't just going to be brushed over; that the deep stuff is going to be looked at, and they're at least going to hear a perspective on how to try to do it right. And, of course, there are others like him, but Ian was unique because he brought together all sorts of different disciplines. He had a background in philosophy and went into law and then brought in the technology angle. And he lit that fire in the students. We used to call it the Kerr Nation. He influenced so many people to go and do amazing things.

This interview has been edited and condensed for publication.