Building trust in our justice system using technology
Legal tech can help improve access to justice, as long as it's aligned with the public’s needs.
It's an all-too-familiar lost luggage story with an unusual ending. On September 18, 2021, Jessica Kalynn traveled from Vancouver to Dubai on an Air Canada flight. Her luggage arrived nearly halfway into her six-day business trip. Kalynn asked for $2,120.67 in compensation, and Air Canada gave $500.
Kalynn decided to file a claim against Air Canada through the Civil Resolution Tribunal (CRT) in British Columbia. The hearing was held online, with Kalynn representing herself. On July 15, 2022, less than a year after the incident, the CRT ordered Air Canada to pay an additional $700 in compensation.
The unusual part of the story isn't so much that Kalynn got Air Canada to pay for her delayed luggage. It's the speed and ease with which her case was resolved. Even then, at ten months, it took longer than the three months typical for CRT case resolution.
It's a reminder, too, that technology has the potential to improve access to justice. But how many legal disputes can reliably be resolved with that kind of speed and at that cost? And can technology help ensure people have confidence in our legal system?
Preventative justice
The pandemic revealed how fragile or resilient our public institutions can be. After a rocky start, the rapid changes in the justice system, from online hearings to e-filing, significantly reduced costs and made the process easier to navigate for at least some people in the court system, provided they had decent internet access.
Even then, it takes more than good broadband. The key to the CRT's success is its relentless focus on user experience. Since 2014, the CRT has resolved more than 27,000 legal disputes, most of them small claims, affecting more than 50,000 people. Nearly 22,000 of those cases would otherwise have gone through the B.C. provincial court.
"We work using human-focused design and applying behavioural insights principles," says Richard Rogers, former chair of the CRT and acting executive director of its Residential Tenancy Branch. "We focus on our end users. We're not building it for lawyers. We didn't do extensive consultations on system design with lawyers and judges because we weren't designing it for them."
When someone is looking for a lawyer, one of the first questions they ask is what their chances are of winning their case. The CRT offers users a Solution Explorer tool that explains a person's chances of success in a claim and provides legal information about their issue. It then gives users options for resolving the issue on their own. Rogers says the next big breakthrough could be using AI as a "driver" to assist people through the legal process.
"Eighty-five percent of Solution Explorer uses are based on small claims or strata issues," says Rogers. "If we can use AI to link Solution Explorer to legal decisions in those areas, that's an opportunity. We can have it say here's how the CRT decided the issue in similar cases, what's the evidence required to be successful on this issue and the user can consider whether the law supports them."
By putting the public's needs front and centre, the CRT lets people access the justice system on their own terms. It's something Shannon Salter is focused on. As deputy minister of justice, part of her role includes fostering innovation in the justice system. Her eight years as head of the CRT gave her insight into how to bridge the gap between efficient, effective services and ensuring that the public is happy with the process.
"When we were building CRT, the biggest concern was being flexible and offering choice," says Salter. "We didn't want people to feel forced to use it but for it to be convenient to use. If it was designed badly, people wouldn't want to use it. Tech isn't about getting everyone online. It's about finding pathways that are accessible."
Creating public policy
Not putting the public first can produce devastating outcomes. In 2013, the Netherlands began using an algorithm to detect fraud in childcare benefit applications. It led to 20,000 families being falsely accused of fraud and ordered to repay hundreds of thousands of euros. An investigation eventually found that nationality, in particular citizenship status, was one of the determining factors used in the algorithm. Amnesty International slammed the program, calling it a form of racial profiling. Some parents even lost custody of their children. Some victims died by suicide. After a parliamentary committee report found that the government, MPs, judges and court administrators had violated the rule of law, the country's cabinet resigned over the scandal in 2021.
The fallout is ongoing. In a 2022 article from Politico, Chermaine Leysner, one of the victims waiting to be paid back, said, "If you go through things like this, you also lose your trust in the government. So it's very difficult to trust what [authorities] say right now."
The Dutch story is a cautionary tale. But that doesn't mean we must refrain from using technology. Taking that approach would also undermine public trust in aging legal institutions.
Karen Eltis, a law professor at the University of Ottawa and a member of its Centre for Law, Technology and Society, says the time has come for the courts and government to tackle the complexities of technology and justice.
"We have been blindly deferring to technology because of discomfort and trepidation," says Eltis. "We need to empower people to do critical thinking when using technology. For example, with Waze, when it says to turn left, we just follow it even if we know we should turn right. We even have algorithms telling us what articles to read. As a society, we're checking out of these decisions. Lawyers have a professional responsibility to think critically and be mindful about technology. Tech is a tool, not a crutch."
The Law Commission of Ontario's (LCO) 2022 report, "Accountable AI," calls for the provinces and the federal government to commit to trustworthy AI for government systems. The goal is to move away from dealing with AI issues on a case-by-case basis and to adopt policies that make the government and the courts accountable when using AI. The report offers 19 recommendations, including establishing a trustworthy AI framework in legislation, ensuring administrative tribunals and courts adopt it, developing performance metrics, and creating a dedicated criminal justice AI framework.
"We postponed these questions about framework and governance in the past," she says. "With the pandemic, we have thrust conservative institutions into the digital age. The legal system is going through a transformative change similar to the Industrial Revolution. Now is the time where we have to deal with the issues."
The federal government has taken a limited approach to a trustworthy AI framework through the Treasury Board's Directive on Automated Decision-Making. The directive has its pros and cons.
The LCO report found the directive leaves administrative law gaps: it does not explicitly require reasons for decisions, nor does it affirm an individual's right to a hearing. It does, however, address systemic bias.
One major issue is that the federal directive does not cover criminal law, where the use of AI is highly problematic. Last year, in a joint investigation, the privacy commissioners of British Columbia, Quebec and Alberta, together with the federal privacy commissioner, found that Clearview AI violated privacy law. The U.S.-based tech company collects images posted online and allows law enforcement to use its facial recognition software to identify people. The RCMP, along with the Calgary Police Service and the Toronto Police Service, used the software. While Clearview AI no longer operates in Canada, the incident is a clear sign of the need for more accountability.
"We have to align technology with our democratic values and framework," says Eltis. "Tech is a tool, not a crutch. We're not catching up, we're letting it lead us. We're letting the gap increase and trust decrease. Technology bypasses legal protections."
The use of AI in the courts remains a longstanding issue. Tech companies continue to sell artificial intelligence as a way to make smarter, quicker decisions, with algorithms analyzing data to determine outcomes. The problem is that AI is being presented as a catch-all solution. Eltis says decision-makers need to "think fast and slow" when deciding how to use AI.
"When judges use AI, they're afraid to go against the algorithms," says Eltis. "We need to think through when we want to use AI. For example, traffic tickets is something were we can think fast about because those decisions are usually quick with low stakes. We need to think slow about cases that need discretion like the Persons case. That case was an outlier and focused on constitutional law. If we ran that case through an algorithm, it would uphold the law. Those type of issues need to be dealt with by ongoing human oversight."
Quebec is one of the leaders in legislative reform. Bill 64 modernizes Quebec's privacy laws along the lines of the European Union's General Data Protection Regulation (GDPR). The bill includes a right to be forgotten and requirements to obtain consent before using personal information. The big shift involves automated decision-making. Individuals will be informed when an automated decision is made about them and will have the opportunity to correct any underlying personal information used to make the decision, or to submit information as a way of appealing it. They will also have the right to be told how the decision was made.
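What those obligations could look like inside a system is easier to picture with a toy example. The sketch below is a hypothetical illustration only, not a rendering of Bill 64's actual requirements or of any government tool: it imagines a decision record that lets an individual be notified of an automated decision, see the personal information it relied on, and submit a correction.

```python
# Purely illustrative: a hypothetical record an automated decision system
# could produce to support transparency obligations of the kind Bill 64
# describes. Field names and structure are invented for this sketch.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AutomatedDecisionRecord:
    decision: str                       # the outcome communicated to the individual
    personal_info_used: dict            # the personal information the decision relied on
    explanation: str                    # plain-language account of how the decision was made
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    correction_requested: bool = False  # set when the individual disputes the inputs

    def notify(self) -> str:
        """Draft the notice telling the individual an automated decision was made."""
        return (
            f"An automated decision was made on {self.decided_at:%Y-%m-%d}: {self.decision}. "
            f"Reason: {self.explanation} "
            "You may ask to correct the personal information used or request a review."
        )

    def request_correction(self, corrected_info: dict) -> None:
        """Record corrected personal information submitted by the individual."""
        self.personal_info_used.update(corrected_info)
        self.correction_requested = True

# Example: a benefits eligibility decision the individual can contest.
record = AutomatedDecisionRecord(
    decision="Application refused",
    personal_info_used={"declared_income": 52000, "household_size": 3},
    explanation="Declared income exceeds the program threshold for a household of three.",
)
print(record.notify())
record.request_correction({"declared_income": 48000})
```

Whatever the details, the broader point stands: notice, explanation and correction are easiest to honour when they are built into the system rather than bolted on afterward.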
What makes technology so tricky is not only the speed of change but our awkwardness with it as a profession. Even with the rise of legal tech within the past 25 years, there's still work to be done in making lawyers feel more comfortable about adopting and managing technology. "It's not a tech problem; it's a people problem," says Eltis. "People aren't comfortable overruling technology. They're embarrassed about what they don't know, and that makes them feel uncomfortable. We need basic tech competence to align with the values of law and critically analyze how the technology changes us."
The future of public legal tech
The hope for the future lies in reinventing legal systems tailored to the public's needs, where human rights, privacy and due process are encoded into the system. Valentin Callipel, a project manager at the Université de Montréal's Cyberjustice Laboratory, says the future of technology and the law lies in empowering people to make informed decisions about legal disputes.
"We talk about improving case management, e-filing, remote hearings for the administration of justice, but we haven't talked about technology to help the end users," says Valentin Callipel, Project Manager at Cyberjustice Laboratory. "Instead of e-filing, how can we leverage ADR (alternative dispute resolution) as a better source of information and problem resolution for the public? We have the opportunity now to think about the role of justice in people's lives."
In 2016, the Cyberjustice Laboratory created an online dispute resolution assistance program as a pilot project for Quebec's Office of Consumer Protection. More than 11,000 people have been referred to the program, with more than 5,000 files settled. The program has a 90% satisfaction rate from consumers and businesses, according to Callipel, and the average time to reach an agreement is 25 days.
"We need more accessible online systems, and that will build trust with the public and interactivity," says Callipel. "Right now, people have a legal problem and have to go to a particular place. Usually, they don't do it and they don't have their day in court. People want to be able to fix their justice problems on a Sunday night before going to bed. That type of service builds relationships with the public."
Last year, the Cyberjustice Laboratory collaborated with the Administrative Housing Tribunal to create JusticeBot, a program where people can ask legal questions about housing. Launched last summer, the website has had more than 7,700 users and generated more than 34,000 responses. "The best way is to work [with government] to have a government-wide strategy and to have a clear vision," says Callipel.
Ultimately, it's only by engaging the public that we can earn its trust that our justice system can meet people's needs in a way that is accessible, affordable, and protects their individual rights. "Empower the people," says Callipel. "Give them the ability to understand the law and enforce the law. This is the future of access to justice."