Thinking about tech's implications for justice
We'll need to mitigate the unintended consequences of modernizing the courts.
Triage. Framework. Technology. Those three words came up repeatedly in University of Ottawa law professor Karin Eltis’s presentation in July to the CBA’s COVID-19 task force.
According to Eltis, the proper deployment of all three will be vital to the renaissance of the justice system, which has been challenged by the measures put in place to deal with the pandemic.
There can be no turning back to the way things were done before the pandemic. “We must march forward,” she said, but, citing Richard Susskind, she cautioned that our solutions mustn’t perpetuate our old inefficiencies.
That means creating a framework for technology to ensure it enhances access to justice rather than undermining it, says Eltis, whose research focuses on AI, cybersecurity and privacy. So far, the response to COVID-19 restrictions can best be described as crisis-driven, using whatever technology was available as a crutch in a period of uncertainty. For change to be sustainable, it must be purposeful, with appropriate thought given to potential unintended consequences and how to mitigate them.
Part of the framework has to be a consideration of hybrid solutions to justice system problems, and that means establishing a process for effective triage: deciding when technology is suitable, when it is not, and when the best answer is a mix of the two.
Another part of that framework involves deciding what technology to use, and how.
We can’t simply transpose “bricks-and-mortar” processes to virtual courts, says Eltis. There are important questions about how to treat personal information – questions that were also raised in a CBA submission about safeguards for information published online by the courts.
Using private platforms to administer public justice raises its own questions about judicial independence and about the extent to which courts will delegate work to the private sector. The justice system needs to be particularly careful about collaborating with private companies whose business model is data collection.
“It’s very tempting to just resort to a private platform because the infrastructure is there and it’s convenient,” says Eltis. But ultimately, “we can’t leave courts to fend for themselves and say, ‘this platform is free,’” she says. “If something is free, it’s because you’re the product. The justice system and justice services can’t afford to be the product.”
One of the “issues we must struggle with” is determining who gets to shape the technology we use, and how to control the biases that may be built into it. In the U.S., for example, AI is used to speed up court processes and reduce backlogs by informing decisions about bail, sentencing and parole. But cases such as State v. Loomis underscore the need to understand how an algorithm is programmed to reach its decisions.
“We can’t move fast and break things,” says Eltis. “We can move fast but we must be very mindful of the implications.”