
How law students are learning about AI

Queen's Law is at the forefront of the race to reform legal education.


Cavina Tsoi was curious about artificial intelligence. Even though her background was in labour and employment, the Queen's University third-year law student wanted to do something non-traditional. She signed up for the Conflict Analytics Lab (CAL), a one-year practicum course to work on tech projects using artificial intelligence.

"The program helps me gain comfort with the technology," says Tsoi. "AI is growing, and the program gives me the building blocks for its continued use and growth into the future."

As generative AI tools like OpenAI's ChatGPT make inroads into the legal industry, the lingering question is how law schools will adapt. Law schools are not traditionally known for innovation in teaching students practical legal skills.

But Queen's is already making an impact. On February 15, CAL announced that its open source legal AI tool, OpenJustice, is available for free to all Canadian law schools. OpenJustice is a generative AI tool designed to analyze legal documents and answer legal questions.

"We want a large language model that is accessible to people who work in access to justice," says Samuel Dahan, director of Conflict Analytics Lab and law professor at Queen's and Cornell University. "Having an open source model allows us to give access to legal aid organizations, the courts and lawyers. We want a tool that's available to every lawyer in the world."

Most discussions about law schools and generative AI focus squarely on whether law students can and should use ChatGPT. What makes CAL unique is its interdisciplinary work: the program is a collaboration between the law school and Queen's Smith School of Business.

Dahan says the key is to have a program that doesn't solely focus on coding or a specific tool.

"The goal of our lab is to train students to build, use and evaluate technology," says Dahan. "Lawyers need to know how to work with vendors, whether it's customizing technology for your firm or if you're working as in-house counsel and buying software."

Law students have the chance to work with business and computer science students on a daily basis. Tsoi, who is working on OpenCourt, is classifying employment cases and communicating with computer science students about how information should be organized. OpenCourt is an open source tool that helps members of the public navigate employment issues, such as how much a person is entitled to if their employment is terminated or whether their layoff is legal. Deel, a global human resources tech company, announced last year that it would use CAL's AI technology from OpenCourt to classify whether workers are independent contractors or employees.

"Law is a very particular area where one word in a sentence can change the meaning entirely," says Tsoi. "There are different considerations in applying AI to the legal field, for example, the length of the entry of a field can be important. Even entry of a summary can result in vastly different results."

Because Tsoi is working with the technology, she's learning foundational skills about generative AI, including the ins and outs of prompt engineering and determining what constitutes a good response.

"It's interesting to learn what kind of prompting helps to get the best results," says Tsoi. "We're also looking at what makes a good legal answer versus asking a lawyer. I don't have a tech background and this has given me the building blocks to work with AI models into the future."

Kieran Woboditsch-Velasco, a third-year JD/MBA student, was drawn to CAL because of its work in access to justice. He noticed his fellow MBA students using ChatGPT last year and began to use it last summer.

"They're trying to increase accessibility in legal services with OpenJustice," says Woboditsch-Velasco. "In my ethics class, I heard about how people with high income can get legal services and on the other end for people who can't afford legal services they receive legal aid. Then there are a lot of people in the middle who can't get legal services. How do we solve this problem? Offering free legal information can help."

Woboditsch-Velasco, along with another JD/MBA student, is working on business development for CAL. They are developing a case competition where law, business and computer science students will come together to solve a problem, using OpenJustice as a guide. They're also working on a student ambassador program where law students advocate for the use of AI tools.

"I've learned so much from this program," says Woboditsch-Velasco. "You may have a good idea but have to collaborate and execute it very well. You learn to lean on people who have tech expertise. CAL takes you in and makes you feel comfortable. When it comes to the business side, they told me we're on equal footing and that feels great."

The legal profession is still learning what generative AI is and how to use it. David Liang, a young lawyer who works as a program manager for CAL and operations manager with Deel, thinks we still have a long way to go. He and his colleagues recently gave a presentation to the Washington State Bar Association about the potential of generative AI. In the middle of the presentation, a lawyer asked what the difference is between a Google search and generative AI.

"That shows you the lack understanding of how generative AI works," says Liang. "We talked about how generative AI like ChatGPT is good at making a first draft and can help junior lawyers more efficient. We need to improve legal technology."

Liang didn't plan on working in legal tech. He had long been interested in technology, and at one point he weighed going to law school against pursuing a computer science degree. When he came to Queen's University Faculty of Law in 2018 as a first-year law student, CAL had just launched, and he knew he wanted to join. While in law school, Liang worked on the employment algorithm, classifying hundreds of cases for computer science students.

After graduating in 2021, he articled at a boutique tax law firm, where his experience working with technology was quite different from his time at the lab. When CAL approached Liang in 2022 and asked him to join full-time, he jumped at the chance.

"I see the processes firms use as archaic," says Liang. "As I was working, even making automated changes like improving Excel spreadsheets was difficult. That's where I was inspired to go all in. I'm interested in legal tech and would rather be a part of the disruption. I don't think I would be in legal tech if not for this opportunity."

Dahan wants to expand these types of opportunities to other schools. Last year, Queen's University entered into a partnership with Paris Dauphine University, where LL.M. students will work to add European Union legal data to OpenJustice to make the tool more accessible to people in Europe. Law students from Paris Dauphine University are also working with Queen's University law and computer science students on creating a system to judge the validity of digital evidence using OpenJustice. The system is also being developed for the International Court of Justice to combat deepfakes.

"This is an opportunity for law students to see how research directly affects the law," says Dahan. "We want to establish standards on what's good for legal AI."

Dahan says that for the legal profession to be ready for the future, law students need to know how to use AI. Even though Tsoi is headed towards a career as a labour and employment lawyer, she says what she's learned will be instrumental in her practice.

"Artificial intelligence is not going away," says Tsoi. "We're doing law students a disservice if schools avoid this topic. Generative AI can be and should be treated like a tool that can help lawyers work better and faster. It's not something we have to scared of."