
The art of persuasion in the age of AI

GenAI may very well become a lawyer’s new best friend, but let’s not hand over the corner office


Humanity’s capacity for creativity has long been seen as one of our defining traits, setting us apart from the rest of creation. Creativity in writing has deep roots, dating back to when humans etched poetic works like the Epic of Gilgamesh onto clay tablets—as ancient as civilization itself.

In legal writing, creativity often plays second fiddle to technical precision and logical rigour. Yet we should never underestimate the weight of wit, the might of the muse, or the power of pithiness.

Until recently, legal writing depended solely on the brute force of creative human intellect. Lawyers typed late into the night, fuelled by coffee, casebooks, CanLII, and confidence. But those days may be waning with the rise of generative AI (GenAI).

With tools like Microsoft’s Copilot nudging their way into the legal profession like an overeager articling student, we must ask: Is GenAI here to join the team, or is it taking over the firm?

If you’re a practising lawyer (like me) or an educator (also like me), the rise of GenAI feels personal. On one hand, it offers valuable time-saving opportunities. On the other, there is an ancient tradition to protect—not to mention our sense of professional identity.

So, how should lawyers, educators, and other professionals navigate this new era?

The Promise of Generative AI

GenAI tools like Copilot or ChatGPT don’t think creatively the way you or I do. They are algorithms trained on vast datasets that generate content by predicting likely words and phrases, without truly understanding context or meaning in any human sense.

When Microsoft launched Copilot late last year, it promised to do for legal writing what the microwave did for leftovers: make life simpler. Copilot aims to take on the heavy lifting of writing by automating many routine tasks. Integrated into Microsoft’s Office suite, it offers real-time suggestions that streamline repetitive work, like drafting pleadings or formatting documents.

If you’ve ever spent time wrestling with paragraph numbering or adjusting formatting to meet a page limit, you may have fantasized about having something do it for you. The pitch is faster, more precise legal writing, freeing lawyers to focus on substantive tasks like crafting creative and compelling arguments, strategizing, and, of course, billing hours. It sounds like a win-win, right?

The Professional Quagmire

Not so fast.

While Copilot and other GenAI tools are exciting, they also raise significant ethical concerns.

You’ve probably heard horror stories of AI misuse. One major issue is the risk of breaching client confidentiality. GenAI processes vast amounts of data with little regard for ownership. Every interaction involves data being processed and possibly stored, raising questions about where this information goes.

Imagine using Copilot to draft a contract for a client with sensitive corporate data. Could that data be exposed or stored in ways that breach confidentiality? Lawyers have a duty to protect client information, and AI tools raise thorny questions about how that information is handled. Microsoft says it prioritizes privacy and confidentiality in Copilot, but this territory remains relatively uncharted. It’s not enough simply to trust companies’ promises—lawyers must remain vigilant, as they’re accountable to their clients, not to the AI.

And what about a lawyer’s duty to act competently and with integrity? GenAI currently has a well-documented tendency to hallucinate—to make things up with complete confidence. Cases in the U.S. and Canada show what can go wrong when a lawyer uncritically relies on AI-generated submissions, complete with fictitious case law.

My main concern, however, extends beyond technical issues, which will likely improve with future AI models. Legal writing is more than just a technical exercise; it’s a craft and, dare I say, an art. Effective legal writing requires human engagement on both ends of the equation—one to persuade and the other to be persuaded.

While GenAI can synthesize data and generate content, misuse and overuse could degrade one side of that equation, with negative repercussions for the profession and the administration of justice.

In persuasive legal writing, lawyers must remain both the gatekeepers and the creators.

The Educator’s Dilemma: To Teach or Not to Teach AI?

The rise of GenAI presents another complex dilemma for legal educators. Should we teach law students how to use tools like Copilot, or will doing so undermine the development of essential legal writing skills?

On the one hand, equipping students to use GenAI tools is a practical response to an evolving profession. Today’s law students will likely need to understand these tools as they enter the workforce. Knowing how and when to leverage GenAI may soon be as important for future lawyers as knowing how to note up a case.

On the other hand, many educators—including me—worry that over-reliance on GenAI could produce a generation of lawyers who excel at finding prompts for optimal AI output but lack the deeper understanding that comes from wrestling with legal principles firsthand. Legal writing isn’t just about generating content; it’s about creative mental rigour—the skill of building persuasive arguments, synthesizing precedents, countering weaknesses, framing facts, and thinking critically and creatively about the law.

There’s a very real risk that over-dependence on GenAI could atrophy the writing skills of a profession that depends on its ability to persuade through carefully crafted arguments. Legal writing, like any skill, improves through practice and hard work. While GenAI can save time, it may shield lawyers from the effort required for personal and professional development—an effort that should not be avoided.

The Verdict: A Cautious Embrace of the Future

So where does that leave us? GenAI tools like Copilot offer exciting potential for the legal profession. They can streamline workflows, reduce the drudgery of legal writing, and allow lawyers to focus on more substantive—and often more rewarding—tasks. For educators, these tools represent the future, and integrating them into curricula may be inevitable—but we must proceed with caution.

The risks are real. Lawyers and educators must ensure that AI remains a tool, not a crutch. Human creativity, judgment, and dedication should never be sacrificed to automation.

Ultimately, legal writing is not just about winning arguments; it’s about upholding traditions that date back centuries and preserving human capacities that date back millennia—not to mention protecting human rights and interests. GenAI may very well become a lawyer’s new best friend, but I caution against giving it the corner office.