University of Waterloo

04/16/2024 | Press release | Distributed by Public on 04/16/2024 07:41

Q and A with the Experts: Legal implications of generative artificial intelligence

This interview has been condensed. Read the full article on the Cheriton School of Computer Science website.

Generative AI (GenAI) is a subset of artificial intelligence that creates content increasingly difficult to differentiate from what humans create. GenAI text reads well, the photos look authentic, the audio files sound real, and the videos look convincing. Professor Maura Grossman of the Cheriton School of Computer Science at the University of Waterloo, who is also principal at Maura Grossman Law, an eDiscovery law and consulting firm, answers some common legal questions about GenAI.

Does GenAI pose challenges to the justice system?

Yes, because we need to determine whether purported deepfake evidence should be admitted in civil and criminal trials. Part of the challenge is that the admissibility standard is low. Evidence only has to meet a preponderance standard, meaning it is more likely than not to be what the proponent says it is. Someone can play a recording of your voice in court, and I can testify that I've spoken with you many times and know it's you in the recording. But just because it sounds like you doesn't mean the recording is of something you actually said.

Can lawyers and litigants use ChatGPT to prepare court filings?

Yes, both lawyers and self-represented people have used ChatGPT to prepare filings. One problem is that GenAI can draft briefs with citations that sound authoritative but are not real. On the positive side, people who can't afford a lawyer can use GenAI to generate customized legal papers specific to their circumstances and location.

Can judges and their staff use GenAI for research or to draft opinions?

At least three judges have used GenAI to draft opinions. You might think, what's the problem, since GPT-4 has passed the U.S. bar exam? The concern is that ChatGPT can provide different answers to the same question, not to mention hallucinate false information.

This series is produced for the media, and its purpose is to share the expertise of UWaterloo researchers. To reach this researcher, please contact media relations.