
There’s evidence that AI has downsides for the legal industry

Matt Levin Jul 19, 2023

Sam Altman, CEO of OpenAI, which makes ChatGPT, testifies before a Senate panel in May. Artificial intelligence may reduce employment in the legal field and disrupt how practitioners verify facts. (Win McNamee/Getty Images)

Caretha Perry has been a paralegal for more than 20 years, mostly with a legal division of the U.S. Army. Generative artificial intelligence wasn’t really on her radar until a couple of months ago, when she took a “critical thinking” workshop the Army was offering.

Her instructor had ChatGPT up on the classroom projector and told it she wanted song lyrics that were a blend of Madonna and Stephen King.

“It started creating this Stephen King-ish Madonna song,” Perry said. “It was pretty good, what it was typing, you know?”

Most amazing was just how fast the AI was writing all of this.

“And I just thought, ‘This is replacement,’” Perry said. “‘This would replace Madonna, this would replace a Stephen King, this is gonna replace a lot of people.’”

Including, maybe, her.

Paralegals don’t typically spend their workdays creating Madonna-Stephen King mashups.

But a lot of Perry’s job is writing and research. Looking up a statute or case law, seeing if it’s relevant to what her boss wants, taking a first pass at a brief. Those are the parts of the job she enjoys the most, and the parts new legal AI tools are already pretty good at.

Her boss has joked about this with her.

“He had already said, ‘I just open up the software, I type in, I want cases, statutes, you know? I’ve got it instantly. No need for you,'” Perry said.

A recent Goldman Sachs analysis estimated that 44% of work tasks in the legal industry could soon be performed by AI.

Of course, that 44% figure is an educated guess.

Toni Marsh, a lawyer who runs George Washington University’s paralegal program, doesn’t think paralegals will go the way of travel agents or VCR technicians.

“They’ll just do more stuff, right?” Marsh said. “So if it takes me two hours to do a brief instead of four, then I’ll just write two briefs in half a day instead of one.”

Paralegals perform a range of tasks for law firms, everything from interviewing witnesses to organizing documents for trials to putting together charts and graphs. Good writing and research skills are at the core of the job, which can often pay six figures.

Marsh is already tweaking her curriculum for next semester to incorporate AI. Maybe more group discussions about what makes a good legal argument, less actually writing that argument.

“Because now with ChatGPT, everybody turns in these perfect, five-paragraph essays that are structurally sound,” Marsh said. “So that’s something that I no longer need to teach them.”

While paralegals will likely be on the front lines of the AI revolution, some lawyers are also worried.

Attorney Stephen Wu at Silicon Valley Law Group was around for previous technological leaps like the personal computer and the internet. Those advances made lawyers far more efficient, but also added jobs in the legal profession.

But Wu said AI may play out differently.

“I think in this case, what we’re talking about is creating more jobs for companies that are vendors to law firms, rather than the law firms themselves,” he said.

And while Wu thinks AI will make some aspects of a lawyer’s job far easier, he expects certain parts to get tougher over the next few years.

Like proving reality.

“You may be an opponent of evidence, where the other side is trying to introduce something that is an actual deepfake and trying to pass it off as something real,” Wu said. “And then you have the flip side, where you’re the proponent of the evidence, where the other side claims it’s a deepfake and it’s real.” 

AI is getting really, really good at generating convincing images, video and audio.

Let’s say you claim to have a recording of a certain high-profile public radio host confessing to a crime.

An AI-cloned clip like that doesn’t sound exactly like Kai Ryssdal. But you can whip one up in about half an hour on a voice-cloning service that costs $5 a month. And that tech is just getting better.

“Give it two months, give it three months,” said Hany Farid, an expert in digital forensics at the University of California, Berkeley.

Farid said he’s already being asked by law firms to help verify whether multimedia evidence is an AI-generated deepfake.

He does a lot of pro bono work. But for his paying clients, he’s not cheap.

“I charge $1,000 an hour. I cost what an expensive lawyer costs,” Farid said. “I can only take so many cases, and there’s only a handful of us who work at this level.”

So while artificial intelligence might save clients time and money on some legal work, it could make authenticating basic evidence much more expensive. Having something “on tape” isn’t as conclusive as it used to be.

Farid said this will be more of a temporary problem than a permanent one: watermarking both AI-generated and authentic recordings will eventually solve much of it.

Until that happens, things may get messy in the judicial system.

Think that will hold up in court?
