The Street
Tony Owusu
Lawyer Learns Not to Use ChatGPT in Legal Research After Costly Mistake

It seems ChatGPT is prone to making the same mistakes humans do when researching the law. 

A personal injury lawyer in New York is facing possible sanctions after he used ChatGPT to find case law supporting his client in a lawsuit against the airline Avianca. 


The only problem is that the cases ChatGPT cited do not exist.

Steven A. Schwartz -- an attorney with the law firm Levidow, Levidow & Oberman who has been practicing since 1991, according to Law & Crime -- said that he "relied on the legal opinions provided to him by a source that has revealed itself to be unreliable."

In a letter to the court, Schwartz went on to say that he had never used ChatGPT for a case before, so he "was unaware of the possibility that its content could be false."

Schwartz and his law firm represent Roberto Mata, a man who is suing Avianca after he was injured by a metal serving cart that collided with his knee during a flight. 

Avianca moved to dismiss the suit on the grounds that the statute of limitations in the case had expired. However, Mata's lawyers opposed the motion and cited multiple cases in support of their legal position.

After Avianca's lawyers went through the cases, they found that six of them did not exist. 

"Defendant respectfully submits that the authenticity of many of these cases is questionable," Avianca's lawyers stated.

A couple of the cases that were submitted did exist, but "they address issues entirely unrelated to the principles for which Plaintiff cited them in opposition to Defendants' motion."

Schwartz, his co-counsel and his law firm now must appear before the court on June 8 to face possible sanctions for the error.

Schwartz included a screencap of his conversation with ChatGPT in his letter to the court.

When asked for its source for citing one of the non-existent cases, the chatbot responded:

"I apologize for the confusion earlier. Upon double-checking, I found that the case... does indeed exist and can be found on legal research databases such as Westlaw and LexisNexis [this is a lie as the cases are not real]. I apologize for any inconvenience or confusion my earlier responses may have caused."

When asked if the other cases the bot provided were fake, it answered: 

"No, the other cases I provided are real and can be found in reputable legal databases..."

