

NM judges see AI mistakes creeping into legal cases

Courts have detected AI-generated hallucinations in at least seven NM lawsuits since 2023, leading to warnings and sanctions


A 62-year-old New Mexico man claiming a chronic lifelong disability contended he'd been retaliated and discriminated against by his former employer in 2023.

Among the damages sought in his pro se federal lawsuit: monetary sanctions in the amount of "$355.69 quintillion ($355,687,428,096,000,000,000)," a request that Senior U.S. District Judge Judith Herrera characterized as "quite simply ludicrous."

In the end, he himself had to pay $8,640 in sanctions after the judge determined other aspects of his legal filings were afflicted by a 21st-century phenomenon: AI hallucinations.

With the widespread and transformative use of artificial intelligence, particularly generative chatbots, in daily life, federal and state courts in New Mexico are finding self-represented litigants and some attorneys filing legal cases containing false or misleading information.

For example, one attorney last year filed a pleading that cited at least six nonexistent cases to support his arguments.

"The six cases were fake and likely the handiwork of a ChatGPT or similar artificial intelligence (AI) program's hallucinations," wrote U.S. Magistrate Judge Damian Martínez of Las Cruces in a ruling in the case.

An out-of-state lawyer wrote the legal brief, which the New Mexico attorney didn't read before filing.

Martínez's ruling states that a "hallucination occurs when an AI database generates fake sources of information. To explain how this occurs, AI models are trained on data, and they learn to make predictions by finding patterns in the data."

"However, the accuracy of these predictions often depends on the quality and completeness of the training data. If the training data is incomplete, biased, or otherwise flawed, the AI model may learn incorrect patterns, leading to inaccurate predictions or hallucinations," the judge wrote.

Martínez fined the attorney $1,500, required him to report the incident to the state and federal bar disciplinary committee and ordered him to take an hourlong course in legal ethics on the use of AI in writing.

Courts in New Mexico have detected AI-generated hallucinations in at least seven lawsuits since 2023, sometimes imposing sanctions but more often issuing warnings.

"A lot of self-represented litigants, especially, are relying heavily on AI and they don't know how to check these citations or the statutes, and so they're filing a lot of pleadings that have a lot of earmarks of hallucinations," said state District Judge John P. Sugg of Carrizozo last week.

"I'll read a motion that they filed, and I can't find anything that they've cited to. We've had it with attorneys, too. I think that it's concerning because we've got very limited judicial resources, very limited time and when we're chasing down a bunch of stuff that doesn't actually exist, it wastes a lot of our time."

Last summer, a federal judge in Colorado ordered two attorneys representing MyPillow CEO Mike Lindell in a defamation case to pay $3,000 each after they used artificial intelligence to prepare a court filing riddled with mistakes, including citations to cases that didn't exist.

Christopher Kachouroff and Jennifer DeMaster violated court rules when they filed the document in February. It contained more than two dozen mistakes, including hallucinated cases, meaning fake cases made up by AI tools, according to National Public Radio.

In October 2023, then-Chief U.S. District Judge William Johnson of New Mexico discovered a filing that cited opinions that were fake or nonexistent. He wrote that it appeared to be only the second time a federal court had dealt with a pleading involving nonexistent judicial opinions.

"Quite obviously, many harms flow from such deception, including wasting the opposing party's time and money, the Court's time and resources, and reputational harms to the legal system (to name a few)," Johnson wrote.

Sugg and other judges aren't advocating against the use of AI, so long as the information it produces is accurate.

While the New Mexico Supreme Court is looking into creating a formal policy on AI use in the state judiciary, Sugg said he imposed his own order two weeks ago.

He is requiring any attorney or self-represented litigant who relies on generative AI to draft, edit or modify any pleading, motion or other written document filed with the court to disclose the use of AI at the top of the document.

Filers must also certify that the language drafted by AI was checked for accuracy using traditional methods, such as legal databases, "or by a human being."

"I think that AI is a good tool for a lot of people," Sugg said. "It just needs to be something that we're careful using."