The Land of Make-Believe
Over two years ago, I wrote about Peter LoDuca and Steven A. Schwartz, attorneys with Levidow, Levidow & Oberman, P.C., a personal injury firm in New York City. In a lawsuit they filed, Schwartz used ChatGPT to help draft a brief.
After ChatGPT provided several cases that appeared to be right on point and supported his client's position, Schwartz asked the program, "what is your source?" ChatGPT indicated the cases could be found in the online research databases Westlaw and LexisNexis.
Shortly after filing the brief, however, attorneys for the other side asserted that the cases Schwartz cited were made up. In an affidavit, Schwartz admitted that he had used OpenAI’s chatbot for his research. He indicated that he was “unaware of the possibility that its content could be false.” He now “greatly regrets having utilized generative artificial intelligence to supplement the legal research performed herein and will never do so in the future without absolute verification of its authenticity.”
Since then, the use of artificial intelligence in the legal profession has exploded, particularly among younger attorneys. It is still not perfect, however.
Judges in two separate federal courts recently had to undo their rulings after lawyers revealed that filings submitted to the courts contained inaccurate case details or misquoted the cases they cited. The filings reflected typical artificial intelligence-style inaccuracies, including "ghost" or "hallucinated" quotes attributed to incorrect or even nonexistent cases.
In New Jersey, U.S. District Judge Julien Neals withdrew his denial of a motion to dismiss a securities fraud case after lawyers revealed that his decision relied on filings with "pervasive and material inaccuracies." Their filing pointed to "numerous instances" of made-up quotes submitted by attorneys, as well as three separate instances in which the outcomes of cited lawsuits appeared to have been misstated, prompting Neals to withdraw his decision.
Meanwhile, in Mississippi, U.S. District Judge Henry Wingate replaced his original July 20th temporary restraining order, which paused enforcement of a state law blocking diversity, equity and inclusion programs in public schools, after lawyers notified him of serious errors in the attorneys' submissions.
They informed Judge Wingate that his decision "relie[d] upon the purported declaration testimony of four individuals whose declarations do not appear in the record for this case." The court subsequently issued a new ruling, though lawyers for the state have asked that his original order be placed back on the docket.
"All parties are entitled to a complete and accurate record of all papers filed and orders entered in this action, for the benefit of the Fifth Circuit's appellate review," the state attorney general said in a filing. The erroneous filing was prepared using AI; the court indicated it had "never seen anything like this" before.
The lawyers involved will almost certainly face repercussions for the erroneous filings, which are just the latest in a string of AI-related inaccuracies. In May, U.S. District Judge Michael Wilner of California imposed $31,000 in sanctions on the K&L Gates and Ellis George law firms for using AI in court filings, saying that "no reasonably competent attorney should out-source research and writing to this technology, particularly without any attempt to verify the accuracy of that material."
Likewise, U.S. District Judge Anna Manasco of Alabama sanctioned three attorneys for submitting erroneous court filings that were later revealed to have been generated by ChatGPT, and referred them to the state bar for further disciplinary proceedings. "Fabricating legal authority is serious misconduct that demands a serious sanction," she said.
So, AI is like fire – it can be a great tool, but if you aren’t careful, you can get burned.
Reg P. Wydeven
January 20, 2026