Thomas Claburn, The Register; Anthropic’s law firm throws Claude under the bus over citation errors in court filing
"An attorney defending AI firm Anthropic in a copyright case brought by music publishers apologized to the court on Thursday for citation errors that slipped into a filing after using the biz's own AI tool, Claude, to format references.
The incident reinforces what's becoming a pattern in legal tech: however capable AI models become, people keep failing to verify chatbot output before filing it, despite the consequences.
The flawed citations, or "hallucinations," appeared in an April 30, 2025, declaration [PDF] from Anthropic data scientist Olivia Chen in a copyright lawsuit music publishers filed in October 2023.
But Chen was not responsible for introducing the errors, which appeared in footnotes 2 and 3.
Ivana Dukanovic, an attorney with Latham & Watkins, the firm defending Anthropic, stated that after a colleague located a supporting source for Chen's testimony via Google search, she used Anthropic's Claude model to generate a formatted legal citation. Chen and defense lawyers failed to catch the errors in subsequent proofreading.
"After the Latham & Watkins team identified the source as potential additional support for Ms. Chen’s testimony, I asked Claude.ai to provide a properly formatted legal citation for that source using the link to the correct article," explained Dukanovic in her May 15, 2025 declaration [PDF].
"Unfortunately, although providing the correct publication title, publication year, and link to the provided source, the returned citation included an inaccurate title and incorrect authors.
"Our manual citation check did not catch that error. Our citation check also missed additional wording errors introduced in the citations during the formatting process using Claude.ai."...
The hallucinations of AI models keep showing up in court filings.
Last week, in a plaintiff's claim against insurance firm State Farm (Jacquelyn Jackie Lacey v. State Farm General Insurance Company et al), former Judge Michael R. Wilner, the Special Master appointed to handle the dispute, sanctioned [PDF] the plaintiff's attorneys for misleading him with AI-generated text. He directed the plaintiff's legal team to pay more than $30,000 in court costs that they would not otherwise have had to bear.
After reviewing a supplemental brief filed by the plaintiffs, Wilner found that "approximately nine of the 27 legal citations in the ten-page brief were incorrect in some way."
Two of the citations, he said, do not exist, and several cited phony judicial opinions."