
Anthropic's Law Firm Blames Claude Hallucinations for Errors

  • Anthropic's law firm apologized on May 15, 2025, for citation errors in an expert report filed in a Northern California copyright lawsuit involving music publishers.
  • The errors arose after using Anthropic's AI tool, Claude, which hallucinated an inaccurate article title and incorrect authors in the legal citation despite the article itself existing.
  • Attorney Ivana Dukanovic of Latham & Watkins stated the mistake was an honest citation error not caught by manual checks and emphasized the article supports expert Olivia Chen's testimony on margin-of-error standards.
  • The court instructed Anthropic to submit a statement explaining the issue by May 15, and a lawyer representing Anthropic described the error as a regrettable and inadvertent citation mistake caused by Claude's hallucinations.
  • This case highlights judicial concern over AI hallucinations in legal filings and suggests growing pressure on lawyers to verify AI-generated content to avoid disciplinary action.
Insights by Ground AI

25 Articles (Left: 1, Center: 2, Right: 2)

Bias Distribution

  • 40% of the sources are Center, 40% of the sources lean Right

Music Business Worldwide broke the news on Tuesday, May 13, 2025.