Sources

Legal trend / database

  1. Damien Charlotin’s AI Hallucination Cases Database damiencharlotin.com
  2. Washington Post: “Lawyers using AI keep citing fake cases…” washingtonpost.com
  3. Business Insider: “AI hallucinations in court documents are a growing problem…” businessinsider.com

Sanctions

  1. ABA Journal: “Special master imposes $31,100 sanction…” abajournal.com
  2. Reuters: “Trouble with AI hallucinations spreads to big law firms” reuters.com
  3. Mata v. Avianca order (PDF) gibbonslawalert.com

Consumer liability (Air Canada)

  1. CanLII official decision: Moffatt v. Air Canada, 2024 BCCRT 149 canlii.ca
  2. The Guardian coverage theguardian.com

“Conciseness increases hallucinations”

  1. Giskard analysis giskard.ai
  2. TechCrunch report techcrunch.com

Search / AI Overviews

  1. Washington Post: Google scales back AI answers after errors washingtonpost.com
  2. The Verge: “now it’s telling us to put glue on our pizza” theverge.com
  3. Google blog response blog.google

Healthcare

  1. JAMA Network Open: Oncology chatbot accuracy study jamanetwork.com
  2. JAMA Network Open: Pathology report simplification and errors jamanetwork.com
  3. The Verge: Med-Gemini “basilar ganglia” error theverge.com

Stanford legal studies (error rates & RAG)

  1. Stanford HAI: “AI on Trial… 1 out of 6 (or more)” hai.stanford.edu
  2. Paper: “Hallucination-Free? Assessing the Reliability of Leading AI Legal Research Tools” arxiv.org
  3. Stanford Law: “Hallucinating Law” law.stanford.edu