Anthropic’s lawyer was forced to apologize after Claude hallucinated a legal citation

A lawyer representing Anthropic admitted to using an erroneous citation created by the company's Claude AI chatbot in its ongoing legal battle with music publishers, according to a filing made in a Northern California court on Thursday. Claude hallucinated the citation with "an inaccurate title and inaccurate authors," Anthropic says in the filing, first reported […]
