Judge scolds taxpayer who relied on faulty AI to fight tax case
An English judge has criticised a taxpayer for using an artificial intelligence chatbot to argue his case in court after it “hallucinated” non-existent legal precedents.

Marc Gunnarsson used AI to help prepare his legal submissions when appealing a decision by HM Revenue and Customs (HMRC). The tax authority was pursuing him for £12,918 in self-employment support payments he had claimed during the pandemic.

HMRC officials noticed three fictitious tribunal decisions cited in Mr Gunnarsson’s skeleton argument, submitted the day before the hearing at the Upper Tribunal.

Judge Rupert Jones issued a stark warning about the technology’s unreliability in legal settings. “The accuracy of AI should not be relied upon without checking,” he said. “There is a danger that unarguable submissions or inaccurate or even fictitious information or references may be generated.”

Mr Gunnarsson, who represented himself, had initially won his case at the First-tier Tribunal, arguing that he honestly believed he was self-employed. However, HMRC appealed, and the Upper Tribunal found in its favour, The Telegraph reports.

Judge Jones acknowledged that Mr Gunnarsson was not legally trained and “may not have understood that the information and submissions presented were not simply unreliable but fictitious”. While he was not deemed “highly culpable”, the judge warned that “in the appropriate case, the Upper Tribunal may take such matters very seriously”.

The incident highlights a growing trend of litigants using AI chatbots, which can invent false data and present it as fact — a phenomenon known as “hallucination”.
