Judge scolds taxpayer who used AI to fight HMRC after it cited non-existent cases

A judge has criticised a taxpayer for using an artificial intelligence chatbot to argue his case in court after it “hallucinated” non-existent legal precedents.
Marc Gunnarsson used AI to help prepare his legal submissions when appealing a decision by HM Revenue and Customs (HMRC). The tax authority was pursuing him for £12,918 in self-employment support payments he had claimed during the pandemic.
HMRC officials spotted three fictitious tribunal decisions cited in Mr Gunnarsson’s skeleton argument, which was submitted the day before the hearing at the Upper Tribunal.
Judge Rupert Jones issued a stark warning about the technology’s unreliability in legal settings. “The accuracy of AI should not be relied upon without checking,” he said. “There is a danger that unarguable submissions or inaccurate or even fictitious information or references may be generated”.
Mr Gunnarsson, who represented himself, had initially won his case at the First-tier Tribunal, arguing that he honestly believed he was self-employed. However, HMRC appealed, and the Upper Tribunal found in its favour, The Telegraph reports.
Judge Jones acknowledged that Mr Gunnarsson was not legally trained and “may not have understood that the information and submissions presented were not simply unreliable but fictitious”. While he was not deemed “highly culpable”, the judge warned that “in the appropriate case, the Upper Tribunal may take such matters very seriously”.
The incident highlights a growing trend of litigants using AI chatbots, which can invent false data and present it as fact – a phenomenon known as “hallucination”.
This issue is not isolated. A junior barrister, Sarah Forey, was recently ordered to pay wasted costs after citing fictitious cases at the High Court. While the court did not confirm AI was used, the judge described the submission of fake cases as “improper”, “unreasonable” and “negligent”.
In another case this year, a father appealing a High Income Child Benefit Charge had his appeal dismissed after using AI to construct his defence. The judge noted that the tool produced irrelevant arguments and that the case “highlights the dangers of reliance on AI tools without human checks”.