High Court judge condemns barrister’s negligence after fictitious cases found in court papers
A High Court judge has issued a damning ruling after a junior barrister included multiple fictitious cases in court submissions, triggering serious concerns about negligence and the use of artificial intelligence (AI) in legal practice.
The case, concerning a homeless claimant seeking accommodation from Haringey Council, took an unexpected turn when the defendant discovered five fabricated legal authorities in the claimant’s legal documents. Mr Justice Ritchie, presiding over the case, was unequivocal in his condemnation of the barrister’s conduct.
“These were not cosmetic errors; they were substantive fakes,” the judge remarked, stressing that no proper explanation had been given for the inclusion of such fictitious cases. He expressed disbelief over how such serious lapses occurred, stating: “I have a substantial difficulty with members of the Bar who put fake cases in statements of facts and grounds.”
Although the judge stopped short of confirming whether AI was responsible for the fake cases, he made clear that if the barrister, Sarah Forey, had relied on AI without verifying its output, doing so would have been negligent. “On the balance of probabilities, I consider that it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading,” said Mr Justice Ritchie.
The judge further noted that Forey, from 3 Bolt Court Chambers, had acted “improperly, unreasonably, and negligently.” He found that Forey had deliberately included the non-existent cases in her filings, with no regard for their authenticity, and had most likely sourced them from a questionable or unreliable platform. He also rejected the suggestion that Forey could have “photocopied” cases that do not exist, underlining the severity of the lapse.
In response to these breaches of professional conduct, the judge ordered Forey and the solicitors from Haringey Law Centre each to pay £2,000 personally towards the council’s legal costs.
The case has sparked a wider discussion on social media and among legal professionals, particularly in relation to the increasing role of AI in the legal industry. Adam Wagner KC, a prominent barrister at Doughty Street Chambers, commented on LinkedIn, noting that while the court did not confirm AI as the cause, it seemed a “very reasonable possibility.” Wagner stressed the importance of using AI cautiously: “AI can be a time saver, but the key lesson is that AI should only ever be the starting point of a research or drafting task.”
The controversy also highlights growing concern within the legal community about the pitfalls of AI-generated content. Judges recently received updated guidance on identifying AI-generated submissions, particularly in light of a growing number of cases in which AI tools “hallucinate”, inventing legal precedents that do not exist.
This ruling underlines the critical responsibility of lawyers to ensure the accuracy and integrity of their submissions, whether or not AI is used. It also stands as a cautionary tale for the profession: negligence in this area can have serious repercussions.
With the legal sector increasingly relying on technology, this case serves as a stark reminder that human oversight remains essential to the credibility and accuracy of legal work. Whether AI-generated or not, false information in legal pleadings can seriously undermine trust in the integrity of the justice system.