Tribunal refers barrister to BSB after he cited a fictitious case generated by ChatGPT
A barrister has been referred to the Bar Standards Board (BSB) after citing, in immigration tribunal proceedings, a non-existent case generated by ChatGPT.
Muhammad Mujeebur Rahman, who represented an appellant in an immigration matter, included in the grounds of appeal a reference to “Y (China) [2010] EWCA Civ 116”. He claimed the ruling supported arguments around delay. The tribunal later confirmed the case did not exist.
At a June 2025 hearing, Rahman was challenged on the reference. He initially suggested he had intended to cite established cases including YH (Iraq), R (WJ) v Secretary of State for the Home Department and Bensaid v United Kingdom. After a short break, he told judges he had conducted ChatGPT research during the lunch adjournment and maintained that “Y (China)” was genuine. He insisted it was decided by Lords Justices Pill and Sullivan alongside Sir Paul Kennedy.
The tribunal gave him a deadline either to produce a copy of the judgment or to explain the error. While the panel proceeded with the next case, Rahman passed a nine-page internet printout to the tribunal clerk. The document contained “misleading statements”, including repeated references to the fictitious “Y (China)” case under the correct citation for YH (Iraq). Notably, it made no reference to the principal case law on delay.
Before the deadline expired, Rahman wrote to the tribunal stating he had meant to cite YH (Iraq) and apologised for not supplying the full and correct case name. He explained that he had been suffering from “acute illness” while drafting the appeal and referred to a hospitalisation in Bangladesh with diabetes, cholesterol issues and high blood pressure. He also stressed his family responsibilities as the father of four children.
At a further hearing, Rahman accepted that he had relied on ChatGPT both to draft the original appeal and to create the document handed up in court. He argued, however, that he had been “misled by the search engine and is thus also a victim”.
The tribunal rejected this explanation. It found Rahman had not checked the authority against reputable legal databases such as Westlaw, LexisNexis, Bailii or EIN. His letter to the panel, it said, was “a less than honest attempt to pretend” he had made only a typographical mistake, rather than relying on an AI tool to fabricate supporting material.
Although the tribunal stopped short of finding deliberate fraud, it ruled that Rahman had failed to act with honesty and integrity. It also concluded that reliance on the fictitious case may have influenced the decision to grant permission on one of the appeal grounds.
The panel warned that lawyers have a professional duty to verify legal authorities and that “taking unprofessional short-cuts which will very likely mislead the Tribunal is never excusable”.
Rahman has since undertaken further training in immigration law and in the proper use of AI technology. He apologised for his conduct and argued he should not face a regulatory referral, insisting he has developed a proper understanding of his professional duties and intends to act with integrity in future.
Despite those assurances, the tribunal referred the matter to the BSB for consideration of potential misconduct.