Wednesday, September 17, 2025

Australia’s first lawyer sanctioned over AI-fabricated case list in court

Victorian lawyer stripped of principal status after filing AI-fabricated case citations

A Victorian lawyer has become the first in Australia to be formally sanctioned for using artificial intelligence in court after submitting a list of fabricated case citations generated by software.

The solicitor, whose name remains suppressed, has been stripped of his ability to practise as a principal lawyer. He must now work only under supervision for the next two years, reporting regularly to the Victorian Legal Services Board (VLSB).

The incident unfolded during a hearing on 19 July 2024 before Justice Amanda Humphreys. The solicitor had been representing a husband in a marital dispute and was asked to provide a list of relevant prior cases. Instead, the court was handed a document containing cases that simply did not exist.

On returning to chambers, Humphreys and her associates discovered that none of the citations could be verified. When the matter resumed, the lawyer admitted the list had been prepared using AI-powered legal research software. He confessed that he had failed to check the accuracy of the results before filing them in court.


The solicitor offered what he described as an “unconditional apology” and covered the opposing side’s costs for the wasted hearing. He told the court he had not fully understood how the software worked, and accepted the need to personally verify all AI-assisted research in future.

Humphreys acknowledged the apology and the stress caused, but said the matter had to be referred to regulators. She noted the growing use of AI tools in legal practice made it a matter of public interest.

The VLSB confirmed on 19 August that his practising certificate had been varied. He is no longer authorised to act as a principal, to handle trust money, or to operate his own firm. Instead, he may continue practising only as an employee solicitor under supervision. His supervisor must file quarterly reports with the board for the duration of the two-year restriction.

A spokesperson for the board said: “The board’s regulatory action in this matter demonstrates our commitment to ensuring legal practitioners who choose to use AI in their legal practice do so in a responsible way that is consistent with their obligations.”

This is the first confirmed case of its kind in Australia. However, courts across the country have since encountered more than 20 instances where lawyers or self-represented litigants have filed documents containing bogus citations created by generative AI tools.

Regulators in Western Australia and New South Wales have also referred lawyers for investigation over similar conduct. In one unusual case, a party attempted to attribute a document to ChatGPT, but the court found the document had been created before the tool was publicly available.

The Law Council of Australia has warned lawyers that AI must never substitute professional judgment. Its president, Juliana Warner, described the submission of non-existent cases as a “serious concern”.

“Where these tools are utilised by lawyers, this must be done with extreme care,” Warner told Guardian Australia. She added that an outright ban on AI in legal practice would be “neither practical nor proportionate, and risks hindering innovation and access to justice.”

Courts have signalled that while AI will inevitably shape legal processes, practitioners remain personally responsible for the accuracy and integrity of what they file. The Victorian case stands as a sharp reminder that relying on generative software without verification can cost lawyers their professional standing.
