
High Court warns experts after solicitor insisted on using AI-drafted report


High Court judge calls AI-generated expert report a ‘gross breach of duty’ by solicitor

A High Court judge has warned against the misuse of artificial intelligence in expert evidence, describing a solicitor’s insistence on using an AI-generated expert report as a “gross breach of duty.”

The warning came from Mr Justice Waksman, judge in charge of the Technology and Construction Court, speaking at the Bond Solon Expert Witness Conference last Friday. He was responding to findings in the latest Bond Solon Expert Witness Survey, produced in conjunction with the Law Society Gazette.

The survey revealed that an expert witness had been asked to accept instructions in a case where the solicitor insisted on supplying an AI-generated draft of the expert’s report.

Mr Justice Waksman said: “That to my mind is a gross breach of duty on the part of the solicitor.”

He also expressed concern at another survey finding: 14 percent of expert witnesses said they would be willing to accept instructions on that basis. “I cannot see how that can be appropriate conduct on the part of the expert, even if they’re doing it to avoid a row with the solicitor and they intend to dispose of the draft report soon afterwards,” he said.

Mr Justice Waksman described the survey’s findings as timely, noting that judges had received updated guidance on the use of artificial intelligence the previous week.

The conference heard that judges are not prohibited from using AI tools and currently have access to a private version of ChatGPT 365 on their personal computers. The information and prompts entered into this model are not publicly accessible.

Mr Justice Waksman said: “If we wanted to use AI to summarise expert reports to introduce ourselves to expert issues in the case, there is no problem about leakage of information.”

He explained that judges do not have a duty to disclose their use of AI, as its use is considered comparable to relying on the assistance of a judicial clerk or researcher. However, he stressed: “The fact remains the ultimate product – the judgment – must be that of the judge alone.”

The senior judge warned against relying on artificial intelligence for legal research or analytical tasks, highlighting the risk of “hallucinations” where AI tools have produced false or fabricated legal citations in previous cases.

He also cautioned expert witnesses against delegating any part of their professional responsibility to AI systems. “Experts should steer clear of using AI to answer the questions that it is your job to answer. As soon as you do that, you compromise your independence,” he said.

The findings in the Bond Solon survey form part of wider discussions within the legal profession on the role of artificial intelligence in case preparation, expert testimony, and judicial decision making.

While courts are beginning to introduce secure AI tools to assist with administrative and preparatory work, legal bodies have continued to emphasise that human accountability remains essential for all professional judgments and legal outputs.

The updated guidance for judges, referred to by Mr Justice Waksman, reflects this principle. It makes clear that while AI may be used as an aid for managing information or summarising documents, all judicial reasoning and conclusions must remain the personal work of the judge.

The survey also recorded a growing appetite for formal regulation of expert witnesses in the United Kingdom, a topic which has become increasingly relevant as technological tools such as AI enter professional practice.

Mr Justice Waksman’s comments underscore the judiciary’s continuing concern over the use of generative AI in the legal process and its potential to undermine professional standards if misused.
