Court Issues Landmark Warning over AI Hallucinations in Legal Practice

A High Court ruling has sent a powerful message to the legal profession after solicitors and a barrister were penalised for submitting fictitious case law, suspected to have originated from unverified AI tools.

In the recent case of Frederick Ayinde, R (on the application of) v The London Borough of Haringey [2025] EWHC 1040 (Admin), the High Court considered a judicial review brought by a homeless applicant challenging the London Borough of Haringey’s handling of their housing assistance application.  

While the council’s procedural failures, such as missing key filing deadlines, resulted in a favourable outcome for the claimant, the proceedings took an extraordinary turn when the council applied for wasted costs against the claimant’s legal team. This was based on the discovery that the claimant’s submissions contained five fabricated legal case citations and a serious misrepresentation regarding statutory duties under the Housing Act 1996. 

In his ruling, Mr Justice Ritchie found the conduct of the claimant’s legal representatives to be improper, unreasonable and negligent, ordering them to personally pay a portion of the council’s legal costs.

Did AI have a hand in this? 

Though no conclusive finding was made regarding the use of AI in the drafting of pleadings, counsel for Haringey argued that the fake cases likely stemmed from AI-generated content. Mr Justice Ritchie declined to make a specific ruling on this point, but acknowledged that the cases were undeniably fake.

The judge expressed grave concern at the inclusion of the fabricated cases, noting that the submission could have stood on its own merits without them. In a particularly striking observation, he remarked:

“It is such a professional shame. The submission was a good one… Why put a fake case in?” 

A Warning Shot for Legal Professionals 

The judgment delivers a stark warning to legal professionals about the risks of unverified reliance on AI tools – particularly those prone to “hallucinations”, where generative AI systems fabricate plausible-sounding information that is inaccurate or simply untrue.

The judge was firm in stating that the responsibility to verify legal authorities lies jointly with both barristers and solicitors, rejecting any suggestion that one side could rely on the other for accuracy. He also stressed that professionals should self-report such errors to regulatory bodies rather than downplay them as insignificant. 
