Court Issues Landmark Warning over AI Hallucinations in Legal Practice
A high court ruling has sent a powerful message to the legal profession after solicitors and a barrister were penalised for submitting fictitious case law, suspected to have originated from unverified AI tools. 

In the recent case of Frederick Ayinde, R (on the application of) v The London Borough of Haringey [2025] EWHC 1040 (Admin), the High Court considered a judicial review brought by a homeless applicant challenging the London Borough of Haringey’s handling of their housing assistance application.  

While the council’s procedural failures, such as missing key filing deadlines, resulted in a favourable outcome for the claimant, the proceedings took an extraordinary turn when the council applied for wasted costs against the claimant’s legal team. This was based on the discovery that the claimant’s submissions contained five fabricated legal case citations and a serious misrepresentation regarding statutory duties under the Housing Act 1996. 

In his ruling, Mr Justice Ritchie found the conduct of the claimant’s legal representatives to be improper, unreasonable and negligent, ordering them to personally pay a portion of the council’s legal costs.  

Did AI have a hand in this? 

Though no conclusive finding was made regarding the use of AI in the drafting of pleadings, counsel for Haringey argued that the fake cases likely stemmed from AI-generated content. Mr Justice Ritchie chose not to make a specific ruling on this point, but acknowledged that the cases were undeniably fake. 

The judge expressed grave concern at the inclusion of these fabricated cases, commenting that the submission could have stood on its own merits without resorting to fake cases. In a particularly striking observation, he commented: 

“It is such a professional shame. The submission was a good one… Why put a fake case in?” 

A Warning Shot for Legal Professionals 

The judgement delivers a stark warning to legal professionals about the risks of unverified reliance on AI tools – particularly those prone to “hallucinations”, where generative AI systems fabricate plausible-sounding information that is inaccurate or simply untrue.  

The judge was firm in stating that the responsibility to verify legal authorities lies jointly with both barristers and solicitors, rejecting any suggestion that one side could rely on the other for accuracy. He also stressed that professionals should self-report such errors to regulatory bodies rather than downplay them as insignificant. 
