Solicitors’ Negligence and AI: Lessons from Ayinde v London Borough of Haringey & Al-Haroun v Qatar National Bank [2025] EWHC 1383 (Admin)

This article analyses the High Court judgment in the joined cases of Ayinde v London Borough of Haringey and Al-Haroun v Qatar National Bank [2025] EWHC 1383 (Admin), where serious issues arose concerning the misuse of generative artificial intelligence (AI) in legal documentation.

For expert legal assistance regarding Professional Negligence or Solicitors Negligence, please contact Carruthers Law on 0151 541 2040 or email us at info@carruthers-law.co.uk.

On 6 June 2025, the High Court handed down judgment. The two cases were heard together under the court’s “Hamid” jurisdiction, its inherent power to regulate proceedings and uphold lawyers’ duties to the court, after serious concerns arose that false legal references and information had been submitted in court documents.

In each case, the misuse or suspected misuse of generative artificial intelligence (AI) tools led to fictitious case law, fake citations, and misstatements of law being put before the court. The Divisional Court (presided over by Dame Victoria Sharp, P, and Mr Justice Johnson) was asked to decide whether to initiate contempt of court proceedings in light of this misconduct. In its judgment, the Court not only addressed the specific circumstances of the Ayinde and Al-Haroun cases, but also made broader observations about the acceptable use of AI in legal practice and set out baseline expectations for practitioners.

Background to the Ayinde Case

The Ayinde case was a judicial review claim concerning the provision of interim accommodation to a homeless individual. The claimant, Mr. Frederick Ayinde, had applied for urgent housing assistance from the London Borough of Haringey, and he challenged the council’s decision through judicial review. He was represented by Haringey Law Centre (a legal charity), with solicitor Mr. Victor Amadigwe (the Law Centre’s Chief Executive) and paralegal Ms. Sunnelah Hussain handling the case, and counsel Ms. Sarah Forey instructed to draft and argue the claim. Ms. Forey was at the time a pupil barrister (in the first year of practice).

In March 2025, Ms. Forey prepared the claimant’s grounds of claim in the judicial review. These written arguments contained references to at least five legal authorities which were supposed to support Mr. Ayinde’s position. It later emerged that none of those cases actually existed. For example, one cited case, El Gendi [2020] EWHC 2435 (Admin), does not exist under that name; tellingly, the neutral citation [2020] EWHC 2435 does belong to a different, unrelated case. In addition, the submission seriously misrepresented a statute, incorrectly stating the effect of section 188(3) of the Housing Act 1996 (the provision relating to interim housing for the homeless). Upon further scrutiny, the Court found that four more case citations in the document were entirely fictitious as well. The content was also accompanied by subtle warning signs: the text featured Americanised spelling (for instance, “emphasized” with a z), a known hallmark of AI-generated prose. These features ultimately raised the Court’s suspicion that an AI-based tool may have been used to generate the legal arguments.

Findings in the Ayinde Case

Before the issues came to court, the defendant’s legal team (representing Haringey Council) alerted the claimant’s lawyers that the authorities cited in Ms. Forey’s grounds appeared to be invalid or non-existent. Rather than immediately concede and investigate the errors, the initial reply from the claimant’s side was dismissive. In a letter sent to the defendant’s solicitors, the claimant’s team suggested the citation problems were “easily explained” errors and offered only to correct them on the record, without further explanation. The letter even chided the defendant’s lawyers, suggesting they “may better serve your organisation” by focusing on finding supporting case law for their own arguments instead of highlighting the claimant’s citation issues. This response, which the High Court later called unsatisfactory, indicated that the gravity of the situation was not initially appreciated by those representing Mr. Ayinde.

When the matter was referred to the High Court, a special hearing was convened on 23 May 2025 to examine what had gone wrong. Ms. Forey denied using any AI tool to draft the grounds, but the Court found her explanations unpersuasive. She claimed that during her pupillage she received little supervision or access to legal research resources, and that she had been handling a busy caseload on her own. The judges noted, however, that even a junior lawyer in her position had basic means to verify authorities, for example by using the free National Archives case law database or an Inn of Court library, and they remarked that a lack of paid databases or formal supervision was “marginal mitigation” at best.

In its decision, the Divisional Court concluded that Ms. Forey’s conduct had indeed resulted in misleading the court, meeting the threshold of contempt (i.e. conduct liable to be punished as contempt of court). However, the judges exercised discretion and decided not to commence contempt proceedings against her. They noted her lack of seniority and the unresolved questions about how she had been supervised and trained. Instead, the Court referred Ms. Forey to the Bar Standards Board (BSB), the regulator of barristers, for further investigation and appropriate action. The referral to the BSB specifically invites inquiry into not only Ms. Forey’s own conduct but also the role of her chambers in training and supervising a pupil barrister who ended up submitting such flawed work. The Court pointed out that issues like the adequacy of her pupillage supervision could not be dealt with in a simple contempt hearing focused only on the individual.

As for Haringey Law Centre’s involvement, the Court did not find evidence of deliberate wrongdoing by Mr. Amadigwe, the supervising solicitor. It appeared that neither Mr. Amadigwe nor Ms. Hussain (the paralegal) had any inkling that the counsel they instructed would cite non-existent cases; indeed, the Law Centre admitted it had never occurred to them that a barrister might do so. The Court accepted that explanation and decided not to pursue contempt proceedings against Mr. Amadigwe. Nevertheless, it viewed the situation as a serious failure of oversight. The judges referred Mr. Amadigwe to the Solicitors Regulation Authority (SRA) so that the SRA could investigate the firm’s practices, for example, looking into how he responded when the false citations were first flagged by the opponent, and whether he ensured that Ms. Forey was competent and properly supervised for the task at hand. The Court’s message was that even under pressure, a solicitor overseeing a case must take active steps to verify counsel’s work, not simply rely on it blindly. Ms. Hussain (initially described in some filings as a solicitor but in fact a paralegal) was exonerated of any fault; the Court made clear that she had merely been passing along instructions and, in its words, was “not at fault in any way”. In examining the Haringey Law Centre’s oversight, the Court identified significant supervisory shortcomings, reflecting key issues addressed within our detailed analysis on solicitors’ negligence.

The judgment also acknowledged the context in which Haringey Law Centre operates. As a small legal charity, the Law Centre was overstretched and under-resourced, providing vital services to vulnerable clients. The Court accepted that resource limitations were genuine, but remarked that such constraints make it “all the more important that professional standards are maintained”. In other words, a lack of funding or manpower is not a licence to cut corners in legal work. The Law Centre was reminded that it must only instruct and rely on those who adhere to proper professional standards, especially when serving vulnerable people.

Background to the Al-Haroun Case

The Al-Haroun case, although joined in this judgment due to similar concerns about false information, arose in a very different context. Mr. Hamad Al-Haroun’s claim was a civil action for damages: he sought approximately £89.4 million in compensation, alleging breaches of a complex financing agreement by Qatar National Bank (QPSC) and its subsidiary, QNB Capital LLC. At an earlier stage of that litigation, an order was made (unfavourable to Mr. Al-Haroun), and he applied to set aside that order so that his claim could proceed. In support of this application, witness statements were prepared and filed in early 2024 by Mr. Al-Haroun himself and by his solicitor, Mr. Abid Hussain of Primus Solicitors. These statements were intended to persuade the court to reinstate the claim, and they cited a large number of legal authorities to bolster the argument.

Upon examination, the court discovered that many of the referenced cases and quotations in the Al-Haroun statements were bogus. In the words of the judgment, the statements contained “numerous authorities, many of which appear to be either completely fictitious or which, if they exist at all, do not contain the passages supposedly quoted from them, or do not support the propositions for which they are cited.” In short, several cited authorities were non-existent, and others were real cases that had been misquoted or mischaracterised.

Confronted with this revelation, Mr. Al-Haroun provided an explanation. He acknowledged responsibility for the inclusion of the false material and admitted how it happened: he had used online tools, including generative AI, to conduct legal research on his own. According to his account, he employed publicly available AI services and other legal search websites, then took the output and believed it to be accurate. He had “complete (but, he accepts, misplaced) confidence” in the authenticity of what he found online. In other words, Mr. Al-Haroun, a layperson, unwittingly inserted AI-generated falsehoods into his court submissions, genuinely thinking they were valid legal precedents.

More surprising was the role of Mr. Hussain, the solicitor. Mr. Hussain admitted that he incorporated Mr. Al-Haroun’s research into the legal filings without independently verifying the authorities. Essentially, the client (Mr. Al-Haroun) had taken the lead on research via Google and ChatGPT, and the solicitor trusted this research and copied the string of citations into the witness statement supporting the application. The Court found it astounding that a solicitor would defer to a client in this way on a matter of legal research. As the judgment later highlighted, it is supposed to be the lawyer who vets and guides the client, not the client guiding the lawyer on what the law says.

Findings in the Al-Haroun Case

The judges were “scathing” about the solicitors’ conduct in this case. They described it as “extraordinary that the lawyer was relying on the client for the accuracy of their legal research, rather than the other way around.” In failing to check the authorities, Mr. Hussain had shown a “lamentable failure to comply with the basic requirement to check the accuracy of material that is put before the court.”

Importantly, the Court accepted that Mr. Al-Haroun did not act with an intention to deceive the court. His mistake was one of naivety: he placed excessive trust in the outputs of an AI tool and other online resources without the legal expertise to discern truth from fabrication. The judges noted that Mr. Al-Haroun had misplaced trust in these sources but did not set out to mislead. As a self-represented litigant on the research front, he lacked the legal knowledge to verify what he found, which is precisely why the duty falls on the lawyer to do so.

As with Ayinde, the Divisional Court decided not to commence contempt proceedings over the false material in Al-Haroun. Mr. Hussain and his firm (Primus Solicitors) were referred to the SRA for investigation of their conduct. The SRA will consider whether they failed to meet professional standards, for instance, by not having proper safeguards to prevent the filing of unsupported legal claims and not verifying the content drafted with client input. Any disciplinary consequences (such as fines or practising restrictions) will depend on the outcome of that regulatory investigation, which was beyond the scope of the court’s judgment. Mr. Al-Haroun, for his part, was not subjected to any sanction by the court aside from the likely setback this incident caused to his lawsuit. The Court viewed him as a layperson who fell victim to unreliable technology and poor legal advice, rather than a malicious actor.

Court’s Broader Observations on AI Use

While addressing these two cases, the High Court took the opportunity to articulate broader principles regarding the use of AI in legal proceedings. The judges acknowledged that artificial intelligence tools, especially large language models capable of generating text, are increasingly accessible and used in various fields, including law. However, the judgment emphasised a clear warning: AI-generated content cannot be taken at face value in legal practice. The Court noted that a language model such as ChatGPT, despite its impressive capabilities, is “not capable of conducting reliable legal research.” These tools may produce answers that look coherent and plausible, but the output can be entirely incorrect, confidently citing sources that do not exist, or quoting passages that were never written. Indeed, the very errors seen in Ayinde and Al-Haroun (invented case names, wrong quotations) illustrated this phenomenon of AI hallucinations in a legal context.

The Court made it clear that lawyers who choose to use AI for research or drafting must do so responsibly. Any information or citation coming from an AI assistant must be rigorously cross-checked against reliable primary sources before being relied upon. For example, if an AI suggests a particular case as authority, the practitioner should verify that case via authoritative databases, such as legislation.gov.uk for statutes, the National Archives or official law reports for cases, or established legal databases from reputable publishers. This duty of verification is not new, the Court stressed; it is simply a modern application of the lawyer’s fundamental duty to exercise due diligence and not to mislead the court. Crucially, the duty applies not only to the person who directly uses the AI tool, but also to any lawyer who relies on or adopts another’s work that was AI-generated.

The judgment also spoke to the wider legal profession and institutions. The judges expressed concern that incidents like these, if unaddressed, could undermine public confidence in the justice system. They noted that misuse of AI in litigation has serious implications, so law firm partners, heads of chambers, and others in leadership positions must actively ensure that those they supervise understand their obligations. The Court signalled that in any future hearings of this kind, it will inquire into what preventative measures senior lawyers have in place, for example, training and internal protocols regarding AI use. This was effectively a call for legal organisations to instill a culture of compliance: embracing technological tools must go hand-in-hand with reinforcing ethical standards.

The Court reviewed existing professional guidance on AI and noted that there is no lack of advice from regulators, but mere guidance, by itself, has limits. It cited, for instance, the Bar Council’s January 2024 guidance, which warned that large language models can generate “convincing but false content” and admonished lawyers: “Do not… take such systems’ outputs on trust and certainly not at face value.” Misleading the court, even inadvertently via AI, would still amount to incompetence or negligence on the lawyer’s part, breaching core professional duties. Likewise, the Solicitors Regulation Authority had flagged the risks of relying on AI without proper verification. Despite this, the Court observed that simply publishing guidance is not enough to prevent mishaps. In a further step, the judgment invites the Bar Council, the Law Society, and the Inns of Court to consider urgent further action in light of these incidents. This might include enhanced training, stricter rules, or other measures to ensure compliance. The underlying message is that the legal profession must be proactive and united in addressing the challenges posed by AI, so that technology serves the interests of justice rather than inadvertently undermining them.

Conclusion

The joined judgment in Ayinde and Al-Haroun is a cautionary tale about the pitfalls of uncritical reliance on AI in legal work. In both cases, well-intentioned uses of technology led to serious errors, fictitious cases and quotations, which almost misled the court. The High Court’s response was measured: it stopped short of holding the individuals in contempt, instead opting to refer the lawyers involved to their respective professional regulators for investigation.

Embracing innovative tools like AI is not forbidden; indeed, such tools can assist with research and efficiency, but they cannot replace the essential judgment and oversight of a trained lawyer. No matter how advanced legal technology becomes, fundamental duties of accuracy, honesty and diligence remain paramount. A solicitor or barrister must never present unchecked information to a court, whether that information came from a junior colleague, a client, or an AI program. The Ayinde/Al-Haroun judgment confirms that if these duties are breached, even through carelessness or unfamiliarity with new technology, the courts will take action to uphold professional integrity (via wasted costs orders, referrals, or other more serious sanctions).

If you require advice on issues relating to Professional Negligence or specifically Solicitors Negligence, contact our experienced team at Carruthers Law today. Call 0151 541 2040 or email info@carruthers-law.co.uk.

Suite 205/206 Cotton Exchange
Bixteth Street, Liverpool L3 9LQ

T — 0151 541 2040
T — 0203 846 2862
info@carruthers-law.co.uk