
AI: friendly legal advisor or foolhardy foe?

It is no secret that AI is now widely used and, in some professions, commonplace. The legal world is no exception. AI chat functions and search engines can be useful tools for individuals who want to understand their legal rights better, or who choose to represent themselves in Court. AI search engines appear to deliver answers to complex legal questions in a matter of seconds, with supporting case law, all for free and from the comfort of your own sofa!

Seems too good to be true? It is. There is always a catch.

There has now been not one, not two, not three, but four reported cases where fake judgments have been argued in Court, and another case in which fraudsters have sought to enforce a fake arbitration award. In the latest case, Bandla v Solicitors Regulation Authority [2025] EWHC 1167, a solicitor appealing the Solicitors Disciplinary Tribunal’s decision to remove him from the solicitors’ register cited 25 fake authorities in support of his appeal. The appeal, which AI had told him was strong, was struck out. The judge found that the solicitor had abused court process and ordered him to pay costs of £24,727.20, noting that the “Court needs to take decisive action to protect the integrity of its processes against citation of fake authority”.

The Bandla case follows three other reported cases in which AI has returned false authorities to litigants. In Ayinde v The London Borough of Haringey [2025] EWHC 1040, five fake cases were cited. Haringey Law Centre (which found the authorities) and the barrister who included the fake judgments in her pleadings were both ordered to pay £2,000 in costs. The judge commented that “it would have been negligent” if a lawyer “used AI and did not check it” before including the results in legal documents.

Lawyers are trained to research the law. They know how, and where, to locate supportive authorities, they know what a judgment should look like, and they know how cases are typically cited. Some of the cases these AI search functions have returned bear very similar fact patterns to genuine cases. However, the names and dates of the cases AI miraculously returns are different. The discrepancies can be hard to spot for someone who does not know what to look out for, or where to double-check their authenticity. In Contax Partners Inc BVI v Kuwait Finance House & Ors [2024] EWHC 436 (Comm), the party seeking to enforce a fake award was only found out when the defendants spotted that the award was a skilfully edited copy of a judgment of Picken J in Manoukian v Société Générale de Banque au Liban SAL [2022] EWHC 669 (QB). One might think that an AI tool was used, or could in future be used, to unearth such deceits.

In an American case, Mata v Avianca 22-cv-1461 (PKC), a lawyer referenced fake judgments he had found on ChatGPT. To check that the cases returned were genuine, the lawyer asked ChatGPT whether they were real. ChatGPT diligently responded that the case “does indeed exist” and confirmed it could be found on Westlaw and LexisNexis (two legitimate and widely used resources for legal research). The lawyer assumed ChatGPT “could not possibly be fabricating cases on its own” and included them in his submission. However, the judge noticed “stylistic and reasoning flaws” in the opinions found by ChatGPT that did not “generally appear in decisions issued by United States Courts of Appeals”. The lawyer who cited the fake judgments was sanctioned and fined $5,000.

If the discrepancies are hard for those trained in legal research to spot, they can only be harder for those who are not, leaving them far more vulnerable to being misled by AI tools.

While the Court appears to be more lenient towards litigants in person who cite fake judgments, such leniency may not last for long should AI continue to cause issues in the courtroom. In Harber v Revenue and Customs Commissioners [2023] UKFTT 01007 (TC), a litigant in person, Mrs Harber, cited fake authorities when appealing a penalty imposed by HMRC for not declaring her Capital Gains Tax (CGT) liability. When asked, Mrs Harber could not give further details of the cases or provide their reference numbers. HMRC’s legal team searched for the authorities cited and could not find them. They did, however, find that some of the cases had similar facts to genuine judgments, albeit the names and dates of the cases were different. The judge found that the cases were not genuine and “had been generated by an AI system such as ChatGPT”. Mrs Harber’s appeal was dismissed and the penalty for not reporting her CGT liability was confirmed. Luckily for Mrs Harber, however, the judge did not impose any further costs penalty for citing fake authorities, noting that, while citing fake judgments wastes “time and public money”, Mrs Harber “did not know how to locate or check case law authorities by using the FTT website, BAILII or other legal websites”.

AI tools may be a good starting point for litigants in person, but their findings should be approached with extreme caution. The next litigant in person to cite fake judgments in Court without first checking those authorities may not be as lucky as Mrs Harber.

If you are going to use AI as a research tool to find case law, we suggest you also do the following:

  • Try to locate the full judgment. You could search for the case name or citation on BAILII, LawCite or the National Archives’ “Find Case Law” service.
  • Run a general search for the case name elsewhere (not another AI platform, or by asking the same platform if the case it found is real!). The search may return other legal articles that discuss the case, and which may contain a link to the judgment. If there is not much commentary, the case may not be genuine.
  • Check the citation. Does it look real? Citations typically follow this pattern: [year] [capital letters] [number], sometimes with further letters in round brackets afterwards, for example (Comm) or (QB). For the more technically minded, a rough automated version of this check is sketched after this list.
  • Search the citation to check whether it accords with the case name found, or whether it is reported anywhere else.
  • If you are at all unsure about your legal position or find yourself needing to go before a Court, we would always encourage you to seek specialist legal advice.
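
For the more technically minded reader, the citation-format check in the list above can be approximated with a simple pattern match. The short Python sketch below is purely illustrative (the pattern and function name are our own, not part of any official tool): it only tests whether a citation has the right general shape, and a match says nothing about whether the case actually exists, so the other checks in the list still apply.

    import re

    # Rough shape of a UK neutral citation: "[year] COURT number", optionally
    # followed by a division in round brackets, e.g. "[2024] EWHC 436 (Comm)".
    # Illustrative heuristic only: a match does NOT mean the case is genuine.
    CITATION_PATTERN = re.compile(
        r"\[(19|20)\d{2}\]\s+[A-Z][A-Za-z]*\s+\d+(\s+\([A-Za-z]+\))?"
    )

    def looks_like_neutral_citation(text: str) -> bool:
        """Return True if the text matches the general shape of a citation."""
        return bool(CITATION_PATTERN.fullmatch(text.strip()))

    if __name__ == "__main__":
        for candidate in [
            "[2024] EWHC 436 (Comm)",   # well-formed
            "[2025] EWHC 1167",         # well-formed, no division letters
            "2025 EWHC 1167",           # missing square brackets
            "[20XX] EWHC twelve",       # clearly malformed
        ]:
            print(candidate, "->", looks_like_neutral_citation(candidate))

A citation that fails even this rough check is a strong sign that something has gone wrong; a citation that passes it still needs to be traced back to a full judgment on BAILII or the National Archives.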
