Artificial Intelligence in the Courtroom: Implications for New Zealand’s Litigation Landscape
11 Jul 2025
Questions regarding the appropriate use of artificial intelligence (“AI”) in New Zealand’s legal system have arisen in two recent New Zealand cases. The first was the Court of Appeal’s decision in Wikeley v Kea Investments Ltd [2024] NZCA 609.[1] In that case, a memorandum filed by Mr Wikeley (a self-represented litigant) was withdrawn after opposing counsel advised that it had been drafted using generative AI[2] and referred to “apparently non-existent case law”.[3] In the second case, a self-represented litigant before the Employment Court cited a judgment in her submissions, prompting the Court to state that “no such case exists, and the plaintiff is reminded that information provided by generative artificial intelligence ought to be checked before being relied on in documents filed in court proceedings”.[4]
There have been similar instances overseas, including cases where lawyers have used generative AI to create court documents.[5] A recent judgment of the Divisional Court in the United Kingdom dealt with two cases in which it was suspected that lawyers had used generative AI to produce court documents without checking the information contained in them.[6] In one case, a court document contained references to five cases that did not exist. In the other, 45 citations had been put before the Court, of which 18 cited cases that did not exist.[7] In a similar example, an Australian immigration lawyer was referred to the Office of the NSW Legal Services Commissioner after it was discovered that he had used ChatGPT to write court documents citing and quoting cases that did not exist.[8]
Notwithstanding the risks posed by AI use in legal proceedings, AI is clearly already a valuable tool for the profession and, when used effectively, can be safe and efficient. However, cases such as these serve as a reminder of the need for caution for lawyers and non-lawyers alike.
What is New Zealand doing?
The Courts of New Zealand have released guidelines on the use of generative AI in courts and tribunals.[9] In short, these guidelines focus on issues associated with the use of AI, including its faults, limitations and biases, and the need to uphold privacy and confidentiality, to be aware of ethical issues, and to ensure accountability and accuracy when using AI. There are three sets of guidelines: one for judges, judicial officers, tribunal members and judicial support staff; one for lawyers; and one for non-lawyers. The guidelines are a useful starting point, but they are guidelines only and do not address all issues that can arise when AI is used in our courts. In early 2024, the New Zealand Law Society also released guidance for lawyers on the use of AI.[10]
A more recent development is the Law Society’s launch of the “AI Research Project 2025”, in partnership with LexisNexis. The project aims to better understand the profession’s use of AI (via a survey), and to use this knowledge to provide further support and resources.
What does the law say?
There are no statutes or regulations which focus solely on restricting or limiting the use of AI in our courts or justice system. However, lawyers who adopt AI in their practice should bear in mind various duties contained in the Lawyers and Conveyancers Act (Lawyers: Conduct and Client Care) Rules 2008, including the duties to act competently and take reasonable care, to maintain confidentiality, to avoid misleading the court and to exercise independent judgement.
These duties cannot be outsourced to AI. In Ayinde v The London Borough of Haringey and Al-Haroun v Qatar National Bank, the Divisional Court stated that those using AI have a professional duty to check the information it provides, and that this duty rests both on lawyers who have conducted work using AI and on those who rely on the work of others who have used AI. The Court considered that “this is no different from the responsibility of a lawyer who relies on the work of a trainee solicitor…”[11] The Court indicated that including fake citations would, at a minimum, probably result in a wasted costs award and a referral to a regulator.
What might be next for AI in NZ?
Over the coming years, AI is likely to exert increasing influence over how litigation is conducted in New Zealand. A March 2025 report on the impact of AI noted an overall increase in its use in New Zealand.[12] While ‘only’ 7% of respondents had observed AI replacing workers, 40% reported less need to hire new employees because of AI.[13] The report also discussed the competitive advantage that AI can provide, and the risks faced by businesses which fail to embrace it.
From a lawyer’s perspective, these changes will affect all stages of the litigation process. AI has been used at certain stages of litigation, such as the discovery process, for some time already (with, for now, varying degrees of usefulness). Multiple AI products are being marketed to lawyers for legal research, document management, bundling and document creation.
Over the coming months and years, we expect to see an increase in the quantity and quality of AI platforms available for legal research, and for other steps in the litigation process, that are more focused on the New Zealand context.
Managing the way AI is used in the preparation of evidence (both factual and expert) is likely to require serious consideration from the profession. For example, we may begin to see witnesses using AI (if they are not doing so already) when preparing briefs of evidence. In addition to affecting the way lawyers produce documents and prepare for trial, AI may also have an impact on the way court processes are run. The Singapore Courts are in the process of working with an American legal AI programme to pilot the use of generative AI for users of the Small Claims Tribunals. This may provide a model for other jurisdictions to consider.
With these changes (and others we cannot yet conceive of), we expect the court and NZLS guidelines to continue to evolve. They may need to become more directive, or be supplemented by amendments to primary or delegated legislation such as the High Court Rules. Potential measures include requiring disclosure where AI has been used in the preparation of a document, or requiring lawyers to certify that any AI-generated content contained in a court document has been verified. It may also be necessary to consider whether training in the ethics of using AI should be incorporated into legal education, both for qualification and for continuing professional education.
Given the cases that have arisen in New Zealand to date (addressed above), specific steps to ensure that lay litigants are made expressly aware of the guidelines, and the need for caution when using AI in the courts, may also be required.
AI and its potential influence on litigation (and most other areas of life) is expanding rapidly. While it can streamline processes and sharpen insights, it also raises concerns about fairness, transparency, and accountability. For those lawyers with leadership responsibilities within the profession, it will be important to consider comments from the Court in Ayinde, that “practical and effective measures must now be taken by those within the legal profession with individual leadership responsibilities (such as heads of chambers and managing partners) and by those with the responsibility for regulating the provision of legal services…For the future… the profession can expect the court to inquire whether those leadership responsibilities have been fulfilled”.[14]
Endnotes
2 Generative AI is a type of AI that can create new content. Generative AI systems learn patterns and structures from large datasets, and then use that knowledge to generate outputs.
3 Wikeley v Kea Investments Ltd [2024] NZCA 609 at fn 187.
4 LMN v STC [2025] NZEmpC 46 at [8].
5 In Mata v Avianca, Inc, F Supp. 3d, 22-cv-1461 (PKC) (S.D.N.Y), 2023 WL 4114965, a New York lawyer used AI to complete his legal research, and consequently filed submissions containing references to non-existent case law.
6 Ayinde v The London Borough of Haringey and Al-Haroun v Qatar National Bank [2025] EWHC 1383 (Admin).
7 Ayinde v The London Borough of Haringey and Al-Haroun v Qatar National Bank at [74].
8 Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95.
9 See https://wilsonharle.com/publications/introduction.
10 https://www.lawsociety.org.nz/assets/Professional-practice-docs/Rules-and-Guidelines/Lawyers-and-AI-Guidance-Mar-2024.pdf.
11 Ayinde v The London Borough of Haringey and Al-Haroun v Qatar National Bank [2025] EWHC 1383 (Admin) at [8].
12 https://aiforum.org.nz/wp-content/uploads/2025/03/AI-in-Action_March2025-Report-compressed.pdf.
13 At p 16.
14 Ayinde v The London Borough of Haringey and Al-Haroun v Qatar National Bank at [9].