Introduction

The courts in New Zealand have released guidance on the use of generative artificial intelligence (GenAI) in courts and tribunals. The guidance is set out in three documents: the first is aimed at members of the judiciary, including judicial support staff; the second at lawyers; and the third at non-lawyers using courts or tribunals.

The guidelines caution against heavy reliance on GenAI for certain tasks and contain information on the risks of GenAI use. Five key topics are common to all three documents. In addition, each document contains a more targeted message for its particular group of court users.

Guidance topics

The general guidance on GenAI can be summarised under the five topics below.

Understanding GenAI and its limitations

The guidelines set out the key limitations to consider when using GenAI, specifically the following:

  • GenAI is not a search engine and its output is not always factually correct.
  • The GenAI chatbots appear to have limited access to New Zealand law or the procedural requirements of our courts and tribunals.
  • What the chatbot puts out reflects how it has been trained and the prompts used by the person employing it. Even with the best prompts, the output may be inaccurate, incomplete, misleading or biased.

Uphold confidentiality, suppression and privacy

The guidance cautions that the security of GenAI tools is not guaranteed. It recommends that users not enter any information that is not already in the public domain, as doing so could risk breaching suppression orders or statutory prohibitions on publication, or disclosing private, confidential or sensitive information.

The guidelines require court staff to notify the appropriate people as soon as possible if confidential information is disclosed.

Ensure accountability and accuracy

The guidelines make it clear that users are responsible for the work they produce using GenAI, including checking the accuracy of any information provided by a GenAI chatbot, as it can fabricate cases, citations or quotes.

Be aware of ethical issues including inherent bias

They provide that GenAI tools are likely to contain the biases of those who trained them. In addition, the tools do not generally account for the cultural differences between New Zealand and overseas jurisdictions, particularly the values and practices of Māori and Pasifika.

Disclosing GenAI use

The guidelines do not impose any requirement on lawyers or other court users to disclose whether GenAI has been used. However, court staff carrying out research or administrative duties are required to discuss with their supervisor how they are using GenAI tools.

Additional guidance

In addition to the above five topics, the overarching message in the guidance for members of the judiciary and their staff is that any use of GenAI must be consistent with their obligation to protect the integrity of justice and of court or tribunal processes. They are required to maintain security by using only work devices and work accounts. The guidelines caution judicial staff to be alert to lawyers and lay litigants using GenAI tools to produce court documents, as these tools can reduce the accuracy of the information those documents contain. The guidelines also set out some examples of where GenAI could be useful, including:

  • summarising information;
  • speech writing; and
  • administrative tasks (eg, drafting emails).

The guidelines do not recommend using GenAI for legal research or legal analysis more broadly.

The overarching message to lawyers is that they remain bound by their professional obligations. They are reminded that their first duty is as an officer of the court, that they must not mislead the court and that they must take all reasonable steps to avoid breaching suppression orders or client confidentiality.

The key point directed at non-lawyers is that GenAI is not a substitute for a qualified lawyer and cannot give tailored legal advice. While GenAI can be helpful for explaining legal terminology and principles, it should not be relied on as the sole source of legal information.

Comment

These New Zealand guidelines are very similar to the guidelines released in England and Wales by the courts and tribunals judiciary in December, although the New Zealand guidelines apply to all court users, not just judicial office holders. GenAI is expected to improve dramatically in capability over the next few years. If that occurs, these guidelines may need revisiting, particularly if the current flaws of GenAI (eg, bias, privacy, confidentiality and hallucinations) can be addressed.