By Paul Gregoire and Ugur Nedim
Generative artificial intelligence, or Gen AI, is part of the broader field of artificial intelligence, the branch of computer science concerned with machine learning and problem-solving, and it focuses on the creation of new content, including visual images, audio and text. A number of well-known platforms can be used to produce such material, with ChatGPT being the most widely recognised.
AI is a technology in its early stages, which raises concerns about applying it in official settings where it cannot adequately replicate the work of a human. And one sector where false information produced by Gen AI could have serious consequences is the courts.
A case that came before Federal Circuit and Family Court Judge Amanda Humphreys in July 2024 demonstrated this. The proceedings involved a family law lawyer producing a document for the court that listed authorities, or prior court cases, supporting the position he would argue, which her Honour, on closer perusal, found to be a list of past hearings that didn’t exist.
And whilst this instance of a legal practitioner hurriedly applying AI to generate details might not have caused any substantial issues later in proceedings, it’s the suspect AI-generated materials that don’t get called out that could have more detrimental impacts at a later point.
So, in an effort to prevent the misapplication of Gen AI in the NSW court system, NSW Chief Justice Andrew Bell released Supreme Court Practice Note SC Gen 23 – Use of Generative Artificial Intelligence on 21 November 2024. This document sets out the limits on how Gen AI can be applied within this state’s court system, and it further contains general AI guidance for NSW judges.
A note on practice
“Generative AI is a form of artificial intelligence that is capable of creating new content, including text, images or sounds, based on patterns and data acquired from a body of training material,” begins the practice note. “That training material may include information obtained from ‘scraping’ publicly and privately available text sources to produce large language models.”
“Gen AI is capable of being used to assist legal practitioners and unrepresented parties with various tasks, including drafting documents and summarising information,” the guidance continues. “This practice note is directed to the circumstances where such use is acceptable.”
The guidance outlines that Gen AI does not include applications that correct spelling or grammar, assist with transcription or generate chronologies from preexisting source documents, and it further makes clear that the practice note has no bearing on the use of internet search engines, like Google, nor any implications for “dedicated legal research software”.
A key warning in respect of the legal issues this cutting-edge technology could generate is that data fed into “chatbots” can be used to train the large language model underlying the program, and therefore, confidential information may become available to others.
So, there is a general prohibition regarding Gen AI and the NSW courts: a ban on entering into a program any information that is the subject of a non-publication or suppression order, any information specifically produced for the courts without approval to utilise it, any material produced on subpoena, or any material that’s the subject of a statutory prohibition.
The potholes involved in Gen AI
The document warns legal practitioners and unrepresented parties about the “limits, risks and shortcomings” that can be involved in Gen AI that they ought to be aware of.
These include “hallucinations”: the generation of “apparently plausible coherent responses” that are actually “inaccurate or fictitious”. As examples, the note cites false citations and fabricated legislative or case references.
Further, Gen AI relies on preexisting datasets, which can be outdated, incomplete or contain misinformation, meaning these programs can propagate falsehoods already circulating in the public sphere. And the nature or scope of the datasets underlying Gen AI content can lead to “biased or inaccurate output”.
Another issue is that, unless the user prevents it, the searches made and content generated via artificial intelligence go into the greater pool of data that can be used to produce more computer-generated content.
The NSW Chief Justice’s note further warns that there is a “lack of adequate safeguards to preserve the confidentiality, privacy or legal professional privilege that may attach to information or otherwise sensitive material submitted to a public Gen AI chatbot”. And further, there are no guarantees that information utilised in generating content is not the subject of copyright.
Limits and allowances
Chief Justice Bell then notes other limits to how Gen AI can be applied to NSW court work. These involve a prohibition on generating “affidavits, witness statements, character references or other material” that is supposed to reflect witness opinion, as these opinions should reflect a witness’ own knowledge and not that generated by a machine.
AI should not be used to alter, embellish, strengthen or dilute witness statements. So, as of 3 February 2025, when the practice note takes effect, all affidavits, witness statements or character references “must contain a disclosure that Gen AI was not used in generating” them.
Under exceptional circumstances, however, Gen AI can be used to produce affidavits, witness statements or character references. For this to be done, the reasons for its use must be detailed, the chatbot program and version used to produce the document must be specified, and it must be stated whether open-source or closed-source materials were involved and whether any of the content is confidential.
In terms of written submissions, summaries or skeletons of arguments produced by Gen AI, lawyers must verify all citations, legal or academic authorities, and case law or legislative references to ensure that they exist, are accurate and are relevant to proceedings, and this verification must be done by a person.
AI use doesn’t absolve lawyers from being held accountable for any negative outcomes produced.
Expert reports, however, cannot be generated by AI, unless a lawyer seeks leave to do this and stipulates the benefits that would be gained by doing so.
If Gen AI is used in the production of an expert opinion, that should be specifically explained to the court and any references within it should be verified. And if a report relates to a professional negligence claim, any use of Gen AI must be raised at the first directions hearing on the matter.
“Legal practitioners and unrepresented parties must draw the requirements of this practice note to the attention of experts when instructing them,” the NSW Chief Justice adds.
A note to those on the bench
Chief Justice Bell then provides a further practice note titled Guidelines for New South Wales Judges in Respect of Use of Generative AI, which was released on the same day as the lengthier NSW Supreme Court practice note and was produced with input from heads of jurisdiction, or the most senior judicial officers from each of the courts.
The guideline is clear that NSW judges cannot use Gen AI to produce their reasons for a judgment or to assess evidence in order to produce their reasons, nor can artificial intelligence be utilised in editing or proofing judgments or any part of a judgment.
If judges seek to apply this technology to secondary sources, they ought to familiarise themselves with the pitfalls of Gen AI use, and anything produced via AI should be verified afterwards, regardless of how polished it appears. Judges should also require their assistants to disclose any AI use, and they can quiz litigants and lawyers over any use.
AI red flags that judges ought to be aware of include inaccurate or false case citations, dodgy assessments, case references that aren’t suitable for the jurisdiction involved, out-of-date references, submissions that diverge from general understandings, repetitive language and the use of terms more closely related to other jurisdictions.
“Due to the rapidly evolving nature of Gen AI technology, these guidelines will be reviewed on a regular basis,” the NSW Chief Justice concludes.