Generative AI tools have become widely available in the public arena, including for the judicial sector and the legal profession. Best practice guidance on the use of AI in legal practice and by the judiciary has been promulgated in a number of countries, including Brazil (2020), Canada (2023-2024), New Zealand (2023), Australia (2023-2024) and the United Kingdom (2023).
In May 2024, Chief Justice Andrew Bell of the Supreme Court of New South Wales marked the bicentenary of Australia’s oldest court and reflected on the challenges and potential of artificial intelligence. In an interview with the ABC, he flagged AI as one of the more immediate challenges facing the justice system.
IFIP President Anthony Wong was invited to join the UNESCO panel on “The Future of AI in the Judiciary” at the recent WSIS 2024 in Geneva. Generative AI tools can help judges, prosecutors, lawyers, civil servants in legal administration, and researchers improve the quality of their work by facilitating the search for information, automating tasks, and supporting decision-making processes.
Although AI tools can support the core objectives of the justice sector, the negligent use of AI systems by judicial operators may also undermine human rights, such as fair trial and due process, access to justice and effective remedy, privacy and data protection, equality before the law, and non-discrimination, as well as judicial values such as impartiality and accountability.
This session also discussed:
- Challenges and opportunities for the use of AI in the Judiciary;
- Human rights implications of AI that the judiciary must be prepared to address; and
- The contributions from the audience in shaping the Guidelines for the Use of AI Systems in Courts and Tribunals.
UNESCO also launched the results of its Survey on the Use of AI by Judicial Operators at the session. The Survey received responses from over 500 judicial operators in 96 countries concerning their use of Generative AI. A majority of respondents indicated the need for guidelines for judges and their respective institutions on the use of AI in judicial contexts.
The UNESCO Survey will be the basis of the UNESCO Guidelines for the Use of AI Systems in Courts and Tribunals, which are currently undergoing expert and multistakeholder consultations.
It is important to note that these AI tools are not a substitute for qualified legal reasoning, human judgment or tailored legal advice, given the limitations of the current state of the art. Generative AI tools are built on Large Language Models (LLMs), which are trained on massive amounts of text to predict the most probable next word in a sequence, given the context. Bender et al. describe LLMs as “stochastic parrots” because this type of language model stitches together “sequences of linguistic forms it has observed in its vast training data, according to probabilistic information about how they combine, but without any reference to meaning” (2021: 616).
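To make the idea of next-word prediction concrete, the toy Python sketch below estimates the probability of the next word from raw co-occurrence counts in a tiny made-up corpus (a simple bigram model). The corpus, function name and resulting probabilities are purely illustrative assumptions; production LLMs use vastly larger datasets and neural architectures, but the underlying principle of ranking continuations by observed probability, without reference to meaning, is the same.

```python
from collections import Counter, defaultdict

# Illustrative toy corpus; real LLMs are trained on vastly larger and more diverse text.
corpus = [
    "the court delivered the judgment",
    "the court adjourned the hearing",
    "the judge delivered the verdict",
]

# Count how often each word follows each preceding word (a bigram model).
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        bigram_counts[prev][nxt] += 1

def next_word_probabilities(prev_word):
    """Return P(next word | previous word) estimated from raw counts."""
    counts = bigram_counts[prev_word]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

# The model stitches together likely continuations with no notion of meaning.
print(next_word_probabilities("the"))
# e.g. {'court': 0.33, 'judgment': 0.17, 'hearing': 0.17, 'judge': 0.17, 'verdict': 0.17}
```

The point of the sketch is simply that the output is driven entirely by statistical patterns in the training text, which is why such systems can produce fluent but unfounded statements and why they cannot replace legal reasoning or judgment.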
The famous Latin legal maxim “caveat emptor” (“let the buyer beware”) applies here, or should it now be “let the user beware”?