AI in legal research under scrutiny after fake case citations

A law firm’s potential use of fictitious case law references, possibly generated by artificial intelligence, has raised serious concerns and could spark an investigation by the Legal Practice Council (LPC).

The case Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others arose from an application filed on 16 August 2024 by politician Philani Mavundla, who has been the mayor of the Umvoti Local Municipality in KwaZulu-Natal since 2023.

The application sought leave to appeal a previous judgment. In that earlier ruling, the court had granted a reconsideration application brought by the first respondent, the MEC for the Department of Co-Operative Government and Traditional Affairs in KwaZulu-Natal, and rescinded an order that had initially been made in Mavundla’s favour.

On 6 September 2024, a supplementary notice of application for leave to appeal was submitted. This notice included not only the grounds of appeal but also several case law references to support the arguments raised.

The MEC opposed the application for leave to appeal.

Judge Elsje-Marié Bezuidenhout delivered the ruling on 8 January 2025, dismissing Mavundla’s application for leave to appeal with costs. The most striking aspect of the case, however, was not the judge’s reasons for her findings but the discovery that most of the legal references cited in the supplementary notice were fictitious.

In the application, Mavundla’s counsel, S Pillay (employed by Surendra Singh and Associates), referred to the judge’s findings on the issue of joinder. The judge had previously held that the applicant had failed to join the municipal councillors who had requested the MEC to convene a meeting to address motions for the removal of Mavundla and the speaker.

Pillay cited a case titled Pieterse v The Public Protector, claiming it originated from a Gauteng matter and arguing that non-joinder was not fatal. However, as Judge Bezuidenhout prepared her judgment, it became apparent that no such case existed in the South African Law Reports, the All South African Law Reports, or on the SAFLII database.

To investigate further, the judge tasked two court law researchers with reviewing the supplementary notice of appeal and verifying the cited cases. Of the nine cases referenced, only two were found to exist, and one of those had an incorrect citation.

Concerned about the validity of the legal authorities relied upon, the judge emailed Pillay on 18 September 2024 to request copies of the cited cases.

On 20 September, both Pillay and De Wet SC, counsel for the MEC, appeared in court. Pillay requested an adjournment, explaining that she had been unable to obtain the cases in time.

The judge informed her that it appeared the cited cases did not exist. Pillay defended herself, claiming the case references were provided by the firm’s article clerk, Rasina Farouk, who had drafted the supplementary notice of appeal.

Farouk, now referred to as a candidate legal practitioner, later appeared in court and stated that she had sourced the cases from law journals accessed through her UNISA portal. When asked to specify which journals, she could not provide an answer. She denied using artificial intelligence tools such as ChatGPT or Meta AI for her research and requested additional time to review her search history.

The judge offered a practical solution, suggesting they visit the High Court library to retrieve the actual law reports for the cited cases. However, upon resumption, Surendra Singh, the proprietor of the firm, appeared and claimed the library had requested payment for copies of the cases, which he was unwilling to make. He insisted they had found one of the cases on an employee’s cellphone during the adjournment but requested more time to produce the remaining cases. The judge granted an adjournment to 25 September 2024.

On 25 September, Singh again appeared before the court, stating that, as an elderly practitioner, he faced challenges in locating the cases and had relied on Google for assistance. He admitted his firm did not have access to the South African Law Reports or other referenced sources. No further mention was made of the law journals Farouk claimed to have used.

Ultimately, the cases could not be produced.

Adding to the issue, counsel for the MEC, De Wet, admitted that he had attempted to verify only the first or second case cited in the supplementary notice of appeal. When he was unable to locate them, he chose to wait until the hearing, intending to raise the issue if the cases were relied upon in argument.

Counsel’s duty to court

In her judgment, Judge Bezuidenhout cited several cases and highlighted counsel’s duty to the court as outlined in Rule 57.1 of the Code of Conduct for all Legal Practitioners, Candidate Legal Practitioners, and Juristic Entities:

“A legal practitioner shall take all reasonable steps to avoid, directly or indirectly, misleading a court or a tribunal on any matter of fact or question of law. In particular, a legal practitioner shall not mislead a court or a tribunal in respect of what is in papers before the court or tribunal, including any transcript of evidence.”

Bezuidenhout stated that this principle should extend to ensuring that courts can assume and rely on counsel’s implicit representation that the authorities cited and relied upon actually exist.

The judge noted that Pillay had “blindly relied on authorities provided to her by Farouk, without checking the references when addressing me at the initial hearing”. She also criticised Singh’s firm for issuing the supplementary notice of appeal, which was drafted by a candidate legal practitioner, without anyone verifying whether the document was properly prepared and whether the cited authorities were correct or even existed.

The use of AI in the legal profession

Judge Bezuidenhout referenced an article by Associate Professor M. van Eck titled “Error 404 or an Error of Judgment? An Ethical Framework for the Use of ChatGPT in the Legal Profession”. The article provides a comprehensive analysis of the legal position on the use of artificial intelligence technologies, particularly ChatGPT, in legal research in South Africa and in other jurisdictions.

Van Eck noted that although ChatGPT promises efficiencies and benefits in the legal sector, it is unreliable because, as the article explains, “information produced in response to prompts has been shown to be fabricated or fake, especially when such prompts relate to legal information”.

Turning to the case at hand, Judge Bezuidenhout observed that “an inordinate amount of legal and judicial resources were spent to find the authorities referred to in court by Ms Pillay as well as in the supplementary notice of appeal”.

To test the reliability of AI in this context, Judge Bezuidenhout conducted a brief experiment using ChatGPT, inputting the citation of the supposed Pieterse case.

“The system responded that the case did indeed exist and revolved around the powers of the Public Protector,” she noted.

When asked if the case addressed the issue of non-joinder of councillors, ChatGPT confirmed this, which Bezuidenhout stated “immediately illustrated the unreliability of it as a source of information and legal research”.

“In my view, relying on AI technologies when doing legal research is irresponsible and downright unprofessional,” Judge Bezuidenhout said.

However, Tristan Marot, an associate at Norton Rose Fulbright, believes this view oversimplifies the issue. In an article published on the firm’s website, Marot noted that although this case highlights the dangers of relying on AI as a primary source of legal knowledge, it should not discourage the appropriate and responsible use of such tools in legal practice.

“When employed thoughtfully, AI can augment practitioners’ capabilities, streamlining research, identifying patterns, and even, in appropriate circumstances and with the right tools, generating initial drafts,” he explained.

He further emphasised that these outputs must always be rigorously reviewed and verified against authoritative sources.

“Of course, this requires investment by practitioners, and the law firms within which they operate, into education and into the more advanced tools available for these purposes.”
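
Marot’s point about verification lends itself to a simple illustration. The short Python sketch below is not taken from his article or from the judgment; the VERIFIED_AUTHORITIES set, the function names and the sample citations are hypothetical stand-ins for a real check against SAFLII, the South African Law Reports or a subscription database. It shows only the principle he describes: no authority produced with AI assistance reaches a court document until it has been confirmed against a trusted source.

# A minimal sketch of a citation-verification step. The data and names below
# are illustrative only; in practice the lookup would be a search of SAFLII,
# the South African Law Reports or another authoritative database.

# Citations a human has already confirmed against the law reports (hypothetical).
VERIFIED_AUTHORITIES = {
    "Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others",
}


def is_verified(citation: str) -> bool:
    """Return True only if the citation has been confirmed against a trusted source."""
    return citation.strip() in VERIFIED_AUTHORITIES


def flag_unverified(citations: list[str]) -> list[str]:
    """Return the citations that still need to be checked before filing."""
    return [c for c in citations if not is_verified(c)]


if __name__ == "__main__":
    draft_citations = [
        "Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others",
        "Pieterse v The Public Protector",  # the fictitious authority in this case
    ]
    for citation in flag_unverified(draft_citations):
        print(f"UNVERIFIED - check the law reports before citing: {citation}")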

Marot also pointed out that the legal profession must balance innovation with accountability in light of these new technologies.

“AI is not a substitute for the judgment and expertise that define professionals,” he said. “It is a tool that can enhance those qualities when used correctly. The lesson here is not to reject AI outright but to integrate it responsibly by investing in training and fostering a culture of ethical vigilance.”

Judge Bezuidenhout directed the registrar to forward a copy of the judgment to the Legal Practice Council (KwaZulu-Natal Provincial Office) for its attention and any further necessary action.

Additionally, Surendra Singh and Associates were ordered to bear the costs for the additional court appearances on 22 and 25 September 2024.