A lawyer recently used ChatGPT to carry out legal research for a court case, which led to the filing of false information. The case involved a man, Roberto Mata, suing an airline, Avianca, after a serving cart injured his knee during a flight.
The plaintiff’s legal team submitted a brief citing numerous earlier court decisions to support their argument. However, the airline’s attorneys discovered that several of the cited cases did not exist, and they promptly informed the court.
Judge Kevin Castel, who presided over the case, expressed surprise at what had happened, calling it an “unprecedented circumstance.” In an order, he demanded an explanation from the plaintiff’s legal team.
The lawsuit was initially brought in state court; after it was moved to Manhattan federal court, Mata’s lawyer, Steven Schwartz, said he had used OpenAI’s well-known chatbot ChatGPT to “supplement” his own research.
A lawyer used ChatGPT to do "legal research" and cited a number of nonexistent cases in a filing, and is now in a lot of trouble with the judge 🤣 pic.twitter.com/AJSE7Ts7W7
— Daniel Feldman (@d_feldman) May 27, 2023
ChatGPT supplied Schwartz with references to several supposedly comparable cases, including Varghese v. China Southern Airlines, Shaboon v. Egyptair, Petersen v. Iran Air, Martinez v. Delta Airlines, Estate of Durden v. KLM Royal Dutch Airlines, and Miller v. United Airlines.
However, further investigation revealed that these cases were not real and had been entirely fabricated by ChatGPT.
Neither Avianca’s defense team nor the presiding judge was able to locate any of these court rulings.
In light of this incident, both lawyers involved in the case, Peter LoDuca and Steven Schwartz of Levidow, Levidow & Oberman, have been summoned to a disciplinary hearing on June 8 to explain their conduct.
The incident has sparked debate in the legal profession about the acceptable use of AI tools in legal research and the need for clear rules to prevent similar failures.
Source: The New York Times