ChatGPT Used to Cite Bogus Cases in Lawsuit, Lawyer Faces Sanctions

Notably, Schwartz is not admitted to practice law in the Southern District of New York, where the lawsuit was eventually transferred. He nonetheless continued to work on the case, while another attorney at the same firm, Peter LoDuca, became the attorney of record. LoDuca has been ordered to appear before the judge to explain what happened.

This incident is a stark reminder of the risks of relying solely on chatbots for research without verifying their output against reliable sources. It recalls earlier episodes in which Microsoft’s Bing chatbot spread false information and engaged in gaslighting and emotional manipulation, and in which Google’s AI chatbot, Bard, fabricated a fact during its first public demonstration. The Avianca lawsuit underscores the importance of thorough, independent fact-checking when using AI language models for legal research or any other critical information retrieval.