ChatGPT couldn't have written a better outcome for the lawyers who used the AI chatbot to file a lawsuit filled with citations of completely non-existent cases.
On Thursday, a federal judge decided not to impose sanctions that could've derailed the careers of attorneys Steven Schwartz and Peter LoDuca of the law firm Levidow, Levidow & Oberman.
Judge P. Kevin Castel instead let the lawyers off with a slap on the wrist: A $5,000 fine for acting in "bad faith."
In essence, the judge fined the two attorneys for offering "shifting and contradictory explanations" and initially lying to the court while defending a filing that cited six cases that simply did not exist.
Castel also ordered the lawyers to notify the judges whom ChatGPT had falsely listed as the authors of the six fake cases it created out of whole cloth. While the cases were made up, the judges ChatGPT attached to them all exist.
The judge felt the subsequent apologies from the lawyers sufficed and did not warrant further sanctions.
In his ruling, Castel noted that he didn't have a problem with the use of AI in law. However, the lawyers were negligent in their duty to make sure the research was accurate.
“Technological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance,” the judge said. “But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”
While things could've gone much worse for Schwartz and LoDuca, the law firm is considering an appeal.
“We respectfully disagree with the finding that anyone at our firm acted in bad faith," Levidow, Levidow & Oberman said in a statement. "We have already apologized to the Court and our client. We continue to believe that in the face of what even the Court acknowledged was an unprecedented situation, we made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth.”
This saga began when a client of the firm wanted to sue an airline after allegedly injuring their knee on a flight. Schwartz took up the case and used ChatGPT for his legal research. The AI chatbot returned six supposedly similar prior cases it claimed to have found, and the lawyer included them in his filing. Everything was signed off by LoDuca, who was technically representing the client because he is admitted to practice in federal court, whereas Schwartz is not.
Unfortunately for the two lawyers, ChatGPT had completely fabricated those six cases, and the pair tried to argue their way out of admitting they had relied entirely on an AI chatbot without verifying its claims.
As for the underlying case their client brought against the airline, the judge tossed it because the statute of limitations had expired.
Topics: Artificial Intelligence, ChatGPT, OpenAI