AI is risky, use it with caution! US lawyers punished for citing fake cases generated by ChatGPT
*Source: Financial Association*
Generative artificial intelligence like ChatGPT can greatly improve office efficiency, but its tendency to "fabricate" cannot be ignored. In everyday use, a mistake may be no more than a joke, such as claiming that "Lin Daiyu uprooted the weeping willow" or that "Granny Liu drunkenly beat Jiang the Door God" (feats misattributed to characters from Dream of the Red Chamber). But a mistake in a critical domain, or at a critical moment, can be disastrous.
On Thursday local time, a New York federal judge ruled that the law firm Levidow, Levidow & Oberman had acted improperly by submitting a court brief, written with ChatGPT, that cited fabricated cases, and fined it $5,000.
Judge P. Kevin Castel noted that two attorneys at the firm consciously avoided signs that the citations were false and "made misleading statements to the court."
Levidow, Levidow & Oberman subsequently responded that its lawyers "humbly" disagreed with the malicious misleading judgment. "We made a well-intentioned mistake. We didn't expect that ChatGPT could be fabricated out of thin air."
Fabrication
In March, attorneys Peter LoDuca and Steven Schwartz, acting for their client Roberto Mata in a lawsuit against the airline Avianca, submitted a legal brief that had been drafted with the help of artificial intelligence.
The bogus cases cited reportedly included "Varghese v. China Southern Airlines", "Martinez v. Delta Air Lines", and "Miller v. United Airlines", among others. On closer investigation, however, none of these decisions could be found at all.
Confronted with the facts, attorney Schwartz admitted at the end of May that he had used ChatGPT to "supplement" his research on the case, saying he had been unaware that ChatGPT's output could be fabricated.
Castel said the two lawyers "abandoned their responsibilities" and "continued to stand by the fake opinions" even after the filings were called into question.
Castel ordered LoDuca and Schwartz, along with their law firm Levidow, Levidow & Oberman, to pay the $5,000 fine and to personally notify the judges who had been falsely identified as authors of the fabricated rulings.
"The court will not ask them to apologize because a coerced apology is not a sincere apology," Judge Custer wrote in the judgment in Manhattan federal district court. "The decision to make any apology rests with the parties involved."
There is nothing wrong with using artificial intelligence
In a separate order on Thursday, the judge granted Avianca's motion to dismiss the lawsuit. The plaintiff, Mata, claimed he was seriously injured when his knee was struck by a metal serving cart on a flight from El Salvador to New York in August 2019.
Castel held that Mata's lawsuit was filed after the expiration of the two-year limitation period set by the Montreal Convention, which governs legal claims arising from international air travel.
Castel said the lawyers might not have been punished had they come clean about using AI to write the legal documents.
Instead, even after the filings were questioned, the two lawyers continued to defend them vigorously, making false and misleading statements about the documents and their contents, which showed "bad faith."
"In researching and drafting court papers, good lawyers will, where appropriate, seek help from databases such as junior lawyers, law students, contract lawyers, legal encyclopedias, Westlaw, and LexisNexis," the judgment reads.
Castel also wrote: "Technological advances are commonplace, and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance. But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings."