The lawyer sued the airline, but the case crashed because of a robot

An American lawyer got into trouble after letting a robot do his job. The judge soon realized that the filing was full of fabrications.

The airline Avianca was sued, but at first its lawyers seemed headed for an easy win.

On a flight, Roberto Mata was injured when a flight attendant's serving cart struck his knee. Mata hired lawyers to seek compensation.

The airline, Avianca, argued that the case should be dismissed. Judge Kevin Castel asked Mata's attorney to explain why a full hearing of the case should be held.

Soon after, attorney Steven Schwartz returned with a ten-page brief. He cited cases such as Martinez v. Delta Air Lines and Varghese v. China Southern Airlines, along with several others. In this way, he wanted to show that there was solid precedent for the court to hear such cases.

New York Judge Kevin Castel was not impressed with the court papers he received.

Pure fabrication

The airline's lawyers pounced on the document, and the judge began examining it as well.

They quickly discovered that something wasn't right. None of the cases Schwartz referred to could be found. Even googling the quotes attributed to other jurists turned up nothing.

Schwartz had to appear in court to explain himself. According to The New York Times, he has more than 30 years of experience as a lawyer. He immediately put all his cards on the table.

For the first time in his life, he claims, he had asked the ChatGPT bot to use its artificial intelligence to find arguments. The result seemed convincing: clear accounts of earlier cases, complete with source references and footnotes. But everything turned out to be pure invention.

ChatGPT is just the beginning. Experts predict that accuracy will improve considerably over time.

Asked the bot to check itself

CNN writes that Schwartz "verified the quality" of the information by asking the bot whether it was sure the information was correct. When it answered yes, he accepted that.

The bot even apologized that its first answer might have sounded a bit confusing.

The online publication Above the Law, which is aimed at lawyers and other jurists, has also written about the case. It firmly distances itself from commenters who blame ChatGPT. Its journalist argues that rather than being an example of artificial intelligence falling short, this is a story about a lawyer who took shortcuts.

Passed the exam

In January, Reuters wrote about an experiment in which law professors graded exam papers. Some of the answers were written by ChatGPT.

While the real students got a B+ grade on average, the robot had to settle for a C+.

Researcher Jonathan Choi concluded that, on its own, ChatGPT was a fairly mediocre student. He added that the bot could still be useful to many lawyers, who can use AI to write first drafts when preparing cases.

Attorney Steven Schwartz said that a robot's work should be treated as a draft, and that everything should be quality-assured.

Mixed facts with fiction

The blog SCOTUSblog covers the Supreme Court of the United States. Its journalists asked ChatGPT 50 questions about the country's highest court. It did not go well.

Only 21 of the answers were completely correct. 26 were outright wrong, and the rest were of questionable quality.

The bot mixed facts with pure fabrication. For example, the conservative Justice Clarence Thomas was portrayed as very gay-friendly; his voting record on the court shows just the opposite.

Testing has also shown that ChatGPT can give completely different answers to the same question, sometimes correct and sometimes plain nonsense.

Hanisi Anenih

