11th Circuit Opinion Determined by A.I.

An insurance company issued a policy to a landscaper, and the landscaper later decided to install an in-ground trampoline. A third party then sued the landscaper for negligence arising from the trampoline. When the landscaper filed a claim with the insurance company, the company refused to cover it. The question on appeal was this: whether installing an in-ground trampoline counts as “landscaping” under the policy.

In a concurring opinion, Judge Kevin Newsom of the Eleventh Circuit wrote that he had done something many will “condemn as heresy.” Judge Newsom described how he used artificial intelligence (A.I.), namely ChatGPT and other large language model (LLM) products, to help him work through the question. Since the question on appeal was whether installing a trampoline counted as “landscaping,” Judge Newsom set out to determine what “landscaping” means. He began with traditional legal research, combing through case after case in the hope that one of them had set out the definition of “landscaping,” however dense and hard to understand that definition might be. It was in the middle of this research that he had the thought that maybe, just maybe, “ChatGPT might be able to shed some light on what the term ‘landscaping’ means.”

Judge Newsom then did the unthinkable: he asked ChatGPT, “Is installing an in-ground trampoline ‘landscaping’?” ChatGPT responded, “Yes, installing an in-ground trampoline can be considered a part of landscaping. Landscaping involves altering the visible features of an outdoor area for aesthetic or practical purposes, and adding an in-ground trampoline would modify the appearance and function of the space. It is a deliberate change to the outdoor environment, often aimed at enhancing the overall landscape and usability of the area.”

The very task Judge Newsom faced, he handed off to A.I. While that might seem an absurd thing for a judge to do, Newsom explained that for a textualist and self-described “plain language guy,” consulting A.I. can be beneficial as an aid alongside dictionaries and thesauruses. He acknowledges that A.I. can seem to have a mind of its own and can hallucinate cases. But Judge Newsom also understands that these models learn from vast amounts of language data, which gives them the ability “to accurately predict how normal people use language in their everyday lives.” A.I. is well suited to moments like this one: offering definitions in plain language that any judge, juror, or lawyer can understand. A.I. is not about to replace judges, and we will not be presenting our cases to a robot anytime soon, but this case shows just how useful A.I. can be when it is put to the right use.
