EDITOR’S NOTE: To answer frequently asked legal questions, EastIdahoNews.com is collaborating with local lawyers to provide you with information. These weekly columns will be published every Sunday afternoon. If you have any questions for the lawyers, send an email to [email protected].
In June 2023, a U.S. District Court judge disciplined two New York lawyers for submitting a legal brief that cited six fake cases invented by ChatGPT. Then, in November 2023, a Colorado lawyer who used ChatGPT to draft court petitions containing fictitious cases was suspended from practicing law for a year. And a South Carolina man was recently charged with posing as an attorney and using artificial intelligence (AI) to compose his court filings.
These are only a few of the many situations in which AI in the legal field has caused problems for lawyers and non-attorneys alike. Despite this, AI in the legal field isn’t going away anytime soon. If anything, the future seems to lie in using AI to assist with legal research and the drafting of briefs.
Case Research
Historically, attorneys conducted legal research using books and legal treatises. The arrival of the internet completely changed that practice. Even then, law schools required courses teaching attorneys how to construct proper boolean searches to locate relevant case law. AI has now transformed the process again, just as the internet did.
LexisNexis and Westlaw both offer AI case search options. Rather than crafting the ideal boolean search string, the lawyer simply asks the AI questions. The AI then produces a report that addresses the query and includes case citations. As a result, AI can surface cases that might have gone unnoticed due to inadequate search terms, and it can return answers faster than traditional search techniques.
But even though it is faster, AI can generate inaccurate or even fabricated information. The AI can invent case law out of thin air, a phenomenon known as hallucination. Worse still, when AI responds to a query, it frequently cites sources that are irrelevant or unrelated to the question.
These AI hallucinations are what landed the New York and Colorado lawyers in hot water. Had those lawyers taken the time to review the material the AI gave them, they would have discovered that some of the cases did not exist. The courts disciplined them for their failure to verify.
In my own work, I have encountered a number of AI hallucinations. In one instance, the AI informed me that my case was subject to a certain statute of limitations. This caught me off guard because I had never heard of the statute the AI cited as authority. After looking more closely at its citations, I learned the AI had pulled a statute from an entirely separate body of law that applied only in limited situations. Had I cited that statute to support my claims without double-checking the AI’s assessment, opposing counsel would have had a field day dissecting my briefing.
And that is where AI currently stands. Artificial intelligence has indeed made legal research faster, but to guard against errors and hallucinations, a lawyer should always verify the AI’s conclusions.
Drafting
Large language models, or LLMs, are the foundation of modern AI tools. An LLM is trained on vast amounts of data; ChatGPT’s original underlying model, for example, was built with more than 175 billion parameters. But a lawyer’s use of AI doesn’t have to draw on that entire sea of training data. Instead, the AI can be directed at just one or a few documents, and these focused document sets let attorneys review information and prepare papers more efficiently.
An attorney may, for instance, upload a single contract and ask the AI to summarize it or answer specific questions about it. AI can even generate time stamps and descriptions for recordings, including police body camera footage, saving the lawyer (and the client) hours of labor.
When drafting, a lawyer can have the AI draw on his complete library of existing documents to produce a new document tailored to specific parameters. Of course, the newly produced document is rarely complete. For that reason, AI is most effective when used as an outlining tool built on an attorney’s existing library, which boosts productivity.
AI is a tool, not a replacement.
Right now, AI works best as a tool for lawyers, not as a substitute for them. Even as AI advances and the LLMs behind it improve, lawyers will still be needed to represent their clients and handle the many human aspects of the law, such as equity and fairness.
W. Forrest Fischer practices law in Driggs at the Moulton Law Office. You can contact him via email at [email protected] or by phone at (208) 354-2345.
The information in this column is general and should not be construed as legal advice. Readers should speak with an attorney if they have specific legal questions. You can contact the Idaho State Bar Association’s lawyer referral service by calling (208) 334-4500.