3 Factors Attorneys Should Keep In Mind When Using AI

Attorneys who hope to leverage new artificial intelligence programs in their legal work should stay mindful of three rules of conduct from the American Bar Association dictating attorney competence, client confidentiality and billing procedures, according to a recently published paper from the International Association of Defense Counsel.

The report, titled "Emerging Artificial Intelligence Risk Management Considerations for Law Firms," surveys the risk management considerations law firms face when using AI tools, focusing on the model rules governing competence, confidentiality and billing. Specifically, the report addresses Model Rules 1.1, 1.5 and 1.6 of the American Bar Association's Model Rules of Professional Conduct and how law firms can address the issues they raise.

Mark J. Fucile, a partner at Portland-based Fucile & Reising LLP, authored the report.

Overall, the paper assures attorneys that, like earlier waves of law practice technology, AI tools can likely be used safely by following practical steps developed during previous periods of significant technological change: evaluating tools before use, setting clear policies, and training lawyers and staff on the ins and outs.

The paper was published alongside others on Monday in the Defense Counsel Journal, a free resource provided by the International Association of Defense Counsel, or IADC, to its members, trial attorneys, in-house counsel and the judiciary. Other topics include U.S. federal product safety regulation, Article III standing to appeal in federal court and the concept of "safetyism" among jurors.

Founded in 1920, the IADC boasts around 2,500 members across 47 countries and all 50 U.S. states.

Regarding the rule on attorney "competence," which, per the ABA, requires the knowledge, skill, thoroughness and preparation reasonably necessary for representation, Fucile suggested firms should evaluate all AI tools before use, set clear boundaries and train staff on the benefits and risks associated with the technology.

The paper noted the case Mata v. Avianca, Inc., in which an experienced personal injury attorney used ChatGPT to help draft briefs and arguments on the Montreal Convention governing claims arising on international flights — a topic the lawyer was unfamiliar with. The attorney did not vet the filings, and many of their citations had been hallucinated — the term for when AI generates inaccurate or fabricated responses to queries — by the free AI program. The attorney and his law firm were sanctioned.

According to Fucile, the case appears as a "technology-enabled version of the apocryphal lawyer of an earlier generation who read the headnotes in a paper reporter, but not the cases themselves, before including them in a brief."

The attorney's use of a "free" version of ChatGPT also raised questions about his handling of client confidentiality. The company warns that information shared with the tool is not treated as confidential, and the lawyer in Mata entered increasingly specific prompts that could have revealed sensitive client information.

"Accuracy aside (which remains the duty of the lawyer to confirm), there is nothing necessarily wrong with using a consumer-oriented product like the 'free' version of Chat GPT to research general information, just as there is nothing inherently wrong with using Google for the same purpose," the paper said. "The line of demarcation … is that a lawyer should not share client confidential information with a non-confidential technological medium."

The article also addressed questions about using AI to bill clients more effectively, making two points in particular. Fucile acknowledges that attorneys may, under the ABA rules, itemize the expense of AI tools rather than fold those costs into the overhead reflected in the firm's hourly rates, so long as the client agrees to that billing format in advance.

Fucile also said that lawyers may generally charge for the time spent using more time-efficient AI tools to accomplish underlying tasks.

The article noted that, as an example, "a lawyer would ordinarily be able to bill for the time creating research prompts for a given case in the same way the lawyer bills for doing other legal research."

"In fact, lawyers are expected to use reasonably available means that save clients' money," the article said, adding that disregarding the efficiencies brought on by advancing technologies may be seen as unreasonable under ABA rules, but attorneys are generally barred from charging for the theoretical "time saved" by using AI tools, as that time was never actually incurred by the attorney.

--Editing by Nicole Bleier.


For a reprint of this article, please contact reprints@law360.com.
