Chicago-based Smokeball, which primarily serves small and Mid-Law firms, published its annual State of Law survey Monday, finding that 53% of respondents at small and solo firms had adopted AI software, an increase from 27% last year.
Still, only 33% of respondents felt comfortable using AI-generated legal research, and 39% expressed unease about relying on AI outputs. Fifty-three percent also reported ethical concerns related to AI, a slight decrease from last year's 56%.
"AI is no longer a buzzword in legal circles, but a competitive necessity," Hunter Steele, Smokeball's chief executive, said in a statement on Monday. "What's most encouraging is seeing small firms and solo practitioners leading this technological step forward. They're discovering that AI is amplifying their capabilities, allowing them to focus on the strategic work that requires their human expertise while technology handles administrative tasks."
The survey collected responses from 310 legal professionals in December. Respondents included partners, associates, paralegals, legal support staff and firm administrators at firms with 50 or fewer employees.
While 54% believed AI can reduce legal service costs and 69% were willing to invest time in learning AI tools, 46% believed that lawyers should not charge the same rates when using AI, "signaling a need to reevaluate billing structures," according to the study.
Nearly half of the respondents also remained unsure about regulations. Twenty-seven percent believed that certification from state bar associations would help establish trust, while 43% stated that clear disclosure is essential for maintaining confidence in AI-assisted legal services.
"The data indicates legal research, document creation and e-discovery will be the three areas most significantly impacted by AI in the next one to five years," the study said.
Smokeball provides software for document automation, email management and analytics. Its billing software handles time tracking, generates and sends invoices, and helps firms comply with accounting regulations. The company offers its billing software to more than 20 U.S. bar associations, giving it reach to 500,000 lawyers in the country.
Several studies related to AI in law — some of them with conflicting results — have been published during the first quarter of the year.
A Suffolk University professor earlier this year posted a Social Science Research Network paper, primarily produced by OpenAI's ChatGPT, arguing that generative artificial intelligence would redefine legal scholarship. Meanwhile, Magic Circle firm Linklaters LLP released a report last month recommending that users of OpenAI o1 and Google Gemini 2.0 avoid relying on the programs for English law advice without human supervision.
Earlier this month, professors at the University of Minnesota and University of Michigan published a study involving 127 law school students that concluded newer AI programs like OpenAI o1 and VLex's Vincent AI accelerated the completion of legal work and provided satisfactory, if not perfect, results in litigation-oriented tasks.
A Law360 Pulse survey, published March 4, found that more attorneys appear to be using generative AI tools and view them more positively than last year, but that lawyers remain concerned about legal ethics and client confidentiality when it comes to the technology.
Attorneys are also still being sanctioned for improper use of the technology. In February, a Wyoming federal judge sanctioned attorneys from Morgan & Morgan PA and the Goody Law Group after they filed pretrial motions containing case law hallucinated by AI, while a federal magistrate judge in Indiana recommended sanctions against a Texas lawyer for filing three separate briefs that included fake citations generated by AI.
--Editing by Adam LoBelia.
Social Science Research Network is owned by Elsevier, a division of RELX Group, which also owns Law360.