Legal | March 30, 2026

Legal experts discuss scaling AI in first webinar on Wolters Kluwer 2026 Future Ready Lawyer Survey

By: Anne H. Gibson, J.D., LL.M.

Artificial intelligence is having a significant impact on the legal landscape, including on revenue, workflows, and talent and hiring. Nearly all legal professionals are now using at least one AI tool on a daily basis and most report time savings, according to the Wolters Kluwer 2026 Future Ready Lawyer (FRL) Survey Report, which was released on March 10. The survey, to which respondents from the United States, China, and nine European countries contributed, reveals the complex ways that lawyers, law firms, and legal departments are adopting, and adapting to, AI tools and how such tools are changing the very structure of the legal market.

Grégoire Miot, Director of Product Management for Legal & Regulatory at Wolters Kluwer, and President of the European Legal Technology Association (ELTA), moderated the webinar, “Scaling AI Across Organizations,” the first in a planned series of webinars highlighting the survey’s findings. The panel included Elgar Weijtmans, Head of Technology, HVG Law; Ken Crutchfield, CEO, Spring Forward; Anne Graue, President & Co-Founder of "Our Legal Community"; Tomasz Zalewski, founder of LegalTech Poland Foundation; and Tom Braegelmann, LL.M., Annerton.

Key findings of the 2026 FRL Survey

Scaling AI – From pilots to widespread adoption

In his opening remarks, Miot pointed out that this seventh edition of Wolters Kluwer’s Future Ready Lawyer report marked a turning point: lawyers were no longer considering whether to incorporate AI in their practices; they were adapting to it. More than 90% of the survey respondents reported using at least one AI tool on a daily basis in their work.

Turning to the panelists, Miot asked how firms or legal departments can continue to move from AI pilot programs to more general adoption. Weijtmans suggested that firms may initially have rolled out a single, general AI tool broadly, in part because deliberately selecting specific tools takes considerable time and effort. Now, however, he suggested, the key step will be to ask, “Which problem are we actually solving?” Additionally, once the appropriate tool is chosen, the technology – the AI tool itself – is only 20% of the solution; the other 80% is training the individuals who will be using the tool so that they can use it effectively.

Crutchfield noted that he sees adoption varying by segment – large law firms, small firms, in-house counsel – and that solutions are best chosen carefully to meet the particular needs of the attorneys who will be using them. He also echoed Weijtmans’ sentiment, noting that in order to use existing AI tools efficiently and to achieve quality results, user training is a necessity. He felt that this is where the bottleneck in scaling adoption will be.

Braegelmann pointed out that AI has the chance to greatly reduce the amount of “drudgery” in legal work – such as copying items from a PDF to an Excel sheet or entering items into a legal database. He pointed out that technology innovations in the past – such as word processing, email, and cloud storage – were all quickly adopted by the legal industry, and this is also the case with AI tools. “Lawyers are innovation leaders in this field.”

Lawyers are innovation leaders in [AI adoption].

Governance and trust

The group next turned to the issue of governance and trust. 46% of the FRL Survey respondents cited data privacy and cyber threats as top security concerns. The panelists agreed that these issues have been and should continue to be broad concerns of lawyers, whether related to AI or not. They pointed out that existing stringent protections for data handling, both voluntary and obligatory, already provide good frameworks. These should be followed in all cases, including when using AI tools.

One issue the panelists noted was that data privacy violations are more likely to occur with the use of “shadow AI” – when a lawyer uses an unapproved tool that is publicly accessible or uses a tool provided by the firm in an unapproved way. Zalewski noted that some clients are asking firms not to use AI with their data, because of a well-founded suspicion that some lawyers are using publicly available AI tools. He suggested that a solution is for the firms to provide good quality, safeguarded AI tools for their lawyers, and clients can then be informed of the data privacy protections that are in place.

Braegelmann noted that the quote, “As soon as it works, no one calls it AI anymore,” is applicable here. Many common, existing workplace tools are types of AI, from spell-check to online searches, and they are now readily accepted. Lawyers should simply be as careful in using AI tools as they are with client data in all other cases. Young attorneys want to use tools to help them do their best work, and clients appreciate it when a task can be completed in two billable hours rather than fifty. If firms enable their attorneys to use quality tools that comply with data privacy laws, everyone will benefit.

Operational and workflow changes

The panelists engaged in a lively discussion about how AI may change workplace culture and processes in the legal field. The FRL Survey found that more than half of participants believed that certain tasks impacted by AI – legal research, document review, and document drafting, in particular – will be increasingly outsourced to alternative legal service providers (ALSPs).

The panelists were largely in agreement with this assessment and discussed the important distinction between tasks that lawyers must complete – legal advice, or “core” legal work – and tasks that do not require a lawyer – providing legal information, or “context” work – that can be outsourced to ALSPs. However, Crutchfield made an important observation – these entities should at this point be referred to as “LSPs” because they are no longer “alternative,” but rather the standard choice for many tasks.

Regardless of the power of AI, there will remain a core set of tasks that will require a human lawyer to complete. However, many of the more rote or time-consuming tasks can be outsourced to LSPs.

Regulatory readiness

The FRL Survey found that 81% of legal professionals felt that developing regulations governing the use of AI will be highly impactful. Approaches to AI regulation currently vary considerably across regions such as the United States, the European Union, and China. Miot asked the panel what it thought about the potential jurisdictional differences that might arise as regulations are implemented.

Zalewski noted that a key issue in this area is confidentiality and how it interacts with cross-border data transfers. This, Zalewski said, is similar in many ways to issues that arose when cloud-based data storage methods were first adopted. He stated that, currently, most AI vendors are located in the United States, outside of the European Union and the coverage of the General Data Protection Regulation (GDPR). A concern for entities in Europe is that they must count on the vendor to maintain protections for client data that meet the requirements of the GDPR. This is causing many firms to look for local models and solutions.

Lawyers are no longer considering whether to adopt AI – they are adapting to it.

Operating models and the billable hour

AI has the power to alter both the way people work and the way companies do business. 52% of respondents reported that their organizations have seen revenue growth after adopting AI, and 62% expect AI-driven efficiencies to reduce billable hours. Miot asked the panel how AI might change the way lawyers work and their reliance on the billable hour. Crutchfield noted a trend in outside counsel guidelines in which many clients state that they will not pay for certain things, including the use of AI – a trend that, he suggested, is driving up billable hour rates.

Zalewski noted that one solution is to use set fees. An attorney could start from his or her billable hour rate, estimate the time and effort a task will take, and then translate that into a set fee for the client that incorporates the attorney’s overhead. That gives clients the predictability they are looking for while also keeping the price adequate to cover the tools the attorney uses. He suspects this may become a more common billing method moving forward.

Looking forward – Training as the key to success in scaling AI

Miot concluded the discussion by asking the panelists what they thought people get wrong about AI. Graue suggested that a misunderstood point is that AI is not about the tools, but rather about change management, adapting workflows, and adopting AI in the daily habits of the workforce. Several other panelists echoed this sentiment. As Zalewski noted, people may want to wait for the “best” AI model, but the reality is that better training of the human users vastly improves the outputs of AI tools and their contributions to efficiency. Rather than waiting for the best tool, companies should train their employees now to get the best outcomes and the widest adoption.

Anne H. Gibson, J.D., LL.M.
Senior Legal Analyst
Anne H. Gibson is a Senior Legal Analyst for Wolters Kluwer Legal & Regulatory U.S. She has worked in the tax field since 2010, with a focus on personal income tax and international reporting obligations, estate and gift tax, and estate planning. She has private practice experience in the areas of trust and estate planning and compliance with U.S. tax obligations and has taught on such topics as income taxation of trusts and estates, estate planning, and more.