Health | December 08, 2025

Clinical experts-in-the-loop are essential for AI trust

AI solutions require a human-in-the-loop for trust and reliability. Clinical GenAI tools require clinical experts to be in the loop when patient safety decisions are at stake.

Across industries, keeping humans continually in the AI loop is essential for maintaining ethics, information accuracy, and customer trust. Within healthcare, as clinical decision support (CDS) solutions integrate generative AI (GenAI) technologies, it’s imperative that these humans-in-the-loop have clinical expertise to help support safe care delivery.

Clinician experts are on the front lines using the technology, asking questions about care and medication treatments while exercising critical thinking and clinical judgment and listening to patients and their experiences. But with CDS, there should also be clinical experts behind the technology, guiding content generation throughout development.

Ultimately, clinicians need to be able to trust a generative response, including where the tool sources its information and evidence, and who stands behind it. A Wolters Kluwer survey showed 91% of physicians would feel more comfortable using GenAI tools with source content created by doctors and medical experts, and 86% expected vendor transparency regarding the information sources and creation process.

This expert-in-the-loop process was crucial to developing and maintaining UpToDate Expert AI, the new GenAI tool that draws only from UpToDate proprietary content authored and reviewed by practicing clinicians.

“We review UpToDate content regularly,” says Amanda Heidemann, MD, FAAFP, FAMIA, Senior Clinical Consultant for Wolters Kluwer Health. “We know exactly when each topic was reviewed, who touched it, who edited it, and where that information came from. And when you put AI on top of that, it becomes a really powerful tool.”

Dr. Amanda Heidemann discusses the importance of keeping clinical evidence at the center of AI tools.

Clinical AI requires experts to be the humans-in-the-loop

For the generative responses in UpToDate Expert AI, it was crucial that humans, specifically experts in clinical practice, were kept in the loop. The tool was developed alongside the clinical community, with customers supporting early product development through testing and feedback. Now that it has launched, clinical experts remain involved in three areas: the clinical intelligence foundation, platform testing, and solution usability and feedback.

Clinical intelligence

Expert content is the foundation of trusted decision support. Over 7,500 practicing clinicians and experts review the latest industry standards and journal research for clinical insights and integrate them into UpToDate content and care recommendations. It’s critical that humans review content to address any potential difficulty, bias, or harm that could come from a single study outside of a greater clinical context.

Internal platform testing

Within the UpToDate Expert AI tool, clinical experts review generative responses for consistency, reliability, and clinical accuracy. Checkpoints include assessing whether the system should have answered at all, identifying hallucinations or incorrect model patterns, and more.

Solution usability and feedback

As the GenAI solution is used in the care setting, clinical experts regularly assess feedback for continuous improvement. This feedback can come in the form of user scenario testing, interviews, and platform feedback and comments.


Transparent AI development can help build trust

These regular expert-in-the-loop checks are critical to supporting safe care delivery and decision-making, as well as shining a light into the black box of AI. With many AI tools, users aren’t aware of how the solution provides its answers and which sources it’s referencing. Without that transparency and library of content, there can be an increased risk of hallucinations.

“We knew we had to have it fully trusted,” says Holly Urban, MD, MBA, VP Business Development-Strategy for Wolters Kluwer Health. “So we built a solution that is 100% transparent in terms of what content it's looking at. It's looking at only UpToDate content, and you can see exactly within the tool itself, every single piece of content that it looked at to formulate the answer.”

GenAI in clinical decision support demands deep scrutiny, with patient care on the line. With human clinical experts in the loop, evidence-based content validation, and transparent sourcing, clinicians can feel confident in the new era of GenAI.

Dr. Holly Urban emphasizes the importance of source transparency with clinical AI tools.