01 April, 2019

AI's ability to scale data standardization and analytics

Applying artificial intelligence (AI) to the process of mapping disparate data to a common language empowers health care systems to scale their analytics efforts and unleash the power buried in mountains of data they continue to accumulate across their operations. The resulting insights are likely to be game changers for realizing health care’s Quadruple Aim of improving individual and population health, while lowering costs and increasing clinician satisfaction.

The challenge: Mapping disparate clinical data

For more than a decade, health systems have invested heavily in digitizing every aspect of their operations. Though some have realized significant returns, for many, the gains have not yet lived up to the promise or the level of their investments.

That’s because digitization alone is not enough. Standardizing data across systems is the key to applying analytics and gleaning insights, but disparate systems – exacerbated by health industry consolidation – have complicated standardization efforts. According to a May 2018 article in Healthcare IT News, the typical health network engages with as many as 18 different electronic medical record (EMR) vendors.

The problem becomes clear when one considers, for example, a common lab test used to identify and quantify leukocytes in cerebrospinal fluid. Clinicians, labs and billing systems identify the test in dozens of different ways across multiple systems, leading to semantic interoperability challenges that undermine the reliability of any analysis of how the test is used, or of its efficacy and appropriateness for certain conditions. In a paper published in the Journal of the American Medical Informatics Association, researchers who analyzed 21 EHRs found 615 observation errors and data variations in the CDA (Clinical Document Architecture) documents used to share patient information between systems.

Of course, this problem is not new, and standards do exist to reliably normalize and exchange clinical health information. LOINC – which assigns names and identifiers to more than 58,000 medical terms that might appear in an EHR – is one such standard. It allows organizations to identify the multiple local terms for a test or procedure and tag every instance with a common LOINC code. And when health systems can justify the effort and investment of having data experts – aided by emerging rules and automation – carry out this laborious mapping process, they often reap rewards.

Until now, however, they have not been able to significantly speed and scale the process.

The solution: Automating mapping to standards at scale

The maturation of AI is the route to efficient scalability. We can now build machine learning (ML) models that reliably recognize slightly different terms that mean the same thing and map them to a standard like LOINC.

Because ML models learn from each mapping they make, the approach scales to increasingly large data sets. For example, Wolters Kluwer's Sentri7 surveillance solution uses AI to automatically normalize lab data to industry-standard terminologies like LOINC. In one instance, this has resulted in more than 300,000 mapped terms to support clinical surveillance of life-threatening infections and other medical conditions.

Automating mapping to standards at scale eliminates much of the data ambiguity across disparate systems and even facilitates new understanding of previously un-coded and non-standard data. Clinicians' confidence increases because they no longer have to worry about misrepresentation of patients within clinical quality measures, inaccurate real-time care alerts, or unreliable decision-support rules for drug interactions and doses, allergies and medical history.

In short, AI gives health systems a tool to make sense of all of their data across the organization far more rapidly and inexpensively than standardization efforts have managed to date. They can analyze patient needs, population health and provider performance across key quality and cost measures to make more informed decisions for nearly every aspect of care delivery.

Health Language Clinical Natural Language Processing
Automate the review of unstructured data, extract clinically relevant data, and codify extracted data to industry standards.