A new analysis from the Duke-Margolis Center for Health Policy details the data elements that real-world data (RWD) sources need when evaluating the performance of artificial intelligence (AI)-supported clinical decision support (CDS) tools.
Non-standardised electronic health record (EHR) systems and continually changing clinical workflows are major impediments to evaluating the performance of these tools. RWD can provide the performance measures needed to assess these AI-supported tools, though using it raises issues of data access, data sharing, privacy, and security.
The U.S. Food and Drug Administration (FDA) is interested in using RWD in the post-market evaluation of AI-supported CDS tools in healthcare, which are often regulated as software medical devices. The FDA divides AI into two categories depending on how it is constructed.
- Rules-based AIs use previously validated information to determine clinically accepted weights or decision steps that support a prediction, diagnosis, or recommendation.
- Data-based AIs use training data sets and programmed processes to derive relationships and then categorise new data. This forms the basis of the support guiding predictions, diagnoses, and recommendations. One advantage these AIs have is that they can continue to learn from new data to fine-tune derived relationships.
The analysis identified the following data elements needed to evaluate AI-supported CDS performance:
- Model output includes the risk score, diagnosis, or a suggested action. Intended use would guide the construction of this data element, which would likely reside in an EHR data system.
- Comparison to output is the observed outcome, such as rehospitalisation or death. Intended use would guide the construction of this data element, which would likely reside in an EHR data system, claims data, or a registry.
- Operational data could include medication administration, procedures, or patient actions. Intended use would guide the construction of this data element, which would likely reside in an EHR data system, claims data, or a registry.
- Demographic subgroup analysis variables could include age, sex and race, insurance status, and medical history. These could originate from EHRs, claims data, registries, or internal device data.
Additionally, RWD could capture whether healthcare providers acted on the algorithm’s recommendations, or whether other factors, such as early action or additional treatments, confounded the algorithm’s performance.
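To make these data elements concrete, the sketch below shows one way a post-market check might combine them. It is purely illustrative: the record fields, the 0.5 risk threshold, and the subgroup variable are assumptions, not part of the analysis. It links a model’s output (risk score) to the observed comparator outcome, computes sensitivity overall and per demographic subgroup, and filters out encounters where a clinician acted on the recommendation, since such action can confound the observed outcome.

```python
# Illustrative sketch only: fields, threshold, and subgroup choice are hypothetical.
from dataclasses import dataclass

@dataclass
class LinkedRecord:
    risk_score: float   # model output, e.g. drawn from the EHR
    outcome: bool       # comparator, e.g. rehospitalisation (EHR, claims, or registry)
    acted_on: bool      # operational data: did the clinician act on the recommendation?
    age_group: str      # demographic subgroup variable

def confusion_counts(records, threshold=0.5):
    """Count TP/FP/FN/TN comparing model output against the observed outcome."""
    tp = fp = fn = tn = 0
    for r in records:
        predicted = r.risk_score >= threshold
        if predicted and r.outcome:
            tp += 1
        elif predicted and not r.outcome:
            fp += 1
        elif not predicted and r.outcome:
            fn += 1
        else:
            tn += 1
    return tp, fp, fn, tn

def sensitivity(records, threshold=0.5):
    """Share of observed events the model flagged; None if no events occurred."""
    tp, _, fn, _ = confusion_counts(records, threshold)
    return tp / (tp + fn) if (tp + fn) else None

def subgroup_sensitivity(records, threshold=0.5):
    """Per-subgroup sensitivity, to surface performance gaps across populations."""
    groups = {}
    for r in records:
        groups.setdefault(r.age_group, []).append(r)
    return {g: sensitivity(rs, threshold) for g, rs in groups.items()}

def unconfounded(records):
    """Restrict to encounters where no one acted on the model's recommendation."""
    return [r for r in records if not r.acted_on]
```

Re-running the same measures on `unconfounded(records)` is one simple way to separate the algorithm’s predictive performance from outcomes altered by clinicians acting on its output.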
One challenge in using RWD to assess performance is that the necessary representative data may be unavailable. In particular, data may need to be linked across siloed databases, which requires de-identification. Creating centralised data repositories is one solution, and the analysis authors recommend that collecting such data be addressed at the policy level. Sharing patient data brings privacy concerns that can be addressed through legislation; regardless, cloud computing and storage platforms need to remain secure.
Ensuring data quality is another challenge. This arises from the following conditions:
- Inconsistent and inefficient methods for capturing relevant data
- Lacking data on patient health status and long-term outcomes
- Lacking data on how physicians use (or do not use) algorithms’ output
- Lacking data across the continuum of a patient’s care
- Systemic biases within data sources that confound locating representative data
- Inconsistent data elements or definitions for data elements across sites of data collection
The authors of the analysis make the following recommendations:
- AI-supported CDS tools should be monitored for potential performance changes over time.
- Specific data elements that should be captured within RWD sources include:
  - Algorithm inputs and outputs
  - Gold-standard comparators
  - Patient outcomes
  - Details on whether the healthcare professional acted on the algorithm’s recommendations
- Data collection parties must resolve how to exchange data securely and how to govern data sharing and use from a legal perspective.