
Epic Deterioration Index AI Failed to Predict COVID-19 Patient Deaths

High

Epic's widely deployed AI early warning system failed to accurately predict COVID-19 patient deterioration, missing the majority of cases according to University of Michigan research.

Category
Medical Error
Industry
Healthcare
Status
Reported
Date Occurred
Mar 1, 2020
Date Reported
Jul 15, 2021
Jurisdiction
US
AI Provider
Other/Unknown
Model
Epic Deterioration Index
Application Type
embedded
Harm Type
physical
Human Review in Place
Yes
Litigation Filed
No
distribution_shift, early_warning_systems, covid19, epic_systems, patient_safety, model_drift

Full Description

Epic Systems' Deterioration Index, an AI-powered early warning system deployed across hundreds of US hospitals, experienced significant performance failures during the COVID-19 pandemic. The system, designed to predict patient deterioration by analyzing electronic health record data including vital signs, lab values, and clinical notes, had been widely adopted by healthcare systems to help clinicians identify high-risk patients requiring immediate attention.

A study conducted by researchers at the University of Michigan and published in 2021 revealed critical flaws in the system's performance during the pandemic. The research team analyzed the Deterioration Index's predictions for COVID-19 patients at Michigan Medicine and found that the AI system missed the majority of patients who actually deteriorated. The study demonstrated that the model's sensitivity dropped significantly when applied to COVID-19 cases compared to its baseline performance on pre-pandemic patient populations.

The fundamental issue stemmed from distribution shift: the AI model had been trained on historical hospital data that did not include COVID-19 patients. When the pandemic began, the clinical presentation, disease progression patterns, and deterioration indicators for COVID-19 patients differed substantially from the training data. Traditional early warning signs that the model relied upon did not apply to the novel coronavirus, which could cause rapid deterioration through unique pathways including cytokine storms and silent hypoxia.

The performance degradation had serious implications for patient care, as hospitals relied on these AI alerts to prioritize nursing attention and clinical interventions. Missed alerts could delay critical care decisions, potentially contributing to worse patient outcomes during a period when healthcare systems were already overwhelmed.
The incident highlighted broader vulnerabilities in deployed AI systems when faced with novel scenarios not represented in training data, raising questions about the robustness of AI tools in healthcare during emergencies.

Root Cause

The Epic Deterioration Index was trained on historical hospital data that did not include COVID-19 patients, creating a distribution shift problem where the model's predictions became unreliable when applied to a novel disease with different clinical presentations and deterioration patterns.
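The kind of distribution shift described here can often be caught before outcomes are known, by comparing the deployment-time distribution of an input feature against its training-era baseline. Below is a minimal sketch of one standard drift metric, the Population Stability Index (PSI); the feature values are simulated and not drawn from the incident, and the specific thresholds are illustrative assumptions, not Epic's.

```python
import numpy as np

def population_stability_index(baseline, current, bins=10):
    """PSI between a training-era feature distribution and a
    deployment-era one. A common rule of thumb treats PSI > 0.25
    as a major shift warranting investigation."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    # Widen the outer edges so out-of-range current values are counted.
    edges[0], edges[-1] = -np.inf, np.inf
    p = np.histogram(baseline, bins=edges)[0] / len(baseline)
    q = np.histogram(current, bins=edges)[0] / len(current)
    eps = 1e-6  # avoid log(0) for empty bins
    p, q = np.clip(p, eps, None), np.clip(q, eps, None)
    return float(np.sum((q - p) * np.log(q / p)))

# Hypothetical example: respiratory rates shift markedly in the COVID era.
rng = np.random.default_rng(0)
resp_rate_2019 = rng.normal(18, 3, 5000)   # simulated pre-pandemic baseline
resp_rate_covid = rng.normal(26, 6, 5000)  # simulated shifted population
psi_shift = population_stability_index(resp_rate_2019, resp_rate_covid)
print(f"PSI = {psi_shift:.2f}")
```

A check like this runs on model inputs alone, so it can fire within days of a new patient population arriving, long before deterioration labels accumulate.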

Mitigation Analysis

Robust model monitoring could have detected the performance degradation early in the pandemic. Continuous retraining on COVID-era data, combined with clinician validation of alerts, would have improved accuracy. Reducing alert fatigue and calibrating confidence thresholds could have made the system more reliable under distribution shift.
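One concrete form the monitoring described above could take is a rolling sensitivity check: once deterioration outcomes are recorded, compare the model's recall in each recent window against its validated baseline and flag any window that falls below tolerance. This is a generic sketch with simulated data and assumed baseline/tolerance values, not the actual Epic pipeline.

```python
import numpy as np

def rolling_sensitivity_alarm(labels, alerts, window=200,
                              baseline=0.75, tolerance=0.15):
    """Flag windows where sensitivity (fraction of true deteriorations
    the model alerted on) drops below baseline - tolerance.
    labels: 1 = patient actually deteriorated; alerts: 1 = model fired."""
    flagged = []
    for start in range(0, len(labels) - window + 1, window):
        y = labels[start:start + window]
        a = alerts[start:start + window]
        positives = y.sum()
        if positives == 0:
            continue  # no deteriorations in this window to evaluate
        sensitivity = (y & a).sum() / positives
        if sensitivity < baseline - tolerance:
            flagged.append((start, float(sensitivity)))
    return flagged

# Hypothetical stream: the model catches ~80% of deteriorations at first,
# then only ~30% after a simulated population shift at patient 500.
rng = np.random.default_rng(1)
y = (rng.random(1000) < 0.2).astype(int)
hit = np.where(np.arange(1000) < 500,
               rng.random(1000) < 0.8,
               rng.random(1000) < 0.3)
alerts = (y.astype(bool) & hit).astype(int)
flagged = rolling_sensitivity_alarm(y, alerts)
print(flagged)
```

Because sensitivity needs ground-truth outcomes, this check lags the PSI-style input drift checks, which is why the two are complementary: input monitoring gives the early signal, outcome monitoring confirms the clinical impact.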

Lessons Learned

This incident demonstrates the critical need for continuous monitoring of AI system performance in healthcare, especially during unprecedented events. It underscores the importance of having human oversight and alternative safety nets when AI systems encounter distribution shifts.