Would it surprise you if I told you that a popular and well-respected machine learning algorithm developed to predict the onset of sepsis has shown some evidence of racial bias?[1] How can that be, you might ask, for an algorithm that is simply grounded in biology and medical data? I’ll tell you, but I’m not going to focus on one particular algorithm. Instead, I will use this opportunity to talk about the dozens and dozens of sepsis algorithms out there. And frankly, because the design of these algorithms mimics many other clinical algorithms, these comments will be applicable to clinical algorithms generally.