One of the more brilliant Pixar-Disney creations is The Incredibles series, about a family with superpowers trying to restore the public’s trust in superheroes while balancing regular family life. The not-as-great sequel, Incredibles 2, still features Holly Hunter, which is good, and baby Jack-Jack’s fight scene with a raccoon is a classic. But the kind-of-a-stretch storyline centers on defeating a villain called the “Screenslaver,” who hypnotizes people through screens into doing terrible things, like attempting to assassinate the U.S. Ambassador (who is clearly supposed to be Madeleine Albright).

In real life, the FDA is trying to keep software from hypnotizing doctors and nurses into making bad medical decisions. This week, the FDA released new guidance explaining how it will regulate certain AI tools as medical devices, including tools that predict sepsis (sometimes called blood poisoning). Because sepsis is a fast-moving medical emergency that is difficult to diagnose, hospitals use software that sounds an alarm when a patient may be developing sepsis. That can be life-saving, but AI-enabled sepsis detection tools, which have not been regulated as medical devices so far, can cause “alert fatigue,” leading doctors and nurses to ignore alarms that cry wolf too often.

While we are sure that none of these software solutions will lead to ambassador assassination attempts, or even fights with raccoons, it may be a good idea for the FDA team to wear incredible costumes when deciding which clinical decision software to regulate.
September 30, 2022