Over 62 percent of American adults use an AI voice assistant like Siri or Alexa in their everyday lives. Statistically speaking, some of those roughly 160.7 million people are bound to encounter a person suffering a health emergency in the near future. And while asking Siri how to properly perform CPR may not be the first thought in such a stressful scenario, it hypothetically could open up an entirely new area for AI assistance. Unfortunately, new research indicates these products aren’t equipped to help out in life-threatening situations—at least, for now.
According to a study published in JAMA Network Open on Monday, fewer than 60 percent of voice assistant responses across Alexa, Siri, Google Assistant, and Microsoft Cortana included concise information on CPR when asked. Across those same services, only about a third of responses gave any sort of actionable CPR instructions.
Speaking with CNN on August 28, lead study author Adam Landman—Mass General Brigham’s chief information officer and senior vice president of digital, as well as an attending emergency physician—explained that the researchers found CPR-related answers from “AI voice assistants… really lacked relevance and even came back with inconsistencies.”
To test their efficacy, the team posed a series of eight CPR instructional questions to the four major AI assistant programs. Of the resulting responses, just 34 percent provided verbal or textual instructions, while 12 percent offered only verbal answers. Fewer than a third of responses suggested calling emergency medical services.
Even when CPR instructions were provided, however, voice assistant and large language model text responses varied greatly by product. Of 17 instructional answers, 71 percent described hand positioning, 47 percent described depth of compression, and only 35 percent offered a suggested compression rate.
There is at least one silver lining to AI’s middling performance grade: researchers now know where, specifically, improvement is most needed. Landman’s study team believes there is ample opportunity for tech companies to collaborate on delivering standardized, evidence-based emergency medical information to everyday AI assistant users in times of crisis.
“If we can take that appropriate evidence-based content and work with the tech companies to incorporate it, I think there’s a real opportunity to immediately improve the quality of those instructions,” Landman told CNN.
The study authors suggest that technology companies need to build CPR instructions into the core functionality of voice assistants, designate common phrases to activate CPR instructions, and establish “a single set of evidence-based content items across devices, including prioritizing calling emergency services for suspected cardiac arrest.”