Google’s AI doctor appears to be getting better

It's all part of the company's grand mission to make personalized health info more accessible.
Dr. Alan Karthikesalingam presenting at the Google health event. Google / YouTube

Google believes that mobile and digital-first experiences will be the future of health, and it has the stats to back that up: millions of health questions asked through its search engine, and billions of views on health-related videos on YouTube, its video streaming platform.

The tech giant has nonetheless had a bumpy journey in its efforts to turn that information into useful tools and services. Google Health, the official unit the company formed in 2018 to tackle the problem, dissolved in 2021. Still, the mission lived on in pieces across YouTube, Fitbit, Health AI, Cloud, and other teams.

Google is not the first tech company to dream big about solving difficult problems in healthcare. IBM, for example, is interested in using quantum computing to tackle challenges like optimizing drugs that target specific proteins, improving predictive models for cardiovascular risk after surgery, and cross-searching genome sequences and large drug-target databases to find compounds that could help with conditions like Alzheimer's.

[Related: Google Glass is finally shattered]

At Google's third annual health event on Tuesday, called "The Check Up," company executives provided updates on a range of health projects they have been working on internally and with partners. From a more accurate AI clinician to added vitals features on Fitbit and Android, here are some of the key announcements.

A demo of how Google’s AI can be used to guide pregnancy ultrasound. Charlotte Hu

For Google, previous research at the intersection of AI and medicine has covered areas such as breast cancer detection, skin condition diagnoses, and the genomic determinants of health. Now, it's expanding its AI models to include more applications, such as cancer treatment planning, finding colon cancer in images of tissue, and identifying health conditions on ultrasound.

[Related: Google is launching major updates to how it serves health info]

Even more ambitiously, instead of using AI for one specific healthcare task, researchers at Google have also been experimenting with a generative AI model called Med-PaLM to answer commonly asked medical questions. Med-PaLM is based on PaLM, a large language model Google developed in-house. In a preprint paper published earlier this year, the model scored 67.6 percent on a benchmark test containing questions from the US Medical Licensing Exam.

At the event, Alan Karthikesalingam, a senior research scientist at Google, announced that with the second iteration of the model, Med-PaLM 2, the team has raised its accuracy on medical licensing questions to 85.4 percent. According to clinician reviews, Med-PaLM's answers are sometimes less comprehensive than those of human physicians, but they are generally accurate, he said. "We're still learning."

An example of Med-PaLM’s evaluation. Charlotte Hu

Staying with language models: although it's not the buzzy new Bard, a conversational AI called Duplex is being used to verify whether providers accept federal insurance like Medicaid, bolstering a key search feature Google first unveiled in December 2021.

[Related: This AI is no doctor, but its medical diagnoses are pretty spot on]

On the consumer hardware side, Google devices like Fitbit, Pixel, and Nest will now be able to provide users with an extended set of metrics covering heart rate, breathing, skin temperature, sleep, stress, and more. On Fitbit, the sensors are more obvious, but the cameras on Pixel phones, as well as the motion and sound detectors on Nest devices, can also yield personal insights on well-being. Coming to Fitbit's sleep profile feature is a new metric called stability, which tells users when they're waking up in the night by analyzing their movement and heart rate. Google also plans to make more of its health metrics available without a subscription to users with compatible devices, including respiration, which uses a camera and non-AI algorithms to detect movement and track pixels, and heart rate, which relies on an algorithm that measures subtle changes in skin color.
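The heart rate trick is a form of remote photoplethysmography: blood pulsing through the skin subtly shifts its color, and a camera pointed at a fingertip or face can pick up that rhythm. Below is a minimal Python sketch of the general idea; the function, sampling rate, and frequency band are illustrative assumptions, not Google's actual algorithm.

```python
# A minimal sketch of camera-based pulse estimation (remote
# photoplethysmography). Illustrative only; not Google's implementation.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames: np.ndarray, fps: float = 30.0) -> float:
    """frames: (n_frames, height, width, 3) RGB video of a fingertip or face."""
    # Average the green channel per frame; blood volume changes modulate it most.
    signal = frames[..., 1].mean(axis=(1, 2))
    signal = signal - signal.mean()
    # Band-pass to plausible pulse rates (0.7-4 Hz, i.e. roughly 42-240 bpm).
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)
    # The dominant frequency of the filtered signal is the pulse estimate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return freqs[spectrum.argmax()] * 60.0  # beats per minute
```

Production systems layer motion compensation, skin-region detection, and signal-quality checks on top of a basic pipeline like this.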

Users can take their pulse by placing their fingertip over the back cameras of their Pixel phones. Charlotte Hu

This kind of personalization around health will hopefully let users get feedback on long-term patterns and on events that deviate from their normal baseline. Google is testing new features, too, like an opt-in function on Pixel for identifying who coughed, in addition to counting and recording coughs (both of which are already live). Although it's still in the research phase, engineers at the company say the feature can register the tone and timbre of a cough as a vocal fingerprint for different individuals, roughly as in the sketch below.
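As a toy illustration of that fingerprinting idea, the sketch below reduces each cough clip to a compact spectral embedding and matches a new cough against enrolled embeddings by cosine similarity. The embedding scheme, names, and threshold are assumptions for illustration; Google hasn't published the details of its research model.

```python
# A toy sketch of matching a cough to an enrolled "vocal fingerprint"
# via spectral embeddings. Illustrative only; not Google's system.
import numpy as np

def embed(audio: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Summarize a cough clip as unit-normalized log energy in coarse frequency bands."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    bands = np.array_split(spectrum, n_bands)
    emb = np.log1p(np.array([b.mean() for b in bands]))
    return emb / np.linalg.norm(emb)

def identify(cough: np.ndarray, enrolled: dict[str, np.ndarray], threshold: float = 0.9):
    """Return the enrolled person whose fingerprint best matches, if close enough."""
    emb = embed(cough)
    # Both embeddings are unit-normalized, so the dot product is cosine similarity.
    scores = {name: float(emb @ ref) for name, ref in enrolled.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```

Here, `enrolled` would map each household member's name to an embedding produced by `embed()` from a few sample coughs; anything below the similarity threshold stays unattributed.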

Watch the full keynote below:
