What you missed at Google’s health event

Google believes that mobile and digital-first experiences will be the future of health, and it has the statistics to back it up: namely, the millions of health questions asked in search queries and the billions of views on health-related videos on its streaming video platform, YouTube.

The tech giant has, however, had a difficult journey in its quest to turn information into useful tools and services. Google Health, the official unit the company established in 2018 to address this issue, was disbanded in 2021, but the mission continued in bits and pieces across YouTube, Fitbit, Health AI, Cloud, and other groups.

Google isn’t the first tech company to dream big when it comes to solving tough problems in healthcare. IBM, for example, is interested in using quantum computing to address topics such as optimizing drugs that target specific proteins, improving predictive models for cardiovascular risk after surgery, and cross-searching genome sequences and large databases of drug targets to find compounds that could help with conditions such as Alzheimer’s.

[Related: Google Glass is finally shattered]

At Google’s third annual health event Tuesday, dubbed “The Check Up,” company executives provided updates on a number of health projects they’ve been working on internally and with partners. From a more accurate AI clinician to additional vitals-tracking features on Fitbit and Android, here are some of the key announcements.

A demonstration of how Google’s AI can be used to guide pregnancy ultrasound. Charlotte Hu

For Google, previous research at the intersection of AI and medicine has covered areas such as breast cancer detection, skin disease diagnosis, and genomic determinants of health. Now, it is expanding its AI models to cover more applications, such as cancer treatment planning, finding colon cancer in tissue images, and identifying health conditions in ultrasound scans.

[Related: Google is launching major updates to how it serves health info]

Even more ambitiously, instead of using AI for a specific healthcare task, Google researchers are also experimenting with using a generative AI model, called Med-PaLM, to answer common medical questions. Med-PaLM is based on a large language model developed internally by Google called PaLM. In a preprint paper published earlier this year, the model scored 67.6 percent on a benchmark test containing questions from the US medical licensing exam.

At the event, Alan Karthikesalingam, a senior researcher at Google, announced that with the second iteration of the model, Med-PaLM 2, the team has increased its accuracy on medical license questions to 85.4 percent. According to clinician reviews, Med-PaLM 2’s answers are sometimes less complete than those of human doctors, but they are generally accurate, he said. “We’re still learning.”

An example of Med-PaLM evaluation. Charlotte Hu

In the realm of language models, though not the buzzy new Bard, a conversational AI called Duplex is being used to verify whether providers accept federal insurance like Medicaid, bolstering a key search feature that Google first introduced in December 2021.

[Related: This AI is no doctor, but its medical diagnoses are pretty spot on]

On the consumer hardware side, Google devices like Fitbit, Pixel, and Nest will now be able to provide users with an extensive set of metrics covering heart rate, breathing, skin temperature, sleep, stress, and more. On Fitbit, dedicated sensors do most of this work, but the cameras on Pixel phones, as well as the motion and sound detectors on Nest devices, can also capture personal information about well-being. Coming to Fitbit’s sleep profile feature is a new metric called stability, which tells users when they’re waking up during the night by analyzing their movement and heart rate.

Google also plans to make more of its health metrics available to users with compatible devices without a subscription. These include respiration, which uses the camera and non-AI algorithms to detect motion and track pixels, and heart rate, which relies on an algorithm that measures changes in skin color.
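Google hasn’t published the details of its heart-rate algorithm, but the general idea behind camera-based pulse measurement is well known: blood flow subtly changes how much light the skin reflects, so averaging the pixel intensity of each video frame yields a faint periodic signal whose peaks track the heartbeat. The sketch below illustrates that principle with simple peak counting; the function name, the synthetic signal, and the thresholding choices are all assumptions for illustration, not Google’s implementation.

```python
import numpy as np

def estimate_bpm(frame_means, fps):
    """Estimate heart rate from per-frame average skin-pixel intensities.

    frame_means: one average green-channel value per video frame
    fps: camera frame rate in frames per second
    """
    signal = np.asarray(frame_means, dtype=float)
    signal = signal - signal.mean()  # remove the constant brightness offset
    # Count peaks: a sample larger than both neighbors and above a small
    # noise threshold is treated as one heartbeat.
    threshold = 0.25 * signal.std()
    peaks = [
        i for i in range(1, len(signal) - 1)
        if signal[i] > signal[i - 1]
        and signal[i] > signal[i + 1]
        and signal[i] > threshold
    ]
    duration_s = len(signal) / fps
    return 60.0 * len(peaks) / duration_s

# Synthetic 10-second clip at 30 fps with a 72 bpm pulse (1.2 Hz).
fps = 30
t = np.arange(10 * fps) / fps
pulse = np.sin(2 * np.pi * 1.2 * t)
print(round(estimate_bpm(pulse, fps)))  # 72
```

A real signal from a fingertip pressed over the camera is far noisier than this synthetic sine wave, which is why production systems add filtering and motion rejection before counting beats.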

Users can measure their pulse by placing their finger over the rear cameras of their Pixel phones. Charlotte Hu

This type of health personalization will, the company hopes, allow users to receive feedback on long-term patterns and on events that deviate from their normal baseline. Google is also testing new features on the Pixel, such as an opt-in function to detect who coughed, in addition to counting and logging coughs (both of which are already live). Although still in the research stage, the company’s engineers say this feature can capture the tone and timbre of a cough as a vocal fingerprint that distinguishes different people.
