
Can your smartphone’s mic and photos detect heart disease and diabetes? Google is working on it

Google's health AI team is exploring whether our phones' microphones and photos can pick up early signs of heart disease or diabetes.
Copyright Canva
By Natalie Huet with Reuters

Google's health AI team is exploring whether your phone's mic and photos can pick up signs of heart disease and diabetes.


Our mobile phones have already become our cameras, our television screens, our diaries. What if they also became stethoscopes and disease-screening tools that help us identify health problems at home?

Well, Google is working on it.

The Alphabet-owned company plans to test whether the heart sounds and photos of our eyes captured by smartphones can help users spot signs of illnesses before they even see a doctor.

Google is specifically investigating whether a smartphone's built-in microphone can detect heartbeats and murmurs when the phone is placed over the chest, Greg Corrado, the company's head of health artificial intelligence (AI), said on Thursday.

Such readings could aid the early detection of heart valve disorders, he said.

"It's not at the level of diagnosis but it is at the level of knowing whether there is an elevated risk," he told reporters, adding that questions remained about accuracy.

Eye screening for diabetes

The eye research focuses on using photos to detect diseases such as diabetic eye disease, which can damage blood vessels in the retina and is a major cause of blindness in adults.

Google's health unit is already using AI to help healthcare workers detect diabetic retinopathy in India and Thailand, with nearly 100,000 patients screened so far.

The tech giant said it had already reported "early promising results" using tabletop cameras in clinics, and that it would now examine whether smartphone photos might work, too.

"While this is in the early stages of research and development, our engineers and scientists envision a future where people, with the help of their doctors, can better understand and make decisions about health conditions from their own homes," Corrado wrote in a blog post.

The projects follow announcements made last year about measuring heart and breathing rates using smartphone cameras - features now available on many devices through the Google Fit app.

Google has also introduced Derm Assist, a mobile app that uses AI to analyse users’ photos of a skin, hair or nail concern alongside their answers to a few questions to provide them with a list of possible conditions. 

AI for ultrasound scans

Google also plans to test whether its AI software can analyse ultrasound scans taken by less-skilled technicians, as long as they follow a set pattern.

The technology could address shortages of higher-skilled workers, particularly in low- and middle-income countries, and allow birthing parents to be evaluated at home.

While Google has long sought to bring its technical expertise to health care, it has said little about whether the efforts are generating significant revenue or usage.

Corrado said launching such projects was "a major step" but adoption would take time.

"When you think about breathing and heart rate, whatever level of adoption we see today only scratches the surface," he told reporters.

