Google's new AI skincare tool may not work on patients with darker skin tones

The Google skin tool does not recognise all skin tones. Copyright Canva
By Tom Bateman

Google’s new tool can spot 288 different skin conditions. But some experts fear it's based on data that lacks darker skin types.


Google has unveiled a preview of a new AI-powered tool that it says could help identify problems with your skin.

The tool, announced at the firm's Google IO conference last week, uses artificial intelligence to identify possible skin conditions. It’s not, however, intended as a substitute for medical advice, the tech giant said.

To use it, patients need to upload three photos of a problem area to the tool, which Google says can recognise 288 different skin, nail, and hair conditions.

According to a paper published by Google researchers in the journal Nature Medicine last year, the tool was developed using a set of around 65,000 anonymised images and case data of diagnosed conditions, taken from a total of 16,114 individual cases.

But some experts said the images cited in that study were not representative of all skin tones, which dermatologists rank according to the Fitzpatrick skin type scale, with Type I being the palest and Type VI the darkest.

AI training images lack diversity

Dr Roxana Daneshjou, a dermatologist from Stanford University who researches the use of machine learning in medicine, told Euronews that Google's study data appeared not to include many patients with darker skin types.

"The only published data shows that they had one individual of the darkest skin type (type VI) and only 2.7 per cent of the second darkest skin type (type V)," she said.

A lack of examples of darker skin types in the tool's training could negatively affect patients of colour who use it, according to Dr Tereza Hendl, co-lead of the META project at the University of Augsburg which researches the digitalisation of healthcare.

"There is much evidence showing that persistent racial and gender inequality in society gets mirrored in skewed AI training data sets and biased data labelling, which translates into algorithmic bias and leads to negative outcomes in structurally marginalised people," Hendl told Euronews.

In a statement, a Google spokesperson said the data featured in the Nature Medicine study was out of date.

"Our work is the culmination of more than three years of ongoing development. Following our initial research in Nature Medicine and JAMA Network Open, we have continued to refine the technology that our tool is built upon, including sourcing additional datasets," he said.


“Equity has – and will continue to be – a key focus and this includes working with datasets that are inclusive of different ethnicities, skin types, and age groups, and partnering with clinicians and experts who have experience working with communities of color".

Daneshjou was open to the possibility of a more diverse dataset improving the tool's potential performance for patients of colour, although she highlighted that the currently available data do not suggest such an outcome.

"If they have additional data, which they certainly may, I would love to see it published. I would love for the application to work well in darker skin tones; I just haven't seen data that definitively shows that," she said.

A subsequent study published by Google researcher Dr Yun Liu in April acknowledged the lack of diversity in the images used.

"In terms of Fitzpatrick skin types… types I and V are underrepresented, and type VI is absent in this data set. Because disease can present differently across skin types, the further study of additional skin types is warranted," the study said.
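The kind of imbalance the study describes can be made concrete by tallying cases per Fitzpatrick type. The sketch below uses entirely hypothetical case labels (not Google's actual dataset), constructed only to mirror the rough proportions reported in the article: roughly 16,000 cases, about 2.7 per cent of Type V, and a single Type VI case.

```python
from collections import Counter

# Hypothetical case records labelled with Fitzpatrick skin types I-VI.
# Counts are illustrative only, chosen to echo the proportions the
# article reports for the published dataset (Type V ~2.7%, Type VI: 1 case).
cases = (["II"] * 8000 + ["III"] * 5000 + ["IV"] * 2500
         + ["V"] * 435 + ["I"] * 170 + ["VI"] * 1)

counts = Counter(cases)
total = sum(counts.values())

# Print the share of each skin type, exposing the skew at a glance.
for skin_type in ["I", "II", "III", "IV", "V", "VI"]:
    n = counts.get(skin_type, 0)
    print(f"Type {skin_type}: {n:5d} cases ({100 * n / total:.1f}%)")
```

Run on a real labelled dataset, an audit like this makes underrepresentation visible before a model is ever trained on the data.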

Questions over EU approval

The tool, which is not currently available to the public, has been approved for use in the EU as a Class I medical device – the lowest-risk class, which also covers other non-invasive devices such as stethoscopes.

It is not currently approved for use in the United States.


Hendl told Euronews that the approval raised concerns over the standards that were being met.

"The EU legal system prohibits discrimination also on the grounds of race and ethnicity, hence, medical tools should not discriminate against patients with brown or black skin tones," she said.

"One would think it imperative that processes of certification of medical technology should check for and rule out any racial bias.

"The Google app was also certified without a proper clinical trial and it is alarming that an untested technology gets approved to provide diagnostic information, indeed without additional consultation with a qualified health professional".

In response, Google's spokesperson stressed that the tool was not intended as a substitute for qualified medical advice, but rather as an advanced search tool, adding that internet users currently make almost 10 billion Google searches for skin, hair, and nail conditions each year.


Despite her reservations over the diversity of skin types contained within Google's published data, Daneshjou said there was still a place for tech in medicine.

"Tech companies are innovative and can help us ‘rethink’ about how we do things in medicine," she told Euronews.

"However, at the same time, we have to remember that anything we implement affects human lives. In medicine, we require clinical trials to make sure that our interventions work in the intended use setting and don't have unforeseen outcomes".
