At this week's I/O conference, Google introduced an innovative AI tool designed to assist users in identifying skin, hair, and nail issues. A pilot launch is planned later this year.
Google I/O, the company's annual spring developer conference typically held at the Shoreline Amphitheatre in Mountain View, went virtual this year due to the pandemic. CEO Sundar Pichai and his team showcased key innovations, including a web-based AI tool that helps users identify skin, hair, and nail conditions.
Here's how it works: Users simply take three photos of the affected area using their smartphone camera—for instance, a rash on the forearms. The tool then asks targeted questions about skin type and other symptoms.
Trained to recognize 288 skin conditions, the AI suggests possible diagnoses, factoring in age, skin type, gender, and race for accuracy.
In tests with nearly 1,000 images from diverse patients, Google's tool correctly identified the condition in one of three top suggestions 84% of the time. The team is collaborating with Stanford University researchers to validate its performance in clinical settings.
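The 84% figure describes what machine-learning practitioners call top-k accuracy: the share of cases where the correct answer appears anywhere in the model's k highest-ranked suggestions (here, k = 3). As an illustrative sketch only (not Google's code, and using made-up condition names), the metric can be computed like this:

```python
def top_k_accuracy(predictions, labels, k=3):
    """Fraction of cases where the true condition appears
    among the model's k highest-ranked suggestions."""
    hits = sum(
        label in ranked[:k]
        for ranked, label in zip(predictions, labels)
    )
    return hits / len(labels)

# Toy example with hypothetical conditions and ranked model output:
preds = [
    ["eczema", "psoriasis", "contact dermatitis"],  # true label ranked 2nd -> hit
    ["acne", "rosacea", "folliculitis"],            # true label absent -> miss
]
labels = ["psoriasis", "melanoma"]
print(top_k_accuracy(preds, labels))  # 0.5
```

Note that a high top-3 score is a weaker claim than top-1 accuracy: the tool offers a shortlist of possibilities rather than a single definitive answer, which fits its stated role as an information aid rather than a diagnostic device.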
Google targeted skin conditions due to high search volumes for these issues on its platform.
“We see around 10 billion queries annually about skin, nails, or hair conditions,” says Karen DeSalvo, Chief Health Officer at Google Health, in an interview with The Verge. “Two billion people worldwide face dermatological issues, yet specialists are in short supply. While many turn to Google first, describing skin problems in words is challenging.”
This tool won't replace dermatologists—a doctor's evaluation remains essential for accurate diagnosis. “Our goal is to provide reliable information to empower better-informed decisions about next steps,” DeSalvo adds.
The tool has been CE-marked in the European Union as a Class I (low-risk) medical device but has not yet been reviewed by the U.S. FDA.