Google previewed its AI-powered dermatology assist tool
at I/O, its yearly developer conference.
Integrated with Search, the app guides you through taking photos of your skin from different angles, then uses a deep learning model published in Nature Medicine
to potentially detect one of 288 skin conditions.
(See how it works in this GIF.) The tool is explicitly not intended to provide a diagnosis or to serve as a substitute for medical advice.
Although this sounds incredible in theory — internet-scale access to early-stage detection of, say, skin cancer could be an amazing global DALY booster — experts have raised some serious concerns.
A Google Ethical AI researcher, Stanford Medicine dermatologist Roxanna Daneshjou, MD/PhD,
and Vice journalist Todd Feathers
have pointed out that, although Google claims to have tested the app across all demographics,
it has not sufficiently tested it across all (Fitzpatrick) skin types:
the darkest V and VI types — where skin conditions are already misdiagnosed relatively often
— were severely underrepresented in the dataset.
The app isn’t live yet, and Google Health spokesperson Johnny Luu told Vice that the dataset has been expanded since the Nature Medicine paper was published, but this issue must be properly addressed before the app can responsibly be launched.
I’d be disappointed to see it go live without at the very least a Datasheet and a Model Card
explaining its limitations.