June 28, 2021 - 10:20 a.m.
Alphabet’s Google told Reuters this week it is developing an alternative to the industry standard method for classifying skin tones, which a growing chorus of technology researchers and dermatologists says is inadequate for assessing whether products are biased against people of color.
At issue is a six-color scale known as Fitzpatrick Skin Type (FST), which dermatologists have used since the 1970s. Tech companies now rely on it to categorize people and measure whether products such as facial recognition systems or smartwatch heart-rate sensors perform equally well across skin tones.
The controversy is part of a larger reckoning over racism and diversity in the tech industry, where the workforce is more white than in sectors like finance. Ensuring technology works well for all skin colors, as well as different ages and genders, is assuming greater importance as new products, often powered by artificial intelligence (AI), extend into sensitive and regulated areas such as healthcare and law enforcement.
Companies know their products can be faulty for groups that are under-represented in research and testing data. The concern over FST is that its limited scale for darker skin could lead to technology that, for instance, works for golden brown skin but fails for espresso red tones.
Numerous types of products offer palettes far richer than FST. Crayola last year launched a set of 24 skin-tone crayons, and Mattel Inc's Barbie Fashionistas dolls this year cover nine tones.
The issue is far from academic for Google. When the company announced in February that cameras on some Android phones could measure pulse rates via a fingertip, it said readings on average would err by 1.8% regardless of whether users had light or dark skin.
“We are working on alternative, more inclusive, measures that could be useful in the development of our products, and will collaborate with scientific and medical experts, as well as groups working with communities of color,” the company said, declining to offer details on the effort.
The company later gave similar assurances that skin type would not noticeably affect the results of a feature for filtering backgrounds on Meet video conferences, nor of an upcoming web tool for identifying skin conditions, informally dubbed Derm Assist.