Monday, May 20th, 2024

GOOGLE UNVEILS NEW 10-SHADE SKIN TONE SCALE TO TEST AI FOR BIAS


Alphabet Inc.'s Google (GOOGL.O) unveiled a palette of ten skin tones on Wednesday, describing it as a step forward in building devices and apps that better serve people of color.

The company says its new Monk Skin Tone Scale replaces the Fitzpatrick Skin Type, a flawed six-shade standard that had become popular in the tech industry for checking whether smartwatch heart-rate sensors, facial-recognition AI systems, and other products show color bias.

Fitzpatrick underrepresented people with darker complexions, according to tech experts. Last year, Reuters reported that Google was working on an alternative.

The company teamed up with Ellis Monk, a Harvard University sociologist who studies colorism and has felt dehumanized by cameras that failed to detect his face.

Fitzpatrick, according to Monk, is good at capturing distinctions among lighter skin tones. But because most of the world's population has darker skin, he wanted a scale that "does a better job for the bulk of the world," he explained.

Using Photoshop and other digital art tools, Monk selected 10 tones, a workable number for the people who help train and evaluate AI systems. He and Google surveyed 3,000 people across the United States and found that a large share felt a 10-point scale fit their skin just as well as a 40-shade palette.

The Monk scale is "a good balance between being representative and being tractable," according to Tulsee Doshi, head of product for Google's responsible AI team.

Google is already using it. Beauty-related searches such as "bridal makeup looks" on Google Images can now be filtered by skin tone using Monk, and searches for "cute infants" now display images with a variety of skin tones.


The Monk scale is also being used to ensure that a wide range of people are happy with Google Photos' filter options and that the company's face-matching technology is not biased.
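Testing a system like face matching against a tone scale boils down to comparing performance across groups. The following is a minimal sketch of that idea, not Google's actual method; the records, tone labels, and sample data are hypothetical.

```python
from collections import defaultdict

# Hypothetical evaluation records: (Monk tone 1-10, whether the
# face-matching prediction was correct). Not real product data.
results = [
    (1, True), (1, True), (5, True), (5, False),
    (9, True), (9, False), (9, False), (10, True),
]

def accuracy_by_tone(records):
    """Compute per-Monk-tone accuracy from (tone, correct) pairs."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for tone, correct in records:
        totals[tone] += 1
        hits[tone] += int(correct)
    return {tone: hits[tone] / totals[tone] for tone in totals}

acc = accuracy_by_tone(results)
# A wide spread between best- and worst-served tones signals possible bias.
gap = max(acc.values()) - min(acc.values())
```

The point of the 10-shade scale in a setup like this is that the grouping key covers darker tones that a six-shade standard would lump together.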


Still, flaws could creep into products if companies don't have enough data on each of the tones, or if the people or tools used to assess others' skin are skewed by lighting differences or their own perceptions, Doshi said.
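The "enough data on each of the tones" check described above can be sketched as a simple per-tone count over a labeled dataset. This is an illustrative example with made-up labels and an arbitrary threshold, not the company's actual tooling.

```python
from collections import Counter

# Hypothetical Monk tone labels (1-10) assigned to a dataset's images.
labels = [1, 1, 2, 2, 2, 3, 5, 5, 8, 8, 8, 8, 10]

def underrepresented_tones(tone_labels, min_count=3):
    """Return Monk tones (1-10) with fewer than min_count examples,
    including tones missing from the data entirely."""
    counts = Counter(tone_labels)
    return sorted(t for t in range(1, 11) if counts[t] < min_count)

flagged = underrepresented_tones(labels)
```

Tones that come back flagged would need more examples collected before any per-tone bias measurement on them could be trusted.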

Meet the Author

Emmanuel Amoabeng Gyebi

Content writer