Bias in the Face

One size doesn't always fit all. This week at the World Economic Forum in Davos, MIT researcher Joy Buolamwini unveiled an updated study showing that Amazon's off-the-shelf Rekognition service misidentified women as men 19 percent of the time and classified darker-skinned women as men 31 percent of the time (versus Microsoft's 1.5 percent error rate on the latter group). Amazon says the tests were run on its facial analysis technology, which detects specific attributes on a face, such as a goatee or a smile, rather than its facial recognition technology, which matches faces against other images to establish identity, and that its newest update is more accurate and less biased. We'll have to wait for a fresh evaluation, but since Amazon is still sharing the technology with law enforcement and other agencies without much auditing, per the New York Times, it's a good time to consider the limits and risks of off-the-shelf AI-as-a-service offerings.
