
mahatmakanejeeves

(61,437 posts)
Fri Jan 25, 2019, 01:04 PM Jan 2019

Amazon facial-identification software used by police falls short on tests for accuracy and bias,...


Amazon's facial-recognition software, marketed to law enforcement as a powerful crime-fighting tool, struggles to pass basic tests of accuracy, raising concerns about how biased results could tarnish AI's exploding use by police & surveillance. By @drewharwell



Technology
Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

By Drew Harwell January 25 at 11:01 AM
Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person’s gender, new research released Thursday says.

Researchers with M.I.T. Media Lab also said Amazon’s Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology’s use by police and in public venues, including airports and schools.

Amazon’s system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.

The problem, AI researchers and engineers say, is that the vast sets of images the systems have been trained on skew heavily toward white men. The research shows, however, that some systems have rapidly grown more accurate over the past year, following greater scrutiny and corporate investment in improving the results.
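The disparity the researchers describe is measured by scoring a classifier separately for each demographic group and comparing the per-group error rates. A minimal sketch of that bookkeeping, using entirely made-up example data (none of it from MIT's actual benchmark or Amazon's system):

```python
# Illustrative only: per-group error-rate audit of a gender classifier.
# The records below are hypothetical, chosen to mirror the disparity
# described in the article (near-0% vs. roughly 30% error).
from collections import defaultdict

def per_group_error_rates(records):
    """records: iterable of (group, true_label, predicted_label) tuples.
    Returns a dict mapping each group to its misclassification rate."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if pred != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit data: 100 correct predictions for one group,
# 70 correct and 30 incorrect for the other.
records = (
    [("lighter-skinned men", "male", "male")] * 100
    + [("darker-skinned women", "female", "female")] * 70
    + [("darker-skinned women", "female", "male")] * 30
)
rates = per_group_error_rates(records)
```

Comparing the resulting rates group by group, rather than reporting one overall accuracy number, is what surfaces the kind of bias the study found.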
....

Drew Harwell is a national technology reporter for The Washington Post specializing in artificial intelligence. He previously covered national business and the Trump companies. Follow https://twitter.com/drewharwell