Yesterday, the ACLU of Northern California announced that in a test it recently conducted of Amazon’s facial recognition technology, called “Rekognition,” the software incorrectly identified 28 members of Congress as people who have been arrested for a crime.
To conduct the test, the ACLU of Northern California created a database of 25,000 publicly available mugshots. It then used Amazon’s facial recognition technology to compare the database against photos of all U.S. House and Senate members, ultimately flagging 28 photos of Congress members as matches to the mugshot database.
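For readers curious about the mechanics, a test like this can be run against Amazon Rekognition's face-search API. The sketch below is a minimal illustration, not the ACLU's actual code: the collection name, the helper function, and the assumption that the 25,000 mugshots were pre-indexed into a Rekognition face collection are all hypothetical.

```python
def search_for_mugshot_matches(client, member_photo_bytes,
                               collection_id="mugshots", threshold=80):
    """Ask Rekognition whether a legislator's photo matches any indexed mugshot.

    `client` is a boto3 Rekognition client, e.g. boto3.client("rekognition").
    The mugshot database is assumed to have been loaded beforehand into the
    face collection `collection_id` (via Rekognition's IndexFaces operation).
    """
    resp = client.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": member_photo_bytes},
        FaceMatchThreshold=threshold,  # Rekognition's default confidence cutoff is 80
        MaxFaces=5,
    )
    # Any entries in FaceMatches scored at or above the threshold would have
    # been counted as a "match" in a test like the ACLU's.
    return resp["FaceMatches"]
```

Because the client is passed in as a parameter, the same logic can be exercised against a stub in testing without AWS credentials.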
The Congress members it falsely matched with the mugshot database included Republicans and Democrats, men and women, and legislators of all ages from across the country—but disproportionately people of color.
Months earlier, the Congressional Black Caucus (CBC) wrote a letter to Amazon CEO Jeff Bezos to address the “negative unintended consequences” that could result from law enforcement’s use of the face recognition technology—particularly for black people, undocumented immigrants and protesters. The CBC explained the technology was especially risky because “communities of color are more heavily and aggressively policed than white communities.”
The ACLU’s test validated the CBC’s concerns: nearly 40 percent of the false matches were people of color, even though people of color comprise only 20 percent of Congress. If police agencies deploy an error-prone, racially biased facial recognition technology, the consequences for people of color could be severe.
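A back-of-the-envelope check makes the disproportion concrete. The count of 11 people of color among the 28 false matches is an assumption inferred from the article's "nearly 40 percent" figure, not a number stated above.

```python
# Disproportionality check for the ACLU test results described in the article.
false_matches = 28          # total members of Congress falsely matched
poc_false_matches = 11      # assumed count, implied by "nearly 40 percent"
poc_share_of_congress = 0.20  # per the article

poc_share_of_false_matches = poc_false_matches / false_matches
print(round(poc_share_of_false_matches * 100, 1))  # → 39.3

# Roughly double their share of Congress:
print(round(poc_share_of_false_matches / poc_share_of_congress, 1))  # → 2.0
```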
“It could cost people their freedom or even their lives,” writes Jacob Snow, Technology & Civil Liberties Attorney at the ACLU of Northern California.
Snow identifies other potentially harmful consequences of the software: “If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a ‘match’ indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.”
After the ACLU published its findings yesterday, Reps. Jimmy Gomez and John Lewis wrote a letter to Jeff Bezos requesting a meeting to discuss the false matches and the technology’s “potential impact on communities of color.”
Even before the ACLU of Northern California published the results of its test, the organization and other civil liberties groups had already called on Amazon to stop providing its facial recognition software to the government.
Snow concludes, “This technology shouldn’t be used until the harms are fully considered and all necessary steps are taken to prevent them from harming vulnerable communities.”