The software, which Amazon is hawking to US law enforcement agencies, "incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime," the civil rights group wrote Thursday.
Sputnik News reported on Rekognition in May, when Amazon was selling access to the technology to at least one sheriff's department for between $6 and $12 a month, on the condition that the department trumpet the software's benefits to other potential customers, including body camera manufacturers.
According to the ACLU then, Amazon is "marketing Rekognition for government surveillance."
The technology is reportedly capable of identifying up to a hundred people in a single image. "Unlike anything else, it handles real-time video," Amazon Web Services CEO Andy Jassy said in a video published November 30, 2017. That would make it well suited to pairing with city surveillance camera systems, as Orlando, Florida, did for a time.
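For illustration only, here is a minimal sketch of what a single-image call to the service might look like, assuming Python with the boto3 SDK; the bucket and file names are hypothetical and not from the article. Rekognition's DetectFaces operation is documented to return up to 100 of the largest faces in a photo, each with a bounding box and a confidence score.

```python
import boto3

# Hypothetical example: detect faces in one image stored in S3.
# The bucket and key names are placeholders, not from the article.
rekognition = boto3.client("rekognition", region_name="us-west-2")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "crowd-photo.jpg"}},
    Attributes=["DEFAULT"],  # bounding box, landmarks, pose, quality, confidence
)

print(f"Faces detected: {len(response['FaceDetails'])}")
for face in response["FaceDetails"]:
    print(face["BoundingBox"], face["Confidence"])
```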
Later in May, Sputnik News reported that two congressional Democrats, Keith Ellison and Emanuel Cleaver, had penned a letter to Amazon CEO Jeff Bezos demanding, among other things, a list of police departments using the tech.
"A series of studies have shown that face recognition technology is consistently less accurate in identifying the faces of African-Americans and women as compared to Caucasians and men. The disproportionally high arrest rates for members of the black community make the use of facial recognition technology by law enforcement problematic, because it could serve to reinforce this trend," the letter says.
Now it seems the ACLU is trying to drum up more opposition to the program in Congress. The group's findings also validated the fears of Cleaver and Ellison. In the ACLU's test, Rekognition turned up false matches for six members of the Congressional Black Caucus. Forty percent of the total false matches from the US Congress were people of color, even though people of color make up only 20 percent of Congress.
The ACLU notes that its use of Rekognition for the test cost just $12.33, "less than a large pizza."
Their methodology followed Rekognition orthodoxy to a tee, although Amazon now disputes it. First, they built a database of photographs of people who have been arrested (which doesn't make them de facto criminals, as mugshots are taken prior to any conviction at trial), using 25,000 publicly available photos. The Washington County Sheriff's Office in Oregon, which was the first reported police purchaser of Rekognition, built a database of 30,000 such photos, Sputnik News reported.
The ACLU "used the default match settings that Amazon sets for Rekognition." That, according to Amazon, is an "80 percent confidence" threshold for the software to categorize a comparison as a match.
Amazon rebutted the ACLU's findings in a lengthy statement issued to Business Insider, arguing that the civil rights group failed to follow best practices for the software. "While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals or other social media use cases, it wouldn't be appropriate for identifying individuals with a reasonable level of certainty," the company said.
The company said it recommends law enforcement customers set the threshold to 95 percent or higher, and it emphasized that Rekognition is not supposed to be used autonomously, but with a human partner.
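Under that guidance, the only change to the search in the sketch above would be an explicit threshold, again with placeholder names:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Same placeholder collection and photo as above, but with the 95 percent
# floor Amazon says it recommends for law enforcement use; candidate faces
# scoring below that similarity are simply not returned.
matches = rekognition.search_faces_by_image(
    CollectionId="arrest-photos",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "probe-photo.jpg"}},
    MaxFaces=5,
    FaceMatchThreshold=95,
)
for match in matches["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```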
Nonetheless, the ACLU is concerned about dangerous and potentially deadly scenarios the technology could create. "If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins," they wrote. The Democrats raised the same fear in their earlier letter to Bezos.
"A recent incident in San Francisco provides a disturbing illustration of that risk," the ACLU continued. "Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle."