Facial recognition software tallies second wrongful arrest
A pair of wrongful arrests of Black men in Detroit is being blamed on facial recognition software’s habit of misidentifying non-white individuals, the American Civil Liberties Union said Friday. The ACLU’s statement came after it was reported that larceny charges filed in May 2019 against a 25-year-old man were dropped four months later after law enforcement authorities realized a facial recognition platform had led them to arrest the wrong person.
The man, Michael Oliver, now 26, was accused of reaching into a teacher’s car during a schoolyard brawl, snatching her mobile phone and damaging it, the Detroit Free Press reported. A facial-recognition analysis of video the phone recorded led police to Oliver, who was arrested and charged with felony larceny but promptly told authorities, “It wasn’t me.”
When Oliver’s lawyer got Wayne County prosecutors to take a closer look at the original video taken by the teacher’s phone, the Free Press reported, the error was obvious: Oliver’s arms are generously tattooed, while the suspect in the video does not appear to have any tattoos; the hairstyles and body types are also different. The charges against Oliver were dropped in September.
But the revelation of Oliver’s case again raises concerns that racial bias is baked into the algorithms that facial recognition platforms use. Last month, Detroit police arrested 42-year-old Robert Williams and held him for 30 hours on charges that he stole watches from Shinola, an upscale fashion brand based in the city, after a facial-recognition program used by the Michigan State Police mismatched footage from the store’s surveillance cameras with Williams’ driver’s license.
According to the ACLU, Williams and Oliver are the first two known cases of people being wrongfully arrested because of a facial recognition error.
“We warned Robert Williams would not be the only person to be wrongfully accused of a crime they did not commit because of a flawed technology law enforcement should not be using,” Dan Korobkin, the legal director of the ACLU of Michigan, said in a press release. “Sadly, it appears we were right and there are still likely many more people we will learn about nationwide.”
The ACLU has long opposed the use of facial recognition technology by law enforcement, and has lobbied, sometimes successfully, for prohibitions on its use in several major cities, including San Francisco and Boston. Multiple studies have shown that the technology is most accurate at identifying white men; a federal report last December found that some platforms misidentify Asian and Black people 100 times more often than white people.
The push to scale back or eliminate the use of facial recognition also took on new strength last month with the Black Lives Matter protests following the police killing of George Floyd. In early June, IBM, Microsoft and Amazon — which has aggressively marketed its Rekognition software to police departments around the country — announced they would temporarily halt sales of their platforms to police.
The Detroit Police Department uses a facial recognition platform sold by DataWorks Plus, a law enforcement tech vendor. At a public meeting shortly after Williams’ case was revealed, Detroit Police Chief James Craig admitted the system misidentifies people 96% of the time. The department also said it would stop using the technology to analyze video footage, and will now only scan still photographs during investigations of violent crimes.
But the ACLU would like to see a complete ban.
“Detroit police’s new policy is a fig leaf that provides little to no protection against a dangerous technology subjecting an untold number of people to the disasters that Robert Williams and Michael Oliver have already experienced,” Korobkin’s statement continued. “This technology is dangerous when wrong and dangerous when right.”
On Wednesday, Black lawmakers in the Michigan Legislature called for a statewide ban.