The California State Assembly Privacy and Consumer Protection Committee on Tuesday passed a bill regulating law enforcement’s use of facial recognition technology — but not before considering its flaws and hearing opposition from more than 50 civil rights and social justice groups.
The bill, AB 642, would authorize law enforcement to use facial recognition technology in some instances, and would permit only the use of programs that have been evaluated under the National Institute of Standards and Technology’s Face Recognition Vendor Test program and are at least 98% accurate.
The bill was passed to the appropriations committee on a 9-0 vote with two committee members abstaining, and the chair reserved the right to call the bill back into committee.
AB 642 would also require law enforcement agencies to have their own written facial recognition policies, including a provision that certain images — including those of minors, people who weren’t charged with a crime and those acquitted or exonerated of their charges — must be deleted every six months. It would also require that all uses of facial recognition technology be documented and that agencies submit annual reports detailing their use to the California State Auditor. Those reports would then be released publicly.
But dozens of civil rights groups opposed the bill, claiming it would expand the use of a biased technology. A 2019 analysis of 189 facial recognition systems by the National Institute of Standards and Technology found that “false positive rates are highest in West and East African and East Asian people, and lowest in Eastern European individuals.” It found that female, elderly and young faces also had higher false positive rates. The study found that with systems developed in China, “this effect is reversed, with low false positive rates on East Asian faces.”
Carmen-Nicole Cox, an attorney with ACLU California Action, labeled AB 642 a “justice denier” during Tuesday’s hearing.
“AB 642 will increase invasive government surveillance powered by vast biometric databases created without our consent, that inevitably will be used against the most marginalized,” Cox said.
Cox said that more than 50 other civil rights groups also opposed the bill, and that many would rather see an outright ban on the technology.
Cox was joined on Tuesday by Robert Williams, a Michigan man who in 2020 was arrested in front of his daughter and wife and charged with a felony after being incorrectly identified by a facial recognition system.
“There are some lasting effects of this ordeal that has had adverse effects on me and my family, especially my daughter Julia, who was five at the time of the incident and still gets visibly upset and emotional when we discuss the arrest or when we rewatch the videos or news coverage of the event that took place,” Williams said. “I would like to end my time by saying I think the only answer is to not use this tech. This way you will not leave people fighting cases in court and trying to right the wrongs like I’m attempting to do.”
Assembly Member Phil Ting, the bill’s author, said during Tuesday’s hearing that AB 642 is not an endorsement of the technology’s use by law enforcement.
“What this bill does is simply provide guidelines, guardrails to ensure that facial recognition software, if it’s to be used, will be used in certain circumstances and will not be used in many circumstances,” Ting said.
Ting also authored a 2019 bill that banned facial recognition’s use on footage gathered by police body-worn cameras. It included a three-year moratorium that expired Jan. 1. Ting admitted AB 642 isn’t perfect, but said the recent sunsetting of that moratorium means there are currently no regulations on facial recognition use.
“Right now, I am concerned that if we don’t pass this bill, and this bill is not signed into law, then we allow a complete free-for-all,” he said. “It becomes something that each law enforcement agency across the state will be using without any guidelines, any information, any advice and it will be completely unfettered.”
Another bill introduced this legislative session would establish a statewide moratorium on the use of facial recognition software that would last through 2033.
Cox testified in a hearing earlier this month that AB 642 was “not better than nothing.” Ting noted that his bill would not prevent local governments from creating their own rules or bans. San Francisco and others have already banned the technology.
‘Why I’m doing this’
Ting said during Tuesday’s hearing that he’s open to discussing the ACLU’s concerns, and that while the bill was in the Public Safety Committee, letters of concern from the opposition were used to amend it with protections around immigration, reproductive health care and gender-affirming health care.
“I absolutely don’t want what happened in Mr. Williams’ case happening anywhere, and that’s why I’m doing this bill,” Ting said, adding that none of the concerned groups have offered any written amendments to modify the bill.
Most committee members on Tuesday said there are problems with the bill, but that the current lack of regulation on facial recognition takes precedence.
“You know, I have tremendous heartburn with the bill as it currently is,” said Assembly Member Jesse Gabriel, the committee’s chair. “But I also have tremendous heartburn with the status quo: as of January of this year, we no longer have a ban. There is [concern of] the Wild West, that people can do whatever they want with this technology. And that’s something, frankly, that terrifies me.”