King County, Washington, bans facial recognition
King County, Washington, on Tuesday became the first county in the country to ban government use of facial recognition, following similar bans in San Francisco; Pittsburgh; Boston; Portland, Oregon; and other large cities.
The ban, which the nine-member King County Metropolitan Council passed unanimously, will prevent all county agencies, including the sheriff’s office, from accessing or using facial recognition technology, though no county agency is currently using it, according to Sgt. Tim Meyer of the King County Sheriff’s Office.
The ban does not affect the use of facial recognition by agencies in Seattle, King County’s largest city, and private citizens can still use the technology, the bill reads. People can, however, sue the county if they learn that facial recognition was used to identify them.
“The use of this technology is invasive, intrusive, racially biased and full of risks to fundamental civil liberties,” King County Councilmember Dave Upthegrove said in a press release.
The ban was supported by advocacy organizations, like the American Civil Liberties Union, which has helped facilitate similar bans across the country, often with similarly worded legislation that cites the privacy and racial bias concerns associated with facial recognition. The National Institute of Standards and Technology published research in 2019 confirming that the algorithms available at that time were more accurate for white males than for people of other races or genders, though defenders of facial-recognition technology say the technology has improved since then and that accusations of bias no longer hold.
“It is polarized. The trend that we’re seeing here isn’t large numbers of local governments adopting this, but a certain blue counties, blue cities type adopting it, and I think that reflects the polarization,” said Daniel Castro, the vice president of the Information Technology and Innovation Foundation, a tech-industry think tank supported by facial-recognition vendors like Amazon. “Partially because of the fact that the rationale that they’re using for the ban — which is that the technology is fundamentally flawed — simply isn’t true.”
Castro said ITIF disputed the conclusions that advocacy groups drew from the NIST research because the testing included nearly 200 different algorithms of widely varying quality, while the top-performing platforms were quite accurate.
Because there’s “a stark difference between the best and worst algorithms,” Castro said, outright bans on all forms of the technology (King County’s ban makes an exception only for compliance with the National Child Search Assistance Act) are unnecessary.
“The best-performing systems don’t have bias, so that argument simply doesn’t hold water anymore,” Castro said.