AI-powered police body cameras are renewing privacy and bias concerns

The growing use of police body cameras that use artificial intelligence is raising alarms about privacy violations, racial bias and a lack of oversight, according to a report published Tuesday by the R Street Institute, a Washington think tank.
Body-worn cameras, initially introduced in the 2010s to improve transparency and accountability during interactions with the public, are now standard equipment in many police departments across the country. But R Street’s report suggests that the addition of AI has created new risks, such as misidentifying people and collecting sensitive data without consent.
“The line between public security and state surveillance lies not in technology, but in the policies that govern it,” the report warns, citing the increasing integration of facial recognition and real-time video analytics into law enforcement tools.
To combat these risks, the report recommends stricter state regulations, including requiring warrants for facial recognition searches, establishing higher accuracy thresholds, limiting data retention and mandating regular audits to identify racial or systemic bias.
Logan Seacrest, one of the report’s authors, also emphasized the importance of keeping humans “in the loop” when it comes to oversight.
“Not letting the AI kind of make any final decisions by itself, before running by a team of law enforcement professionals, attorneys, software engineers, whoever needs to be there providing supervision [or] final decisions about flagging people for arrest, flagging officers,” Seacrest said. “That stuff really should remain in human hands.”
The report shows that privacy advocates are growing increasingly concerned about how footage is captured, used and stored. Body-worn cameras don’t just record crimes — they often capture people in distress, experiencing medical emergencies or inside their homes. Some police departments work with technology companies like Clearview AI and Palantir to help analyze footage in real time, often without clear rules or transparency guidelines.
“Predictive systems can also open the door to invasive surveillance, and without clear and enforceable policies to protect civil liberties, these tools could be abused by bad actors,” the report reads.
Police officers in New Orleans recently came under fire for using facial recognition technology across a private network of more than 200 surveillance cameras equipped with AI in violation of city policy. In response, the city proposed an ordinance that would allow police broad use of facial recognition technology.
Seacrest said the backlash in that case was “completely predictable.”
“[A] private facial recognition network that was operating outside the bounds of the law raises obvious civil liberty concerns, basically a warrantless algorithm dragnet here,” he said.
The R Street report also highlights the disproportionate impact AI mistakes can have on communities of color, citing a 2020 incident involving Robert Williams, a Black man who was wrongfully arrested in Michigan after being misidentified by a facial recognition system.
Several states have taken action to address these concerns. California passed legislation prohibiting the use of facial recognition on police body cameras, though the law expired in 2023. The Illinois legislature recently strengthened its Law Enforcement Officer-Worn Body Camera Act, mandating retention limits, prohibiting live biometric analysis and requiring officers to deactivate recordings under certain circumstances.
“There’s nothing inherently incompatible about AI and civil liberties, or AI and privacy with proper democratic oversight,” Seacrest said. “Those same tools that authoritarian regimes use to basically monitor and control the population can be applied here in the U.S., in accordance with the Constitution, benefiting all Americans. It’s really just a matter of the guardrails that we put in place for it.”
The report concludes that government oversight of AI-powered body cameras remains inconsistent, in part because few national standards regulate the use of AI in policing. Seacrest said that isn’t necessarily a bad thing.
“I think regulations are often best if they are created and actuated closest to the people that they affect,” said Seacrest. “So the use of body camera AI should really be done at the state and local level.”