Future of state biometric privacy ‘up in the air’ after 2024’s AI explosion
While states made strides in data privacy regulation this year, with many refining their comprehensive privacy frameworks, efforts to regulate biometric data privacy were overshadowed by states’ focus on artificial intelligence.
Jameson Spivack, a senior policy analyst for immersive technologies at the think tank Future of Privacy Forum, told StateScoop that while state biometric privacy law saw two successes this year, in Colorado and Illinois, AI cast a long shadow over the field. State policymakers introduced nearly 700 pieces of AI legislation in 2024, and even more are expected in 2025.
This shadow looms in the absence of a federal privacy law that would protect consumer biometric data. The Federal Trade Commission in May 2023 issued a policy statement on biometric data that included broader definitions than those in state laws, signaling that the commission might take a firmer approach to protecting biometric data. But Spivack said the incoming presidential administration and AI’s influence leave biometric regulations “up in the air” for 2025.
Biometric amendments in AI’s shadow
In 2024, some states, like Colorado, worked on AI and biometric-specific data protection laws simultaneously. But Spivack said the sheer number of AI laws and policies introduced in states shows a shift away from efforts to protect biometric data, including depictions, images or recordings of an individual’s facial geometry, eyes, fingerprints, handprints, voice or genes.
“What we saw in 2024 was that policymakers largely moved away from biometrics and focused much more on AI, which is not particularly surprising because AI is everything right now, and so there has just been less attention on it than in recent years,” Spivack said.
The two successes in Colorado and Illinois came in the form of amendments to their existing privacy laws. Illinois passed amendments to the Illinois Biometric Information Privacy Act, or BIPA, which some privacy experts have hailed as the gold standard of biometric data laws. It includes a private right of action, which allows people to bring legal action against companies found violating the law. The BIPA amendments passed this year included a limit on the number of violations that businesses can be found liable for under the private right of action. The amendment largely benefited businesses, because instead of having to pay per violation, the violations will be lumped together as a single offense, Spivack said.
Colorado this year passed both the nation’s first comprehensive AI legislation targeting AI-powered discrimination and new amendments to its privacy law that add protections for biometric data.
Colorado Gov. Jared Polis last May signed a bill amending the state’s comprehensive privacy law, the Colorado Privacy Act, to include rights and obligations that Spivack said largely mirror Illinois’s privacy law. Colorado’s amendment does not feature a private right of action like BIPA’s, but it does add similar requirements for businesses that process biometric data, such as creating retention and destruction policies, prohibiting the sale of the data and mandating reasonable security processes for the data.
Looking to 2025
Despite the decreased attention on biometric protections in state legislatures this year, eight states have comprehensive privacy laws taking effect in 2025, and some include provisions on biometric privacy under their “sensitive data” definitions.
Spivack said he anticipates the definition of biometric data will expand with technological progress. This may include, he said, more use of the phrase “body-based data,” meaning data derived from a person’s characteristics regardless of whether it can be used to identify them. The AI firm Veritone last October released a new version of its Track software for law enforcement agencies that uses characteristics such as height or distinguishing clothing, none of which is typically considered biometric data, to track people or vehicles.
“But what we’re seeing is that, particularly in emerging consumer products, a lot of body data is being collected and used but not necessarily being used for identification purposes,” Spivack said.
Spivack gave the example of his work on the privacy implications of virtual reality technologies, particularly extended reality headsets. He said the devices track how the user’s hands, body and eyes move simply to make the technology work, but that data doesn’t identify the user and normally isn’t stored with identifying information.
The creation and storage of body data, Spivack said, could change the conversation around biometrics and have implications for policy discussions about how to protect it. For example, he noted, the FTC’s May 2023 policy statement on biometric data extended the definition to include any kind of body data. And Colorado has already taken steps to protect body data: In April, the state passed the nation’s first law offering privacy protections for neural data, or information related to brain or spinal activity.
But priorities at the federal level are currently unpredictable, Spivack added: President-elect Donald Trump has not yet named his pick for chair of the Federal Trade Commission. Whoever is picked to lead the FTC will have a hand in deciding how the agency enforces protections of biometric data.
“That could have real implications for how the FTC sees data practices, and how it decides what practices it decides to enforce against, if they consider certain things to implicate body biometric data, the way that they consider body biometric data to be,” Spivack said.