Trump’s state AI-law order sparks clash between states and industry

Proponents of the president’s executive order preempting state AI laws argue it’s designed only to roll back the most stringent laws. Others say the order sets a dangerous precedent of undermining states’ rights.

Following President Donald Trump’s executive order last week challenging state laws that regulate the private sector’s use of artificial intelligence, public and private sector leaders are clashing over whether the order infringes on states’ rights and whether it actually alleviates burdens on businesses.

Some of the disagreements can be attributed to the order’s vagueness around which laws it preempts and its generalizations about the purported effects of state AI laws on business operations and interstate commerce. Proponents of the executive order say that rolling back some of the more stringent state AI laws, such as those on the books in Colorado and California, will lessen the onus on businesses, particularly small businesses that want to use AI but are worried about complying with a patchwork of laws across many states.

Critics say the order’s ambiguity leaves much to be desired as a federal stance on AI, and that its direct confrontation with state sovereignty will open the door to legal challenges. Even more acutely, they claim that if state laws are rendered ineffective while the federal government delays acting, people will be harmed by these technologies.

What’s covered

Trump’s executive order features carveouts for certain laws and policies that govern states’ own internal uses of AI, as well as for child safety protections and data center and AI infrastructure. The order instead focuses on preventing states from enforcing AI laws that “burden, restrict, or interfere with” commerce between states or with his administration’s yet-to-be-written national AI framework.

This could mean the administration will evaluate state laws that place requirements on AI developers, vendors or deployers; those covering consumer protection, transparency, safety, civil rights and other areas of regulation that reach beyond state governments’ own operations; and those with enforcement mechanisms that affect companies operating across state lines.

Jake Parker, senior director of government relations for the Security Industry Association, said that confusion about exactly which laws or regulations the order covers has created a misconception that state authority is being wiped out. Instead, he argued, the order is focused on broad laws, such as Colorado’s comprehensive AI law, that require businesses to meet varying compliance requirements.

“It makes it clear that this is focused on avoiding broad regulation, not legislation addressing very specific issues with AI use, like the Elvis Act in Tennessee,” Parker said, referencing a law designed to bolster musicians’ legal ownership of their works. “There was some confusion earlier in the summer when the reconciliation bill was being considered about whether this preemptive legislation would affect that. And it obviously hasn’t been definitively decided, but I don’t think so. That’s not the kind of thing that they’re worried about.”

The preemption debate in Congress has also been mischaracterized, he said, noting that the 99-1 vote against the AI law preemption language included in the budget bill did not demonstrate strong opposition to preemption. Parker claimed that because lawmakers expected the overall measure to fail anyway, the vote is not a good measure of support for preemption over the “50-state hodgepodge.”

State pushback

While the true level of support for the measure at the federal level might be unclear, state legislators have not been shy in vocalizing their critiques. Several Illinois lawmakers have said they intend to continue regulating AI despite Trump’s order. And on Monday, several California lawmakers submitted a letter urging Congress to protect state authority over AI regulation.

Though state governments’ internal uses of AI are carved out of the order’s scope, some state laws could be affected if they extend to local governments, schools or other public entities. Laws interpreted as regulating AI vendors beyond the terms of their contracts with state and local government agencies might also be affected.

In June, a coalition of groups representing state technology leaders — the National Association of State Chief Information Officers, the National League of Cities, the Council of State Governments, the National Association of Counties, the United States Conference of Mayors and the International City/County Management Association — authored a joint letter to congressional leaders opposing the measure when it was floated over the summer as a 10-year moratorium.

“By restricting state and local governments from enacting any policy or law regarding AI and GenAI for the next 10 years, Congress would be enacting significant overreach into state and local authority, while threatening the ability of state and local governments to responsibly regulate emerging technologies in ways that best serve their communities,” the letter read.

At the end of November, a bipartisan group of 36 state attorneys general sent a letter to Congress opposing AI preemption. While noting the benefits of AI, the attorneys general said the risks it poses, such as scams, deepfakes and harmful interactions — especially for vulnerable groups like children and seniors — make state and local legal protections essential.

Ron De Jesus, a field chief privacy officer at the software company Transcend, said those AI risks and states’ rights concerns make state lawsuits against the order imminent.

“Colorado’s Attorney General already committed to challenging the order in court, and California officials have voiced their intent to challenge too,” De Jesus said. “I’m worried about the precedents this sets outside of AI as well. If this becomes the playbook for dismantling other state laws, that would impact data privacy regulations, another area where we do not have a federal standard in place. If threatening federal funding becomes a viable tactic to undermine state authority on tech regulation, that’s a much bigger problem than just AI governance.”

‘Trust in AI’

Cody Venzke, senior policy counsel with the American Civil Liberties Union, said it’s common sense that state and local protections should not be removed until a federal backup is in place.

“It’s fundamental that AI will not improve people’s lives if we cannot believe that it is safe, trustworthy and nondiscriminatory,” Venzke said. “And the state laws that we are seeing be proposed and emerge are focused on that very thing, whether it be ensuring that AI used for employment purposes really does pick the best person for the job or that AI to decide who gets a mortgage isn’t picking up on hallucinatory outcomes.

“And so all these state laws are dedicated to making sure that people can trust in AI, and right now, what the administration is doing is attacking states for taking those basic efforts to protect their citizens with no federal proposal in sight.”

But the costs state AI laws impose on small businesses, said Michael Richards, executive director of policy at the U.S. Chamber of Commerce’s Chamber Technology Engagement Center, will cut into profits and undermine the nation’s innovation and leadership. Richards noted that the order does direct the creation of a national framework, and added that the small- and medium-sized businesses his center represents want to help develop it alongside governments.

“There is a full understanding here that Congress needs to work and develop something here,” he said. “The Chamber has been stating this for some time, that this needs to be an ongoing effort.”

But, Venzke said, it’s common for states to innovate on regulations around civil rights, privacy and technology before a federal proposal is on the books. This well-established practice has a number of precedents, he said.

“States were the first to establish health privacy laws, and the federal government then followed suit when it saw the need and saw states working in this area,” Venzke said. “Notably, [the Health Insurance Portability and Accountability Act of 1996] does not preempt state laws. It sets a floor, not a ceiling, and we’ve similarly seen this [with] the Communications Act and other technology-related laws. All allow states to have a place to govern, and it’s seen as a cooperative effort between the states and the federal government, not one that preempts states entirely from the field.”
