Newsom vetoes AI safety bill aimed at companion chatbots

California Gov. Gavin Newsom on Monday vetoed a bill that would have tightly restricted minors’ access to companion chatbots, artificial intelligence-powered systems designed to simulate human-like relationships and provide emotional support. The governor said the legislation went too far and could have effectively banned most chatbot use for young people.
Introduced by Assemblymember Rebecca Bauer-Kahan, the bill, AB 1064, would have required companies running companion chatbots to avoid addictive tricks and unpredictable rewards. It would also have required them to remind users at the start of each interaction, and every three hours after that, that they’re talking to a machine, not a person.
“While I strongly support the author’s goal of establishing necessary safeguards for the safe use of AI by minors, AB 1064 imposes such broad restrictions on the use of conversational AI tools that it may unintentionally lead to a total ban on the use of these products by minors,” the governor wrote in a message to state lawmakers.
The legislation was inspired by recent news stories about teens who died by suicide after forming unhealthy relationships with companion chatbots. One was Sewell Setzer III, a 14-year-old Florida boy who took his own life last year. His mother, Megan Garcia, said Setzer used a chatbot created on a platform called Character.AI and had told the bot that he was considering suicide.
A March study by the MIT Media Lab examining the relationship between AI chatbots and loneliness found that higher daily usage correlated with increased loneliness, dependence and “problematic” use, the researchers’ term for addictive patterns of chatbot use. The study also found that companion chatbots can be more addictive than social media because of their ability to figure out what users want to hear and feed it back to them.
“We’re sorely disappointed that comprehensive protections for California’s children remain incomplete. As children move from social media to AI, we must ensure AI is safe for our kids and not a suicide coach that can kill them,” Assemblymember Rebecca Bauer-Kahan, the bill’s author, said in a statement. “It is incomprehensible that we do not hold Big Tech responsible for the devastating harms their platforms have caused.”
Some proponents of the bill see the veto as a win for the tech lobby, which campaigned fiercely against the legislation, arguing it would stifle innovation. In July, an executive at TechNet, a statewide network of technology CEOs, drafted an open letter opposing the bill, claiming its definition of a companion chatbot was too broad and that its annual reporting requirements would be too costly.
“1064 had the potential to save children’s lives, and we’re deeply disappointed that he vetoed it,” Danny Weiss, chief advocacy officer at Common Sense Media, said in an interview. “We understand that he was under tremendous pressure from the tech lobby, and we understand that he had a lot of different tech bills to evaluate, but we’re disappointed that he didn’t sign this one.”
Newsom signed another bill into law this week aimed at making artificial intelligence safer and more transparent for users, especially minors.
The legislation, SB 243, requires AI chatbots to clearly disclose that they’re not human and to follow safety rules when interacting with people who show signs of mental distress or suicidal thoughts. In addition, starting in 2026, chatbot operators must report how they handle these sensitive situations.
The law is part of a broader package of measures meant to strengthen online protections for children, including stricter controls on deepfakes, new age-verification requirements for apps and clearer labeling around social media risks. But Weiss said it doesn’t go far enough.
“[Under AB 1064], if the AI companion promoted self-harm, suicidal ideation, disordered eating, illegal activities, if it promoted those things, the company could not distribute those products to kids,” Weiss said. “It seems extremely reasonable, and the bill that he did sign doesn’t prevent that from happening.”