
Only California state employees can use Poppy, a new AI assistant designed for security

Poppies bloom in Chino Hills State Park, foothills of the Santa Ana Mountains in Southern California, on April 8, 2023. (Allen J. Schaben / Los Angeles Times via Getty Images)

California technology officials said a new AI-powered digital assistant, called Poppy, is helping employees speed through bureaucratic hurdles within a secure ecosystem that keeps government data on official networks.

Poppy, named after the state’s official flower, is a generative artificial intelligence tool designed by the state for use only by state workers. It operates on a secure state-government network and pulls information only from official CA.gov websites, so sensitive data is not compromised.
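The article does not describe how Poppy enforces that restriction, but a domain allow-list is one common way a retrieval-based assistant can limit itself to approved sources. The sketch below is a hypothetical illustration only, not the state's code; the function names and the CA.gov suffix check are assumptions.

```python
# Hypothetical sketch of a domain allow-list for a retrieval-based assistant.
# This is NOT Poppy's actual implementation; the article only says the tool
# pulls information from official CA.gov websites.

from urllib.parse import urlparse

ALLOWED_SUFFIX = ".ca.gov"  # assumed allow-list: official state domains only


def is_allowed_source(url: str) -> bool:
    """Return True only for official CA.gov hosts (e.g. cdt.ca.gov)."""
    host = urlparse(url).hostname or ""
    return host == ALLOWED_SUFFIX.lstrip(".") or host.endswith(ALLOWED_SUFFIX)


def filter_sources(candidate_urls: list[str]) -> list[str]:
    """Drop any retrieved document whose URL falls outside the allow-list."""
    return [u for u in candidate_urls if is_allowed_source(u)]


if __name__ == "__main__":
    urls = [
        "https://cdt.ca.gov/some-policy",  # kept: official state site
        "https://example.com/blog-post",   # dropped: outside CA.gov
    ]
    print(filter_sources(urls))  # -> ['https://cdt.ca.gov/some-policy']
```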

Poppy launched in September as a pilot project, aiming to improve data-sharing and collaboration across agencies.

Johnathan Porat, chief technology officer at the California Department of Technology, said Poppy has more than 2,600 users across 66 departments. Under the pilot, Poppy is offered at no cost to participating departments, but is limited to 100 users per agency.


Porat said the pilot has received positive reviews from users, who say it enhances their efficiency and confidence. Legal teams use it for policy analysis, human resources officials use it for succession planning, and others use it to fill out state forms.

“So the guiding principle behind Poppy has been: for state workers, by state workers, built by us here at the state. Everything is built around the idea that a state worker is using it,” Porat said in an interview. “It’s great that not only can we have an AI tool that’s this accessible, but it’s an AI tool that’s secure and really grounded and built off of state data.”

Porat said Poppy launched in response to Gov. Gavin Newsom’s 2023 executive order, which tasked state agencies with creating risk-assessment reports for how generative AI could affect their work, California’s economy and the state’s energy usage.

Unlike public AI services, Poppy operates entirely on California’s internal infrastructure, meaning queries, documents and responses never leave the state’s trusted environment, officials said.

Shera Mui, the technology bureau’s deputy director of platform services, said that closed architecture ensures sensitive policy details, internal procedures and compliance data stay within government control. As a result, Mui said, Poppy is helping bridge the gap between siloed data and collaborative government work, allowing departments to operate more efficiently while maintaining tight control over how information is accessed and used.


“It’s really building those knowledge repositories and how that information is dispersed across different areas within a program or multiple programs. No matter if you’re an analyst or manager, it gives you information in plain language,” Mui said in an interview. “So I think that’s been really helpful, and it’s helped bridge also some of the gap between those levels, which I’ve seen firsthand.”

Mui said the tool flags any data containing personally identifiable information; Poppy may refuse to complete the task or may redact the sensitive information.

“If you put in a prompt and you have a Social Security number, it’ll say, You can’t use this data. Or if you ask a personal question related to an individual, it will say, I’m sorry, I cannot answer that question,” Mui explained. “We also want [Poppy] to be fair and equitable. We don’t want it to make assumptions about people. … It’s out of scope of what this is.”
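The screening behavior Mui describes could be implemented many ways; the sketch below is one hypothetical approach, shown only to illustrate the idea of blocking or redacting an apparent Social Security number before a prompt reaches the model. The pattern and function names are assumptions, not the state's implementation.

```python
# Hypothetical illustration of prompt screening for an apparent SSN:
# either refuse the request or redact the number before it reaches the model.

import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # assumed format: 123-45-6789


def screen_prompt(prompt: str, redact: bool = False) -> str:
    """Refuse, or optionally redact, prompts that appear to contain an SSN."""
    if SSN_PATTERN.search(prompt):
        if redact:
            return SSN_PATTERN.sub("[REDACTED]", prompt)
        raise ValueError("You can't use this data: prompt appears to contain an SSN.")
    return prompt


# Example usage: the first call raises; the second returns a redacted prompt.
# screen_prompt("Check benefits for SSN 123-45-6789")
# screen_prompt("Check benefits for SSN 123-45-6789", redact=True)
```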

Though Porat and Mui said Poppy has so far been a success, some users have complained that the tool cannot perform functions outside the state government’s infrastructure. After the pilot ends in June, the state plans to open Poppy to more users and to provide additional training to ensure such AI tools are used responsibly.

“The biggest part that we understand that we’re going to have to really address is really focusing on user expectations and training,” Porat said. “There are a lot of people who are going to Poppy and asking questions that might be better fit for a search engine.”
