
Federal government outpacing state, local agencies on AI adoption, survey finds

A new survey by Ernst & Young shows that state and local government agencies are lagging behind the federal government on AI adoption.

State and local government agencies lag behind their federal counterparts in the use, training, policy and adoption of artificial intelligence tools, according to a report published Wednesday by the professional services firm Ernst & Young.

The report shows that the top barriers to adopting AI in state and local agencies are unclear governance and ethical frameworks, the lack of technology infrastructure and funding, and AI applications not aligning with current agency needs.

The survey found that 59% of total respondents reported having access to AI tools provided by their agencies. Forty-eight percent of state and local agencies use AI tools daily, compared with 64% of federal agencies.

Amy Jones, a public sector AI lead at EY, pointed out that federal agencies, such as the Department of Energy and Department of Defense, are responsible for an entire country, which creates a sense of urgency to more quickly adopt emerging technologies.


“State and local generally have concerns over data privacy and the use of AI systems that are potentially biased in the way that they create outcomes in the community, so we’re seeing a lot more hesitation in the state and local area,” Jones said in an interview with StateScoop. “I think that they’re feeling, from a regulatory perspective, maybe one or two steps behind those mission-focused areas on the federal side.”

The survey found that the primary obstacle to AI’s implementation across state and local agencies is the absence of clear governance and ethical guidelines. Twenty-two percent of state and local respondents said their agencies don’t have AI-use policies, compared to just 7% of federal agencies.

Jones said federal, state and local agencies face the same obstacles when using AI, like the risk of bias or inequitable outcomes.

“All of these are weighted models. They are applications of the algorithms behind them and their weights. Weights are inherently biased, whether it be because of the training data or because of the way that the model was built,” she said. “So the goal is not to make an unbiased model. It’s to make a model where the outcomes are equitable and drive better access.”

Jones said each agency defines bias differently based on its focus, a sticking point federal agencies can push past using national technology labs and federal funding, but one that often stalls state and local agencies, which may not have access to those resources.


“For instance, a department focused on health and human services might have a very different definition of bias than a department of justice,” Jones continued, “because their data is different, because their models are different, but they’re all going to be bound, I think, by some baseline ethical standard.”

Many states have established AI task forces, coalitions of state officials, technology experts, academic researchers and community leaders that research how AI can assist or harm government operations. Many others have created new offices, hired officials and adopted policies centered on AI.

The federal government is doing similar work, including last October when President Joe Biden issued an executive order that outlined eight guiding principles and priorities for AI use. 

Jones said, however, that frameworks for ethical AI use are only beneficial if public sector employees have the skills and knowledge to understand them.

The EY survey found that state and local employees are less likely than their federal counterparts to understand their agencies’ AI policies, because their agencies lag behind on training and workforce development. Only 48% of state and local agency respondents said they value AI experience in job candidates, compared with 70% in the federal government. State and local agencies are also behind on AI training: 39% said they have never offered AI training, compared with 28% of respondents overall.


Jones said state and local governments must establish AI governance and ethical standards and train their employees to speed up AI adoption.

“My recommendation would be to define your why and have a clear and concise set of definitions for the types of technology, capabilities, responsibility and governance that you intend, and then prioritize your investments,” Jones said.
