
Government is ‘missing the boat’ on understanding outcome data

The chief data strategist for California’s social services agency said government collects a lot of data, but it isn’t always informative.

Government agencies can’t merely log administrative data if they hope to effectively provide the public with services and benefits — they must measure program outcomes and understand how those outcomes arise, a panel of government and nonprofit leaders said during an online event Wednesday hosted by Code for America.

Natasha Nicolai, the chief data strategist for the California Department of Social Services, said government agencies are frequently “missing the boat” on efforts at continuous quality improvement because they’re not measuring the right things.

“Oftentimes what we see governments measuring is program administration, transaction and transactional data, checkboxes, and sometimes if we’re lucky, actual process or, at the most basic level, engagement, but certainly very rarely outcomes,” Nicolai said.

California has been working on its own iterative data framework since at least 2017. Nicolai said development has taken so long partly because the state is bogged down by old processes that were designed to keep people out of benefits programs. That runs counter to the mission of groups like Code for America, which intends to build a government that “reduces poverty, advances equity and truly serves the American people,” Lou Moore, the organization’s chief technology officer, said during the panel.


Understanding ‘why’

Lynn Overmann, a senior adviser for data and technology at the U.S. Department of Agriculture, said her “dream-state” for government data includes numerous measurements that aren’t well understood today.

“We need to know who our eligible recipients are,” she said. “How are we currently trying to reach those people and is that outreach effective? What are the pain points in our process? Where are people falling out in the process? We tend to create fairly lengthy processes that require a lot of proof. Who’s coming in? How far are they getting? Where are they falling out of the process and what can we learn from that so that we can improve accessibility?”

Since joining the Biden administration two months ago, Overmann said, she’s also learned of the relevance of program retention: When people drop out of programs, it’s not clear whether the cause was an improvement in life circumstances that made the benefit unnecessary, or reverification requirements that had become too burdensome.

Tara Dawson McGuinness, of the New America think tank, which is focused on improving families’ economic security, said that when using data it’s important to keep an eye on the purpose of the project, or “Data for what?” McGuinness, who recently co-authored a book about the abundance of IT and data available to government, said one salient example outlined in her book was how a nonprofit called Community Solutions tracks homeless populations in Rockford, Illinois, and dozens of other communities. In Rockford, she said, the group aided the government in piecing together a list of its approximately 100 homeless residents, a task she said required a lot of interagency collaboration. 


“Instead of having different systems telling them the rate of homelessness or who is in the hospital or who is showing up in the shelter, they really work to create data that sees the whole people,” McGuinness said. “They have good controls over how you do that in a way that respects privacy, but they also really understand that one person’s challenge with being housed is not the same as the 100th person.”

Many programs rely heavily on federal data, she said, but those data sets aren’t necessarily helpful for towns like Rockford and groups like Community Solutions, which aims to eliminate homelessness.

“I think [Rockford’s project is] a really great example of how real-time action at the local level is modeling something we could be thinking about federally,” she said. “It’s an inspiration. It’s data-dependent, but it isn’t simply Big Data. It’s about really understanding the individuals and doing some hands-on engagement with the folks that they’re serving.”

Qualitative data

Trooper Sanders, the chief executive of Benefits Data Trust, a Philadelphia nonprofit that connects people to public benefits programs, said he’s a big fan of “benefits matching” — using data on people already receiving one benefit to find other support they may be eligible to receive. But it’s important, he said, to monitor the process so there’s also data on which practices are most effective at making those matches stick.
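
To make the idea concrete, here is a minimal sketch of how such a match might work in code. It is not Benefits Data Trust’s actual system; all field names, program names, thresholds and eligibility rules below are hypothetical placeholders assumed for illustration.

```python
# Illustrative sketch of "benefits matching": start from people already
# enrolled in one program and flag those who appear eligible for another.
# Everything here (fields, thresholds, rules) is a hypothetical placeholder,
# not any real agency's criteria or any real organization's software.

from dataclasses import dataclass


@dataclass
class Recipient:
    name: str
    household_size: int
    monthly_income: float
    enrolled_programs: set[str]


def likely_eligible_for_energy_assistance(r: Recipient) -> bool:
    """Hypothetical stand-in for a second program's income test."""
    # Placeholder threshold: $1,500 per household member per month.
    return r.monthly_income <= 1500 * r.household_size


def match_candidates(snap_recipients: list[Recipient]) -> list[Recipient]:
    """Flag enrollees in one program who look eligible for, but are not
    yet enrolled in, a (hypothetical) second program."""
    return [
        r for r in snap_recipients
        if "energy_assistance" not in r.enrolled_programs
        and likely_eligible_for_energy_assistance(r)
    ]


if __name__ == "__main__":
    people = [
        Recipient("A. Rivera", 4, 3200.0, {"snap"}),
        Recipient("B. Chen", 1, 2400.0, {"snap"}),
    ]
    for candidate in match_candidates(people):
        print(f"Outreach candidate: {candidate.name}")
```

Monitoring which outreach to those flagged candidates actually results in enrollment is the follow-on data Sanders described.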


“Was it the language that was used in a particular piece of outreach? Was it the method?” Sanders said. “And then also looking at the broader ecosystem.”

In Philadelphia, that means identifying and working with community-based organizations that have credibility with the communities of color that his organization seeks to serve, he said.

Like the other speakers, Sanders backed developing a more thorough understanding of the machinery that produces various outcomes, an effort he described as gathering “qualitative data.”

“We can really glean insights from people and families by talking to them, by asking them, and one of the pieces around equity, it’s going to be in the eye of the beholder and in the communities, and do they have the experience that is dignified, that is easy and that is relevant?” he said. “And I think from mixing both the quantitative and the qualitative, we can figure out what are some of the big leadership decisions and policy decisions that we need to make, but then also some of the administrative tweaks that are required.”

Written by Colin Wood

Colin Wood is the editor in chief of StateScoop and EdScoop. He's reported on government information technology policy for more than a decade, on topics including cybersecurity, IT governance and public safety.
