
California’s building a federated wildfire data hub backed by UC San Diego supercomputers

The California Wildfire Commons is bridging data silos that have long hindered coordinated wildfire response.

The San Diego Supercomputer Center at the University of California San Diego last month powered on a new data hub called the California Wildfire Commons, playing a central role in helping the state modernize how it shares and uses wildfire data.

Built with the California Wildfire and Forest Resilience Task Force, a group established in 2021 as part of Gov. Gavin Newsom’s Wildfire and Forest Resilience Action Plan, the hub aims to break down long-standing data silos that have hindered coordinated wildfire response. Ilkay Altintas, the center’s chief data science officer, said the university’s supercomputer provides the high-performance computing, cloud infrastructure and data management that allow the new platform to integrate datasets, models and tools across 19 state agencies. The data focuses on vegetation treatments and wildfire resilience metrics.

“Typically, these type of data are in individual silos and it’s hard to bring them together when you’re trying to use them,” Altintas said in an interview. The commons “brings together … a capability to use all of these datasets without a massive effort.”

California’s wildfire data has long been fragmented across systems like fuels databases, weather feeds, incident tracking tools and regional resource kits. Altintas explained that the supercomputer catalogs data using standardized metadata and tagging systems, enabling interoperability between datasets that were previously incompatible. The hub is federated, meaning it presents a unified view of the state’s many wildfire data sources without moving or copying them.


The federated commons allows data to remain with its original owner while still being searchable and usable by other agencies through a shared interface. Systems like vegetation mapping tools, fire behavior models and incident tracking platforms all connect to a unified environment.

Altintas said the platform operates as a distributed data ecosystem rather than a single centralized database.

“We bridge the data to compute resources so when you search and access data, you actually can use them in [one] place, rather than moving data around, or moving small amounts of data and bringing them together in a compute platform, so [the data are] what we refer to as analysis ready. Sandboxing is a part of the goal here,” Altintas said.

Forest Schafer, deputy director of the California Wildfire and Forest Resilience Task Force, said the shift to more data-driven emergency management is significantly changing operations, particularly as California wildfires grow more complex and destructive.

Using the commons, Schafer said, agencies can more easily coordinate fuel treatments, share situational awareness during active fires and align long-term resilience planning. Instead of relying on disconnected systems, decision-makers gain access to near real-time, multi-source intelligence that improves forecasting and response.


“All these different datasets are being used by regional and local planners to plan their projects into one place in the areas of greatest hazard,” Schafer said. “They’re able to move the needle on wildfire risks to areas throughout the state.”

In addition to the commons, Schafer said the task force also created an interagency treatment dashboard, which tracks how the state is creating more defensible space by thinning vegetation, removing ladder fuels and using fire-resistant plants to reduce wildfire risk in rural areas.

He said the task force aims to treat one million acres annually, and plans to regularly update the dashboard using data collected by LIDAR.

“We’re really proud that we can report on all of the acres that are treated throughout the state and really be able to show the impact that’s being made. But we want to shift towards reporting on the outcomes of that work,” Schafer said. “There was 700,000 acres treated in a given year, but what did treating those acres actually accomplish?”

Written by Sophia Fox-Sowell

Sophia Fox-Sowell reports on artificial intelligence, cybersecurity and government regulation for StateScoop. She was previously a multimedia producer for CNET, where her coverage focused on private sector innovation in food production, climate change and space through podcasts and video content. She earned her bachelor’s in anthropology at Wagner College and master’s in media innovation from Northeastern University.
