Want generative AI? Just ‘pilot, scale, adopt,’ says McKinsey
As governments race to integrate generative artificial intelligence tools into their operations and digital services, a report published Wednesday by the consulting firm McKinsey offers states a framework it calls “Pilot, Scale, Adopt.”
The report provides a template states can use over a six-month period to effectively implement generative AI in statewide data governance practices. It includes tasks for governor’s offices, executive agencies and top technology officials.
According to the report, the governor’s office should be responsible for the adoption road map and governance, including the creation of AI task forces to identify the risks and benefits of generative AI; dozens of states have already established such task forces through legislation or executive orders.
Trey Childress, one of the report’s authors, told StateScoop that identifying the risks of generative AI is an important step to adoption, but risk shouldn’t stop governments from experimenting.
“I think we are just seeing a lot of interest, enthusiasm there. And it’s the duality of [being] a little afraid of the risks, but also wanting to get started,” Childress said. “And you can actually address those in tandem.”
According to the framework, state agencies are responsible for assessing their technology and data needs and identifying areas of their department or IT infrastructure that could benefit from generative AI.
After assessments are completed, state chief information officers can hire new staff or train current staff on the updated systems and test generative AI in the following areas: operational efficiency, digital service experiences and practical insights through content summarization.
“What we’re seeing in practice is a duality of being mindful of the risks, but the opportunities that are ahead and being agile enough across the state government to think about these over the near term horizon all the way into the intermediate term [and] the next year,” Childress said of the report’s recommended framework.
The report also urges states to understand the difference between traditional AI, which many governments currently use to analyze structured data and make predictions, and generative AI, which can produce new content, such as text or images, from unstructured data and language prompts. That difference, Childress said, makes the path to generative AI adoption much faster.
“You do see a lot of governments using traditional AI to find statistical inferences on things around traffic patterns or to track pollution patterns, for example, that help governments make better decisions,” he said. “What you’re beginning to see in generative AI adoption is the use of the technology to support humans in the work that they do.”
The report also says it’s important for states to dispel misconceptions about generative AI that can create unnecessary roadblocks, such as the notion that these tools lie somewhere in the “distant future” or that updating legacy IT infrastructure isn’t worth the investment. In fact, Childress said, many generative AI tools can be deployed with minimal changes to governments’ current infrastructure.
“It’ll always be evolving and should be evolving, but to get it up and running can be a pretty short timeline, compared to other more traditional AI that requires you to get your database installed, and files and data cleaned up, and so on,” Childress said.