The government has no shortage of data. In fact, agencies are swimming in terabytes of information. But simply amassing information isn’t enough. To drive decision-making, the data must be actionable, accurate and presented in a way that makes it meaningful. But here’s the issue: figuring out which data is actually actionable is quite challenging.
One of the reasons that’s so hard is governments have to sift through tons of data to find the useful pieces. That can seem impossible with so much information flying around. But with the right tools, employees can access the right data, at the right time, to make thoughtful, data-driven decisions.
At this week’s Government Innovators Summit at GovLoop, we heard from a panel of experts as they discussed the power of data analytics to improve all areas of government – from social services, to public safety, to cybersecurity efforts. The panel included Mike Wons, Chief Technology Officer, State of Illinois; Andy Bailey, Technical Lead for Big Data, NOAA; and Alan Ford, Director of Government Presales, Teradata Government Systems.
Andy Bailey kicked things off by discussing what his agency, NOAA, is doing with data. “NOAA has undertaken an experiment in making its data as open and accessible as possible,” he explained.
NOAA’s data are increasingly popular and valuable, he said, and under the current scheme, NOAA struggles to keep up with public demand – budgets for capacity and security aren’t keeping pace with the costs of providing data access. At the same time, NOAA wants to explore solutions while also promoting use, democratizing data access, facilitating research and enabling new economic opportunities for partners.
So NOAA launched its Big Data Project. As Bailey mentioned, NOAA generates massive amounts of data every day from satellites, radars, ships, weather models and other sources. While these data are open to the public, it can be difficult to download and work with such high volumes.
The NOAA Big Data Project (BDP) was created to explore the potential benefits of storing copies of key observations and model outputs in the cloud, allowing users to compute directly on the data without requiring further distribution. The hope is that such an approach could help form new lines of business and spur economic growth while making NOAA’s data more easily accessible to the American public.
“We are aiming to make data easily accessible and comprehensible, even if you’re not a data science expert,” Bailey said.
Over in Illinois, Mike Wons, the state’s Chief Technology Officer, talked about what his government is doing with its data as well. The state has undergone a transformation in the past few years, including establishing a statewide data analytics practice and hiring the state’s first chief data officer as well as its first data scientist. It also delivered a program called Ed360, which provides state data on student performance across silos. Finally, the state executed a framework that enables agencies across Illinois to share their data.
“There is no shortage of data, just a shortage of actionable insights from data,” Wons pointed out. So Illinois’ goal is to make that data easy to access, use and interpret.
Looking to the future, Ford predicted that data analytics and cloud will make a strong pairing for government. Governments will benefit from the reduced costs these technologies enable, so agencies can do more with less.
This big leap to cloud and data analytics, however, may seem improbable for government agencies with legacy systems. When agencies have spent a tremendous amount of money acquiring systems over time, they are reluctant to replace these familiar platforms. That’s where Ford says “the Holy Grail of analytics” – the unified data architecture – can come in handy. “This is an ecosystem which allows for the harnessing of all of an agency’s data,” he said.
Ford predicted that states will make the most important gains simply by consolidating all their data into a single analytical ecosystem.
In short, all three agreed that the future of data analytics does not have to be daunting. With the right architecture and policies, government can be ready for advanced analytical capabilities and harness large influxes of complex data to further agency initiatives.
This blog post is a recap of a session from GovLoop’s recent Government Innovators Virtual Summit. For more coverage, head here.