Extracting Knowledge From Big Data

Big data is a key strategic asset for government agencies. Solving the kinds of complex problems that government deals with requires deep analysis across multiple large-scale datasets, often collected over long periods.

But these valuable datasets and the systems that store them are often decentralized and disconnected. Analyzed on its own, a single dataset may not reveal the entire story, or it may lead to flawed conclusions. As long as data is kept apart in silos, it is difficult to look across datasets for changes, patterns, anomalies and trends. As a result, useful data goes unused and problems go unsolved.
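As a minimal sketch of why cross-dataset analysis matters, consider two siloed tables that only reveal a pattern when joined. The datasets, column names and thresholds below are invented for illustration, not drawn from the report.

```python
import pandas as pd

# Two hypothetical siloed datasets: one team tracks permit backlogs,
# another tracks inspection results. Neither alone shows the full picture.
permits = pd.DataFrame({
    "facility_id": [101, 102, 103, 104],
    "permit_backlog_days": [5, 42, 7, 38],
})
inspections = pd.DataFrame({
    "facility_id": [101, 102, 103, 104],
    "violations_last_year": [0, 6, 1, 5],
})

# Joining across the silos surfaces a trend: facilities with long permit
# backlogs also tend to accumulate violations.
combined = permits.merge(inspections, on="facility_id")
flagged = combined[
    (combined["permit_backlog_days"] > 30)
    & (combined["violations_last_year"] > 3)
]
print(flagged)
```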

With so much data to deal with, it can be challenging for agencies to keep up. Government employees spend much of their time gathering, structuring and cleaning up data, leaving little capacity for analysis and decision-making.

Stephen Moore, Chief Technology Officer at AlphaSix, likens this to the Pareto principle, or the 80/20 rule. “Imagine an agency that spends 80% of its time preparing data and just 20% using that data to find insights,” Moore said. “That’s not the balance you want. By automating the bulk of the data preparation, we can flip the 80/20 and spend more time prioritizing analytics.”
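One way to flip that ratio is to script the repetitive cleanup steps so they run automatically whenever new data arrives. The sketch below is a hypothetical example of that idea; the file name and field names are invented and not part of the report.

```python
import pandas as pd

def prepare(raw: pd.DataFrame) -> pd.DataFrame:
    """Automated cleanup steps that would otherwise be done by hand."""
    df = raw.copy()
    # Standardize column names so downstream tools see a consistent schema.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Drop exact duplicates and rows missing the key identifier.
    df = df.drop_duplicates().dropna(subset=["record_id"])
    # Parse dates once, centrally, instead of in every analysis.
    df["report_date"] = pd.to_datetime(df["report_date"], errors="coerce")
    return df

# Running the same function over every incoming file automates the bulk of
# the preparation work, leaving more capacity for analysis.
raw = pd.read_csv("incoming_report.csv")  # hypothetical source file
clean = prepare(raw)
```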

How agencies manage their data adds to the challenge. It’s not unheard of for separate teams to collect, maintain and secure data, while data scientists — scattered agencywide — perform independent analyses with multiple tools for different purposes. Technology can forge connections between these capabilities to make big data accessible and analyzable, but deployment comes with complexity.

“There isn’t a big data silver bullet,” Moore said. “There isn’t one tool. You have to tie together the infrastructure, systems and security with a flexible analytical framework — creating a ‘data fabric.’ But even more important is for agencies to see the big picture of what big data can do.”

The Solution: An Adaptable Framework for Data Insights

To approach big data in a way that leads to more insight, Moore said agencies first need to commit to a data fabric concept. That commitment must come from an agency's top leaders, people who can shift culture and change mindsets to embrace big data capabilities agencywide. It's a vision that supports investment in big data analysis as an effective means for propelling an agency's mission forward.

“The key is for agencies to work out their use cases for big data,” Moore said. “A use case is the problem you’re trying to solve, the question you’re trying to answer, the need you’re trying to meet and what people want to do with the data. Then you’re ready to establish an architecture for how big data analysis can reveal relevant information about those specific use cases.”

An adaptable underlying analytic framework is the starting point for extracting more value from big data today and preparing for a data-driven future.

“You don’t have to lock data into black box systems or rebuild how you collect, process and store data for every new visualization tool you want to use,” Moore said. Agencies can break out of the cycle of redoing and upgrading their big data systems every few years. Data will change over time, but the analytical framework can evolve and grow as needed.
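One hedged illustration of avoiding that lock-in is to store data once in an open, tool-neutral format and let whichever analysis or visualization tool is current read it. The file and fields below are illustrative assumptions, not a prescription from the report.

```python
import pandas as pd

# Write the data once to an open columnar format (Parquet; requires a
# Parquet engine such as pyarrow) rather than into a proprietary system.
records = pd.DataFrame({
    "agency": ["A", "B", "A"],
    "cases_closed": [120, 85, 143],
})
records.to_parquet("cases.parquet")

# Any tool that reads Parquet can consume the same file later: pandas today,
# Spark or a new visualization platform tomorrow, without rebuilding the
# collection, processing and storage pipeline.
reloaded = pd.read_parquet("cases.parquet")
print(reloaded.groupby("agency")["cases_closed"].sum())
```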

This article is an excerpt from GovLoop’s recent report, “Big Data: Accelerating Time to Mission-Critical Insights.” Download the full report here.
