
Weaving Disconnected Sources Into a Data Fabric

Government agencies have turned their attention to limiting the trafficking of opioids and prescription drugs — an issue known as drug diversion. For such a pressing national problem, it seems like cities, states and federal agencies would be working in chorus.

But as those familiar with public sector data will recognize, helpful information is often inaccessible, practically speaking, because it’s dispersed across data sources.

“Agencies are getting only a small, narrow view of the information they need to tackle the problem of drug diversion,” said Stephen Moore, Chief Technology Officer at AlphaSix, a service provider that focuses on the convergence of big data and cybersecurity.

One emerging solution is to create a cohesive data framework for integrating and analyzing data, so GovLoop asked Moore for a few tips on how to do so. Just discovering use cases, such as applying big data to the opioid crisis, is a good start, he said.

Build a Data Analytics Framework

What single sort of infrastructure or policy can support a rapidly changing field like data? The short answer: There is none.

“There isn’t a big data silver bullet,” Moore said. “You have to tie together the infrastructure, systems and security with a flexible analytical framework.”

What Moore describes – infrastructure, systems and security on top of a flexible analytical framework – is known as a data fabric. A data fabric is a set of services and systems that connects data and coordinates its management across an enterprise. AlphaSix offers a data analytics framework built on a combination of underlying technologies that lets agencies ingest and analyze large volumes of data.
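At its core, a data fabric is a coordination layer over many disconnected sources. The sketch below illustrates that idea in miniature: a registry of source connectors exposing one uniform read interface. All names here (DataFabric, register_source, the sample sources) are illustrative assumptions, not an actual AlphaSix API.

```python
from typing import Callable, Dict, List


class DataFabric:
    """Toy coordination layer over many disconnected data sources."""

    def __init__(self) -> None:
        self._sources: Dict[str, Callable[[], List[dict]]] = {}

    def register_source(self, name: str, reader: Callable[[], List[dict]]) -> None:
        # Each source supplies a reader that yields records as dicts.
        self._sources[name] = reader

    def read_all(self) -> List[dict]:
        # Pull from every registered source, tagging provenance so
        # downstream analytics know where each record came from.
        records = []
        for name, reader in self._sources.items():
            for rec in reader():
                records.append({**rec, "_source": name})
        return records


# Hypothetical sources standing in for real agency systems.
fabric = DataFabric()
fabric.register_source("state_pdmp", lambda: [{"rx_id": 1}])
fabric.register_source("federal_claims", lambda: [{"claim_id": 9}])
print(len(fabric.read_all()))  # → 2
```

The point of the pattern is that analysts query one fabric rather than chasing each system's own interface.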

If that sounds like a big commitment, it is. Moore said a data fabric requires support from the top brass, who must champion it throughout the enterprise.

Collect and Load Data Sustainably

Data collection is not a one-time occurrence. Agencies collect data on a regular basis, and how they do so is always changing – as are the data types they collect.

Some data is structured, some is unstructured, and some is semi-structured – a mix of the two. Without a data analytics framework, all of this information stays in its respective silos. But after stitching a data fabric together, these data sources begin to interact in a clean, structured pool that experts can draw from.
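To make the structured-versus-unstructured point concrete, here is a minimal sketch of normalizing both kinds of records into one common pool. The field names and the free-text pattern are hypothetical; a real pipeline would use proper NLP and validated schemas rather than a regular expression.

```python
import re
from typing import List

# Structured records, e.g. rows exported from a database.
structured = [{"prescriber": "D. Smith", "pills": 120}]

# Unstructured records, e.g. free-text case notes.
unstructured = ["Prescriber D. Jones dispensed 300 pills last month."]


def parse_note(note: str) -> dict:
    # Crude extraction from free text, for illustration only.
    m = re.search(r"Prescriber (.+?) dispensed (\d+) pills", note)
    return {"prescriber": m.group(1), "pills": int(m.group(2))} if m else {}


# Both sources now share one schema and can sit in one pool.
pool: List[dict] = structured + [parse_note(n) for n in unstructured]
print(pool)
```

Once records share a schema like this, the same queries and analytics apply to all of them regardless of where they originated.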

Resolve Analysis Into Insights

Once big data is clean and compiled in a central framework, agencies get to the fun part. They can now layer on machine learning, visualization tools or queries and filters.

At this stage, agencies really start to solve the use cases they identified up front. Analysts spot trends and pinpoint anomalies.

In the example of drug diversion above, they could pick out doctors who prescribe inordinate amounts of opioids and identify neighborhoods with higher rates of use.

“Even more important is for agencies to see the big picture of what big data can do,” Moore said.

This article is an excerpt from GovLoop’s recent guide, “Your Data in the Year of Everything Else: A GovLoop Guide.”
