The following is an interview with Chris Steel, Chief Solutions Architect at Software AG Government Solutions. To learn more about how your agency can excel with big data, be sure to check out our guide: The Big Data Playbook for Government.
With big data and analytics programs, agencies can reveal new insights and improve their decision-making in ways they were unable to before. But to unlock those insights, government agencies must first understand how to fully leverage all of their high-value, authoritative data. That means the way government stores, accesses and processes data must change before an agency can become truly data-driven.
“We see a lot of data within agencies stored across different information silos,” said Christopher Steel, Chief Solutions Architect at Software AG. “One of the biggest hurdles to getting all of the data together is the fact that it’s often stored in different formats.”
Steel explained that agencies typically collect two categories of data: static and streaming. Static data is information stored in a traditional database, data warehouse or Hadoop cluster, where analysis is run after the data lands. Streaming data is real-time information arriving from sensors, devices or social media. To get the most benefit, agencies must be able to run analysis on both.
“Organizations should be able to take data that’s in one form and convert it to a form that’s needed, and share that data in real-time,” said Steel. “This empowers leaders to make decisions they previously weren’t able to make. This entire process can be done in real-time, whereas before it was either not possible, because there was just no mechanism to get the data into the format needed, or it would take a relatively long amount of time.”
Government agencies must have the ability to extract value from static data. This data might be stored in a system like Hadoop. But to get a full understanding of an issue, static data should be merged with real-time data for a complete analysis. “If you have a Hadoop solution where you’re running analysis on an hourly, daily or weekly basis, what Software AG can bring to the table is the ability to augment those analyses with real-time data. We can color that data with contextual information from real-time feeds,” said Steel.
Even when information is brought together, real-time data still needs to be put into the proper context by measuring it against historical trends. For instance, transactional systems host troves of historical data, but are now also producing volumes of real-time data. When real-time data is measured against historical trends, organizations can quickly spot abnormalities and may be able to thwart waste, fraud and abuse, in some cases before it even happens.
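A minimal sketch of this idea is comparing each real-time reading against a historical baseline and flagging large deviations. The function name and z-score threshold below are illustrative assumptions, not a description of Software AG's products:

```python
from statistics import mean, stdev

def is_anomalous(value, history, threshold=3.0):
    """Flag a real-time reading that deviates more than `threshold`
    standard deviations from the historical baseline."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return value != mu
    return abs(value - mu) / sigma > threshold

# Historical transaction amounts (illustrative numbers)
history = [100, 102, 98, 101, 99, 103, 97, 100]
print(is_anomalous(500, history))  # True: a spike far outside the trend
print(is_anomalous(101, history))  # False: within normal variation
```

In practice the baseline would come from the batch (Hadoop-style) side and the incoming values from the streaming side, but the comparison step reduces to something like this.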
Another example of why it’s important to blend historical and real-time data comes from when a cyberattack occurs. When this happens, agencies are flooded with data from log files. Often, much of this data is irrelevant, so an agency needs an effective way to sift through the data to quickly identify the abnormality in the log file and provide insights into the breach.
“With real-time big data analytics, agencies can pick out the needles in the haystack as data is streaming by, and they don’t have to worry about taking all of the irrelevant data, putting it into a store, running an analysis, waiting, and then redoing that over and over again,” said Steel.
“Instead, in real-time, the agency can have the ability to pull information out and make a decision while they still have a window of relevance, which allows them to act on the data. Once the agency sees that the attacker has breached their systems, they can go in and take action to stop the hacker from stealing or corrupting any of the data.”
Added Steel, “In the past, agencies were behind the curve. They might be able to eventually figure out that a breach had occurred from the information, but by that time, their data was already stolen or corrupted.”
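The "needle in the haystack" pattern Steel describes can be sketched as a streaming filter that flags a source the moment it crosses a threshold, rather than storing and re-analyzing the full log afterward. The log format, field names and threshold here are hypothetical:

```python
from collections import Counter

def suspicious_sources(log_lines, limit=5):
    """Scan log lines as they stream by and yield a source the moment
    its failed-login count reaches `limit`, without retaining the
    irrelevant entries for later batch analysis."""
    failures = Counter()
    for line in log_lines:
        if "FAILED LOGIN" in line:
            src = line.split()[-1]      # last token holds the source tag
            failures[src] += 1
            if failures[src] == limit:  # fires exactly once per source
                yield src

# Simulated stream: mostly irrelevant entries plus a brute-force pattern
stream = (["OK user=alice src=10.0.0.1"] * 50
          + ["FAILED LOGIN src=10.9.9.9"] * 6)
print(list(suspicious_sources(stream)))  # ['src=10.9.9.9']
```

Because the generator emits a result mid-stream, a decision can be made inside the "window of relevance" instead of after an overnight batch job.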
But by exploring both the historical and streaming data, agencies can gain the ability to analyze data to optimize operations, mitigate risks, and make decisions in real-time.
“For over a decade, Software AG has been working with agencies to help break down their information silos,” said Steel. “We do that through a variety of different products that allow public sector agencies to integrate, transform and make data more accessible and faster to help reduce costs.”
For more information, be sure to check out our guide: The Big Data Playbook for Government.