
How to Turn Operational Noise into Actionable Insights

Government IT operations are becoming increasingly complex. As helpful as advancements like mobility, apps, virtualization, cloud and software-defined models can be in government, they also contribute to a highly complex IT environment.

Monitoring these systems and keeping them running smoothly demands resources of its own.

In addition, current troubleshooting and monitoring solutions are often disparate point tools, each monitoring a piece of the puzzle but not seeing the system as a whole. This gap in data visibility creates more work for agencies, leaving little time for organizational learning or efficiency improvements. Not to mention that IT staffs are also tasked with less efficient activities, like simply keeping the lights on.

So how can this be improved? At GovLoop’s Government Innovators Virtual Summit, govies learned how agencies can take the first step toward operational intelligence and integrate data from all sources into an easily searchable repository. To provide a more in-depth understanding, experts Ted Okada, Chief Technology Officer of the Federal Emergency Management Agency (FEMA), and Bill Babilon, Public Sector Operations Specialist at Splunk, shared their insights as well as some helpful case studies.

Making Machine Data Accessible, Usable and Valuable to Everyone
Most agencies tend to compartmentalize their data and IT structures. “Instead of using a holistic framework, organizations are much more compartmentalized where people deal only with their areas of expertise,” Babilon said. “This makes IT much more complex and siloed.”

Not only does IT become increasingly complex, but without a holistic view of the data, agencies are reactive to issues rather than proactive. “That means agencies are spending over 80 percent of their time just keeping the lights on rather than on other important mission priorities, like customer service,” Babilon said.

So how can machine data help? Machine data is digital information created by the activity of computers, mobile phones, embedded systems and other networked devices. Unlike traditional structured or multi-dimensional data, it is non-standard, highly diverse, dynamic and high volume.

Take the example of purchasing a product on your tablet or smartphone: the purchase transaction fails, you call the call center and then tweet about your experience. All these events are captured – as they occur – in the machine data generated by the different systems supporting these different interactions.

“If you can correlate and visualize related events across these disparate sources, you can build a picture of activity, behavior and experience,” Babilon said. “And what if you can do all of this in real-time? Then, you can respond more quickly to events that matter.”

For example, if an agency captured a citizen’s Twitter ID in their customer profile, the agency could correlate that profile with other relevant information and build a more holistic picture of the citizen as a customer. Such data is not only helpful for improved customer service; agencies can extrapolate this example to a wide range of use cases such as security and fraud, transaction monitoring and analysis, web analytics and IT operations.
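To make the correlation idea concrete, here is a minimal sketch in Python. All of the system names, field names and sample records below are hypothetical illustrations, not an actual agency schema or a Splunk API: three event sources (a purchase log, a call center log and social media) are joined into one time-ordered activity timeline, using a stored Twitter handle to link the social events back to the same customer profile.

```python
from datetime import datetime

# Hypothetical event records from three disparate systems.
purchase_events = [
    {"customer_id": "C123", "time": "2016-05-01T10:02:00", "event": "purchase_failed"},
]
call_center_events = [
    {"customer_id": "C123", "time": "2016-05-01T10:15:00", "event": "support_call"},
]
social_events = [
    {"twitter_id": "@jdoe", "time": "2016-05-01T10:40:00", "event": "negative_tweet"},
]

# The customer profile links the Twitter handle to the internal customer ID --
# this stored mapping is what makes cross-source correlation possible.
profiles = {"@jdoe": "C123"}

def correlate(customer_id):
    """Build a single time-ordered activity timeline for one citizen."""
    events = [e for e in purchase_events if e["customer_id"] == customer_id]
    events += [e for e in call_center_events if e["customer_id"] == customer_id]
    events += [e for e in social_events
               if profiles.get(e.get("twitter_id")) == customer_id]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["time"]))

timeline = [e["event"] for e in correlate("C123")]
print(timeline)  # ['purchase_failed', 'support_call', 'negative_tweet']
```

The timeline makes the citizen’s end-to-end experience visible in one place; a production system would do the same join continuously over streaming data rather than over in-memory lists.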

Using Actionable Insights for Natural Disasters
Using data and IT in a more holistic manner can not only improve agency outcomes in customer service and other mission-critical areas: it can even save lives. A senior executive managing a natural disaster, for example, can use machine data to get a real-time and accurate view of their community’s needs. “That senior executive can track sentiment analysis on social media platforms, like Facebook and Twitter,” Babilon said. “Then correlate that information with other field reports received from field staff. You can then determine who needs certain types of assistance and in which areas.”

In fact, FEMA is using similar tactics with machine data and actionable intelligence for natural disasters. “We have to make the data usable across a pretty complex mission space,” Okada said. “Especially natural disasters, like hurricanes and floods.”

One of FEMA’s more recent projects is OpenFEMA, a program aimed at delivering mission data to the public in machine-readable formats. “Our focus will always be disaster survivors, first responders and citizens,” Okada said.

The goal of OpenFEMA is to release agency data to the public in machine-readable formats on fema.gov and data.gov. Additionally, Okada and his team hope the program will help the agency engage with external partners who can leverage FEMA’s data to improve outcomes in support of survivors and first responders.

Ultimately, the experts agreed that while IT is becoming increasingly complex and government data only continues to increase in volume, it is possible to turn all the noise into actionable insights. Using machine data and actionable intelligence, agencies can harness their data more efficiently, better meet their mission-critical needs and better serve the public.

This blog post is a recap of a session from GovLoop’s recent Government Innovators Virtual Summit. For more coverage, head here.

 
