A New Way to Think About Managing Data

Any agency undertaking an enterprise-level data initiative is likely to experience some serious growing pains.

Let’s say an agency’s security operations center wants to use analytics to improve its ability to detect and respond to threats throughout its environment. The problem is that the necessary data is scattered across the organization, stored in siloed systems both on premises and in the cloud or, more likely, across multiple clouds.

Legacy data management solutions simply were not designed with that kind of scenario in mind, leaving data specialists stuck with laborious manual processes.

“The way that we have many of our systems today, making that data available to our government analysts can be very cumbersome,” said Chris Townsend, Vice President for Public Sector at Elastic, which helps agencies find the answers they need from data in real time and at scale.

A Data Lake Reality Check

Some organizations try to alleviate their data complexity by dumping all their data into a central repository, such as a data lake. In theory, it makes sense to manage all that data from a single source, but Townsend doesn’t expect many agencies to choose this approach in the long term.

One problem is scale. In an enterprise-level initiative, too much data is spread too widely for a central repository to be practical. But even where it is doable, a data lake doesn’t fit how most agencies want to operate, Townsend said. They’re going to continue to keep some of their data on premises and some on their various cloud platforms.

“We need solutions that are more flexible in allowing access to all that information in a very fast, efficient and cost-effective manner that’s operationally sound,” he said. “That’s where things are going.”

A Different Tack

An increasingly popular alternative is to leave the data in place and index it with a common platform that makes it readily accessible and findable. This is the approach Elastic takes.

This decentralized architecture — sometimes referred to as a data mesh — makes it possible to handle growing volumes of data without running into performance problems.

The Department of Homeland Security has deployed the Elasticsearch platform as part of its Continuous Diagnostics and Mitigation (CDM) program. Elasticsearch is the foundation for the CDM Dashboard II, which enables cyber teams at the Cybersecurity and Infrastructure Security Agency to search cyber data across hundreds of federal agencies as part of threat-hunting efforts.

Without this distributed architecture, such a sweeping data effort would be very costly and complex, Townsend said. Elasticsearch allows the data to remain in its original location with each agency, which is also a benefit for security and compliance.
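
To make the idea concrete, the sketch below shows what querying data in place can look like using Elasticsearch’s cross-cluster search through the official Python client. The cluster aliases, index name and field values here are hypothetical placeholders for illustration only; the article does not describe the CDM Dashboard’s actual configuration.

```python
# A minimal sketch of searching data that stays in place, using Elasticsearch
# cross-cluster search via the official Python client (elasticsearch 8.x).
# The cluster aliases ("agency_a", "agency_b"), index name ("cdm-host-data")
# and query fields are hypothetical examples, not the CDM program's real setup.
from elasticsearch import Elasticsearch

# Connect to a local coordinating cluster; the indexed data itself remains on
# the remote clusters that each agency operates.
es = Elasticsearch("https://localhost:9200", api_key="YOUR_API_KEY")

# One request fans out to both remote clusters and merges the results, so no
# data has to be copied into a central repository first.
response = es.search(
    index="agency_a:cdm-host-data,agency_b:cdm-host-data",
    query={
        "bool": {
            "must": [
                {"match": {"event.category": "malware"}},
                {"range": {"@timestamp": {"gte": "now-24h"}}},
            ]
        }
    },
)

for hit in response["hits"]["hits"]:
    print(hit["_index"], hit["_source"])
```

In this arrangement, each remote cluster indexes and holds its own data; the coordinating cluster only federates the query and merges the results.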

The Elasticsearch platform can also help agencies with a variety of other use cases, including enterprise performance monitoring, real-time situational awareness, investigative intelligence and AI insights.

“We’re deployed in some of the largest government agencies at scale, because this is a very flexible, efficient way to index and ingest the data,” he said.

This article appears in our guide, “5 Habits of Highly Productive Agencies.” To learn more about how organizations can promote and maintain a productive environment, download the guide.

 

