
How to Solve the Cyber Data Conundrum

An interview with John Harmon, Regional Vice President of Federal Cyber Solutions, Elastic

Agencies have so much data that they can use to detect hidden cyber threats — if only they can figure out how to use it without breaking their budgets.

The Biden administration’s “Executive Order on Improving the Nation’s Cybersecurity” highlights the value of data. It directs agencies to maintain network and system logs and other data relevant to threat and vulnerability analysis, and to deploy endpoint detection and response (EDR) systems.

The challenge is the sheer volume of data involved, said John Harmon at Elastic, which provides a free and open platform for search, observability and security. “If they wanted to keep all the data that’s in these new requirements and put it into an actionable format, it could literally take their entire budget just to fulfill these requirements,” he said.

The Value of Cyber Telemetry

For a long time, cyber data was largely an untapped resource.

Every system on the network — from end-user devices and servers to routers and switches — creates a digital trail of its round-the-clock activity. But most agencies use only small slices of the available data, with security products providing different slices. They are missing the big picture.

Consider the 2020 SolarWinds incident, in which an attack went on for months before it was detected. If agencies had been collecting and analyzing data from network and system logs and EDR systems, they could have detected the unusual data exfiltration activity or other telltale signs of the attack.

How to Tame the Data

The EO seeks to ensure that agencies capture the necessary data, but it still leaves them to figure out how to manage it. Agencies might want to keep the most recent data close at hand, but they struggle with what to do with the rest.

Elastic has developed an approach based on related security-specific solutions and its Elasticsearch platform, which is widely used in government. The idea is to ingest all that data, create a searchable snapshot and store it in low-cost, cloud-native data storage — what Elastic calls frozen tier storage.
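As a rough illustration of how that lifecycle works in Elasticsearch (not a configuration published in this article), the sketch below defines an index lifecycle management (ILM) policy that rolls log indices through a hot phase and then converts them into searchable snapshots in the frozen tier. The endpoint, credentials, policy name, repository name and timings are all assumptions made for the example.

```python
# Sketch: an ILM policy that moves aging log indices into the frozen tier
# as searchable snapshots. Names and timings ("logs-policy",
# "log_archive_repo", 30 days) are illustrative assumptions.
import requests

ES_URL = "https://localhost:9200"  # assumed local Elasticsearch endpoint

policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {
                    # Roll over to a new index daily or at 50 GB per shard
                    "rollover": {"max_age": "1d", "max_primary_shard_size": "50gb"}
                }
            },
            "frozen": {
                "min_age": "30d",  # after 30 days, shift to low-cost object storage
                "actions": {
                    "searchable_snapshot": {
                        # Snapshot repository registered separately (see the
                        # storage sketch further below)
                        "snapshot_repository": "log_archive_repo"
                    }
                }
            },
            "delete": {
                "min_age": "365d",  # assumed retention window
                "actions": {"delete": {}}
            }
        }
    }
}

resp = requests.put(
    f"{ES_URL}/_ilm/policy/logs-policy",
    json=policy,
    auth=("elastic", "changeme"),  # placeholder credentials
    verify=False,                  # skip TLS verification in a local sandbox only
)
resp.raise_for_status()
```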

The data is indexed before going into storage, which makes it cost-effective and easy to search and retrieve as needed. A local cache of recently queried data is maintained for faster repeat searches, but the cache needs to be only a fraction of the size of the data stored in the frozen tier.
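Because the archived indices remain searchable, querying them looks the same as querying any live index; Elasticsearch pulls just the pieces it needs from object storage and keeps recently used blocks in the local cache. A minimal sketch, assuming a hypothetical `logs-*` index pattern and field names:

```python
# Sketch: searching frozen-tier data is an ordinary search request.
# The index pattern and field names here are hypothetical.
import requests

ES_URL = "https://localhost:9200"

query = {
    "query": {
        "bool": {
            "filter": [
                # Look back six months, well past what hot storage would hold
                {"range": {"@timestamp": {"gte": "now-180d", "lte": "now"}}},
                {"term": {"event.action": "network-flow"}},  # hypothetical field
            ]
        }
    },
    "size": 100,
}

resp = requests.post(
    f"{ES_URL}/logs-*/_search",
    json=query,
    auth=("elastic", "changeme"),  # placeholder credentials
    verify=False,
)
hits = resp.json()["hits"]["hits"]
print(f"Matched {len(hits)} archived log events")
```

In recent Elasticsearch releases, the size of that on-disk cache on dedicated frozen-tier nodes is governed by the `xpack.searchable.snapshot.shared_cache.size` node setting, which is why only a fraction of the archived data ever needs local disk.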

Further, agencies save money by keeping the data in low-cost object storage (e.g., Amazon S3, Azure Blob Storage or MinIO) rather than managing it in the data center. Retrieving data from the frozen tier takes longer than retrieving it from local servers, but it is still just a matter of minutes, about the time it takes to get a cup of coffee, said Harmon.
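Connecting Elasticsearch to that object storage means registering it as a snapshot repository, the one the lifecycle policy sketched earlier refers to (in practice the repository is registered first). A minimal sketch, assuming an S3-compatible bucket; the repository and bucket names are hypothetical:

```python
# Sketch: registering an S3-compatible bucket as the snapshot repository
# backing the frozen tier. Names are hypothetical; bucket credentials
# belong in the Elasticsearch keystore, not in this request.
import requests

ES_URL = "https://localhost:9200"

repo = {
    "type": "s3",  # Azure Blob Storage repositories use "type": "azure"
    "settings": {
        "bucket": "agency-log-archive",  # hypothetical bucket name
        # For an S3-compatible store such as MinIO, point the S3 client at it
        # via the s3.client.default.endpoint setting in elasticsearch.yml.
    },
}

resp = requests.put(
    f"{ES_URL}/_snapshot/log_archive_repo",
    json=repo,
    auth=("elastic", "changeme"),  # placeholder credentials
    verify=False,
)
resp.raise_for_status()
```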

With this approach, “agencies can meet the letter of the law of the executive order, have all that data and still have it actionable,” he said.

This article appears in our guide “Bright Ideas for Making Cyber Stick.” To see more about how agencies are implementing cybersecurity, download the guide.

