
Data-Driven Decisions at the Speed of Events

The challenge federal agencies face with data isn’t just one of constantly exploding volume. It’s also one of velocity: the ability to ingest, analyze and draw value from that data in the blink of an eye. Too often, however, agencies are working with systems and tools designed for a smaller scale and a more deliberate pace of operations.

Agencies looking to access and analyze the data they collect face several built-in hurdles that block the path to turning that data into actionable information while complying with security and privacy requirements.

1. Disparate technologies and architectures. The mounting volume of data is coming in multiple formats from a growing number of sources, including cloud applications and IoT sensors. Agencies’ data stores are often siloed and incompatible, and they lack the automation required for easy retrieval.

Legacy systems were never designed with this kind of data, or its nonlinear growth, in mind, said Srini Srinivasan, Chief Product Officer and Founder of Aerospike. “This nonlinearity is not something these products were designed to handle,” he said.

2. Proprietary systems and solutions. Legacy systems aren’t equipped for real-time data sharing because of incompatible, proprietary formats and an inability to accommodate advanced technologies such as artificial intelligence (AI). This works against the goal of using data in a timely fashion. “Many of these decisions have to be made in some level of real-time service-level agreement,” Srinivasan said. “There is typically somebody who is waiting for a response.”

3. Organizational silos. Agencies have different structures and missions, which can slow or prevent information sharing, even with the latest technology tools.

4. Dispersed geographic locations. Agencies’ reach, whether across the country or around the globe, throws other hurdles in the way. For example, emergency response or military missions could be in remote areas. And mobility further complicates the decision-making process since it may depend on the location of a user — or multiple users — on the move.

Solution: Tools for Putting Data into Action

With the right platform, agencies can ingest, analyze and make use of data in real time, at any scale and with constant uptime. Among the tools that can make this possible (see the brief sketch after the list):

  • A flash-optimized storage layer that treats flash as memory, providing accelerated input/output operations, higher performance and small-footprint scalability.
  • Storage indices held in dynamic random-access memory, with data on optimized solid-state drives, providing predictable performance at any scale.
  • A multithreaded, massively parallel design that scales up or out, adding capacity while using as few nodes as possible.
  • Self-healing clusters that provide a single hop to data, along with superior uptime, availability and reliability.
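
To make these building blocks concrete, below is a minimal sketch of how an application might write and read a record against such a platform, using the Aerospike Python client. The host address, namespace, set, bin names and timeout policy are illustrative assumptions rather than details from the report, and exact policy fields can vary by client version.

    # Minimal sketch: write and read a record with a bounded response time.
    # Assumes the Aerospike Python client and a reachable cluster; the host,
    # namespace ("test"), set ("events") and bin values are placeholders.
    import aerospike

    config = {"hosts": [("127.0.0.1", 3000)]}
    client = aerospike.client(config).connect()

    # Key is (namespace, set, user key); the write lands in a namespace that
    # keeps its index in DRAM and its data on SSD.
    key = ("test", "events", "sensor-0042")
    client.put(key, {"temp": 71.3, "status": "ok"})

    # Bound the read so a caller waiting on a real-time service-level
    # agreement gets a fast answer or a fast failure (timeout in milliseconds).
    _, _, record = client.get(key, policy={"total_timeout": 50})
    print(record)

    client.close()

In this model, the in-memory index and flash-resident data are what allow a single hop to a record within a predictable latency budget.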

Together, these elements create a hybrid architecture that can produce five key outcomes:

Ingesting and analyzing big data. A distributed system that pairs flash storage with the latest hardware technology, the cloud and extreme-scale applications can provide immediate access to data, no matter its source or location.

Centralizing systems in real time. A platform that combines systems and scales on the fly can put control of the data into a central control panel. It can also incorporate all of an agency’s historical data, putting everything at users’ fingertips.

Integrating modern applications with legacy systems. A flexible system that can incorporate input from frameworks such as Hadoop or Spark can also make use of machine learning (ML) and AI to, for example, track a cyberthreat in real time.
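
As a rough illustration of the analytics side of such an integration, the sketch below uses PySpark to flag suspicious activity in exported event logs with a simple threshold; the input path, column names and threshold stand in for a real connector feed and a trained ML model, and are assumptions rather than details from the report.

    # Minimal PySpark sketch: flag sources with an unusual number of failed logins.
    # The input path, column names and threshold are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("threat-triage").getOrCreate()

    events = spark.read.json("/data/events/*.json")   # exported event records

    flagged = (
        events.groupBy("src_ip")
              .agg(F.sum("failed_logins").alias("failures"))
              .filter(F.col("failures") > 100)        # crude stand-in for an ML score
    )

    flagged.write.mode("overwrite").parquet("/data/flagged")
    spark.stop()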

Delivering new services in near-real time. A real-time platform can deliver new services highly efficiently. For example, it could allow IT teams to implement a new service in an hour that would otherwise take a day to deploy.

Lowering total cost of ownership (TCO). An edge-to-core platform that delivers five-nines uptime and predictable high performance at any scale, while maintaining a small network footprint, can also substantially lower TCO, keeping capital and operational expenditures as low as possible.
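
For context on the “five-nines” figure, the quick arithmetic below converts 99.999 percent availability into an annual downtime budget.

    # Downtime budget implied by 99.999% ("five-nines") availability.
    availability = 0.99999
    minutes_per_year = 365.25 * 24 * 60
    downtime_minutes = (1 - availability) * minutes_per_year
    print(f"{downtime_minutes:.2f} minutes of downtime per year")  # about 5.26 minutes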

This article is an excerpt from GovLoop’s report “How to Manage the Crunch of Real-Time Data-Intensive Workloads.” Download the full report here.
