
Accelerating Insights with Multi-Tiered Flash Storage


This blog post is an excerpt from GovLoop’s recent guide, How You Can Use Data Analytics to Change Government. Download the full guide here.

At its core, big data analytics is about converting data into knowledge that can be used to make informed decisions, which can have a variety of positive implications for government operations.

For instance, big data is being used in cyber defense to accelerate recognition of patterns that represent threats to the network. These insights are enabling organizations of all types to predict and prevent potential cyberattacks before they occur, said Jason Strawderman, Vice President of Federal at Tegile, a California-based company that specializes in next-generation multi-tiered flash storage systems.

“Government healthcare organizations are using big data to manage, analyze, visualize, and extract information that is helping prevent and fight disease,” Strawderman said. “While the benefits of big data in our government are tremendous, the implementation is proving very challenging as legacy infrastructure cannot handle the demanding workload, and disparate systems prevent the sharing of information. Big data is forcing our government to modernize their IT infrastructure and transform culture, policies and processes. These challenges are not just limited to the government; many organizations in the private sector face these issues as well.”

With this transformation comes the need for a different way of storing and managing big data. Agencies must break down legacy storage infrastructure silos while improving performance and increasing capacity, said Chris Tsilipounidakis, Manager of Product Marketing at Tegile.

Customers are moving away from the idea of purpose-built storage, where they use different vendors to store different types of structured and unstructured data, Tsilipounidakis explained. In the past, organizations typically used certain vendors to store mission-critical data on more expensive, high-powered storage systems, other vendors to store less important data on lower-cost storage, and more general-purpose systems for workloads that require an even blend of performance, capacity and economics.

“Big data analytics is causing a massive hyper-consolidation in the datacenter,” he explained. “Hyper-consolidation means that if I’ve got less money, less resources, but more requirements to do the same thing that I was doing before, I need to do it in a consolidated fashion.”

Traditionally, legacy storage systems populated with a mix of solid-state and hard-disk drives have been the go-to for agencies’ storage needs. But to help agencies better meet the workload demands of big data analytics, Tegile is working with them to adopt multi-tiered flash storage.

“With a multi-tiered flash storage architecture, I can tell CXOs, program managers and IT managers that I can give them far better performance than they are getting from their legacy storage systems, at a fraction of the cost. In fact, our new high-density flash arrays offer flash performance at the same cost per gigabyte as traditional hard drives,” Tsilipounidakis said.

The Tegile multi-tiered flash architecture uses different layers of flash to meet agencies’ varying needs, whether they are managing structured or unstructured data. This approach provides a flexible, extensible, scalable, affordable and energy-efficient architecture that can accommodate change rapidly. With dense flash, agencies can get better performance and fit more data into a smaller footprint.

“Our new high-density flash array, IntelliFlash HD, can pack up to four petabytes of raw flash into a single 42U rack, at near the cost per gigabyte of traditional hard drives, while being incredibly power efficient,” said Strawderman.

In terms of data management, security is also top of mind for agencies. That’s why Tegile uses self-encrypting flash drives that are FIPS 140-2 validated, meaning they meet U.S. federal cryptographic standards for protecting data at rest.

“Tegile can adapt to fit any storage environment and workload,” Strawderman said. “Most government agencies are migrating to — or already operating — hybrid cloud architectures consisting of both private cloud and traditional IT. We can interoperate in those existing environments, without them having to rip and replace all the legacy technology.”

When it comes to choosing cloud or traditional IT, it isn’t an all-or-nothing decision, Strawderman added.

However, one area where a model like cloud can be less viable is when agencies want to process data in real time and act on it quickly, as opposed to reviewing large amounts of historical data for less time-sensitive analyses, Tsilipounidakis noted. That’s why agencies must first evaluate their needs and see where different technologies can support those objectives.

Strawderman is an advocate for the so-called “land and expand” approach, where agencies start by using flash storage for their big data needs and, over time, expand flash to other workloads, such as virtualization, virtual desktop infrastructure, and archive and backup, as older legacy systems are retired and refreshed.

“Nobody wants to replace their entire infrastructure, and no one has the budget for that,” Strawderman said. “But big data analytics, server consolidations, desktop virtualization and mobility projects are a few great opportunities to introduce flash into their environment.”
