It’s no secret that the amount of data in the world is growing. One study predicts there will be up to 44 zettabytes of data by the end of this decade alone — nearly as many digital bits as there are stars in the sky. So how can government get a handle on this ever-growing flood of data?
Many resource-constrained government agencies need to put a storage strategy in place to manage that expanding information before the data management challenge detracts from the mission at hand.
To help the public sector better understand how to extract value from big data, GovLoop partnered with Quantum, a leading expert in scale-out storage, archive and data protection. This industry perspective takes a look at the rapidly growing use of big data, as well as storage strategies that can help government agencies better manage data as they capitalize on newfound insights.
GovLoop spoke with Claire Giordano, Senior Director of Emerging Storage Markets at Quantum, who explained the rise in big data and what government organizations can do to stay focused on the value offered by this wealth of information.
Multi-tier storage — an approach in which data is stored on different types, or “tiers,” of storage based on the needs of the application and the workflow — is increasingly common in big data environments. The flexibility to automate the movement of data between storage types such as flash, high-performance disk, object storage, tape archives and the cloud can be a key enabler in unlocking the value of big data, Giordano said.
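To make the idea concrete, the automated movement between tiers can be sketched as a simple age-based policy: data that hasn’t been accessed recently is migrated to progressively cheaper, slower tiers. This is a minimal illustration only — the tier names, thresholds, and functions below are assumptions for the sketch, not Quantum’s actual product logic.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical tiers, ordered fastest/most expensive to slowest/cheapest.
TIERS = ["flash", "performance_disk", "object_storage", "tape_archive"]


@dataclass
class FileRecord:
    """A file tracked by the (hypothetical) tiering engine."""
    name: str
    last_accessed: datetime
    tier: str = "flash"


def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick a tier based on how long it has been since the file was read.

    The thresholds here (7 days, 30 days, 1 year) are illustrative; a real
    policy would be tuned to the agency's workflow and cost constraints.
    """
    age = now - last_accessed
    if age < timedelta(days=7):
        return "flash"
    if age < timedelta(days=30):
        return "performance_disk"
    if age < timedelta(days=365):
        return "object_storage"
    return "tape_archive"


def migrate(files: list[FileRecord], now: datetime) -> list[tuple[str, str, str]]:
    """Move each file to the tier its access pattern calls for.

    Returns a list of (filename, old_tier, new_tier) moves performed.
    """
    moves = []
    for f in files:
        target = choose_tier(f.last_accessed, now)
        if target != f.tier:
            moves.append((f.name, f.tier, target))
            f.tier = target
    return moves
```

For example, a file on flash that hasn’t been touched in 40 days would be migrated to object storage on the next policy run, freeing the fastest tier for active workloads.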