There is so much data available today, and it’s in many, many different formats. From text files to gargantuan databases, webpages to streaming video, we’re inundated with data. Remember when it seemed impossible to fill every row in an Excel worksheet? Those days are gone.
There is an upside to data overload. Agencies rely on data to make both short- and long-term decisions that affect all of us. As citizens, we benefit by having FEMA rush to our aid after a disaster strikes, or by receiving state funding based on accurate citizenship records. The performance of many government employees is assessed based on personnel metrics, which smooths out some bias, and the military is able to make better tactical decisions by using data analysis.
And those are great examples of being data-driven. This term describes the process of making organizational decisions based on data rather than gut instincts or observation. In other words, you draw facts from data, and then use those facts to help make decisions. Many agencies are either data-driven now or heading in that direction.
Even though agencies understand the value of data, it’s still hard to manage. Not only does it come in many different formats, but it is also often not standardized. You can’t just combine data from this source with data from that source. Cleaning up data from different sources so it makes sense when analyzed together takes a fairly significant effort. For data, standardization is what makes unification possible.
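To make that cleanup effort concrete, here is a minimal sketch of what standardization can look like in practice. The field names, date formats, and sample records are hypothetical, invented for illustration: two sources describe the same kind of record, but with different column names and date conventions, and a small mapping step brings them onto one shared schema.

```python
from datetime import datetime

# Two hypothetical feeds describing the same kind of record, each with
# its own field names and date format -- typical un-standardized data.
source_a = [{"citizen_id": "001", "reg_date": "2023-07-04"}]
source_b = [{"id": "002", "registered": "07/05/2023"}]

def standardize(record, id_field, date_field, date_format):
    """Map a raw record onto one shared schema with ISO-8601 dates."""
    return {
        "citizen_id": record[id_field],
        "registered_on": datetime.strptime(
            record[date_field], date_format
        ).date().isoformat(),
    }

# Once both sources are mapped onto the same schema, they can be
# combined and analyzed together.
unified = (
    [standardize(r, "citizen_id", "reg_date", "%Y-%m-%d") for r in source_a]
    + [standardize(r, "id", "registered", "%m/%d/%Y") for r in source_b]
)

print(unified)
```

Real pipelines involve far messier data and many more fields, but the pattern is the same: agree on one target schema, then write per-source mappings into it.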
The Open, Public, Electronic and Necessary Government Data Act of 2018 (or OPEN Government Data Act) requires agencies to develop and maintain data inventories. But that doesn’t mean a catalog of all data sets will be available soon – it’s a huge job to prep and standardize all that data.
Some agencies have adopted a collection of tools to address specific data management issues, like performing backups and replicating data, among others. Over time, those agencies struggle with too many data management tools, each with its own interface, features and way of accomplishing tasks. More is usually not better.
Imagine an agency that has watched its data and workloads grow significantly in recent years. To meet its mission, the agency must be able to get insights from its data quickly, but it still uses fixed storage and mostly legacy systems, which slow down the process. It is modernizing its infrastructure, but the pace is slow. The agency also feels pressured by the growing demand for information from citizens and other agencies.
What the agency needs is technology that helps it get quicker access to, and quicker insights from, the data it relies on to make decisions. That is, technology that helps the agency get to the “decision point” faster. That’s where a data-centric architecture comes into play.
With your data in the cloud, managed by a single set of tools and highly accessible, you eliminate data silos. Not only is the cloud designed for sharing data, but you can also choose a hybrid cloud model to split your data across a public cloud and a private cloud. For example, connect an application in the public cloud with your on-premises data storage. Because modern storage platforms are optimized for hybrid cloud environments, you and your users get the best of both worlds.