
Leveraging Virtual Environments and Cloud for Data Migrations


This article is an excerpt from GovLoop’s recent guide, “IT Modernization: How Government Does IT.” Download the full guide here.

It’s no secret that public sector data is growing at exponential rates. But with governments at all levels under pressure to do more with less, many agencies are looking for ways to optimize their IT infrastructure to meet their growing storage needs.

Two driving advances in IT today are virtualization and cloud computing. Virtualization uses software to create multiple, independent environments from a single, physical hardware system, allowing organizations to optimize the use of existing computing capacity and save costs by requiring less physical hardware. Cloud computing lets different departments or agencies access shared computing resources as a service, on demand, over the internet.

In an interview with GovLoop, Paul Nguyen, Sales Engineer for DLT Solutions, discussed how virtualization and cloud computing can help organizations make the most of their current resources and the importance of a data assessment before implementing either of these solutions. DLT Solutions is a government technology provider offering a range of software and cloud solutions.

Most IT leaders recognize cloud computing as crucial to their IT modernization efforts due to its scalability, flexibility, and performance. It also provides significant cost savings, allowing agencies to pay only for what they use, and helps organizations continuously evolve their IT to meet ever-changing business needs.

But agencies won’t reap these benefits if they don’t strategically identify data before migrating it to the cloud. “There are several factors that go into moving data to the cloud,” said Nguyen. “You have to consider the time-sensitive nature of the move, the amount of data, and the complexity of the IT services that are being migrated.”

For agencies that need to maintain greater control of their IT environment but still want the flexibility and cost savings of cloud, virtualization provides an alternative, and often complementary, solution. Traditionally, when agencies needed more storage, they had to “scale up” and purchase more memory and hardware. With virtualized environments, they can “scale out” and distribute resources from a virtual storage repository as needed. Virtual environments are far more cost-effective and increase the efficiency and utilization of existing hardware.
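
To make the scale-up versus scale-out distinction concrete, here is a minimal, product-agnostic sketch in Python; the node and pool names are illustrative assumptions, not any vendor’s API.

```python
# Illustrative only: contrasts "scaling up" a single array with "scaling out"
# a virtualized storage pool by adding nodes to it.

from dataclasses import dataclass, field


@dataclass
class StorageNode:
    name: str
    capacity_tb: float


@dataclass
class VirtualStoragePool:
    nodes: list[StorageNode] = field(default_factory=list)

    @property
    def total_capacity_tb(self) -> float:
        # Capacity is the sum of every node in the pool.
        return sum(n.capacity_tb for n in self.nodes)

    def scale_out(self, node: StorageNode) -> None:
        """Add another node; existing hardware keeps serving data."""
        self.nodes.append(node)


pool = VirtualStoragePool([StorageNode("array-1", 50.0)])
# Scaling up would mean replacing array-1 with a bigger, more expensive array.
# Scaling out simply adds capacity to the shared pool:
pool.scale_out(StorageNode("array-2", 50.0))
print(pool.total_capacity_tb)  # 100.0
```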

Virtualization and software-defined solutions also allow for streamlined communication between different vendor solutions, so agencies don’t experience vendor lock-in. “Say a customer has all of their storage from one particular industry vendor,” said Nguyen. “With a virtual environment, they don’t have to stick with only that vendor. They can buy another industry vendor’s solution and then they can work together as one in a software-defined storage pool.”
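
The sketch below illustrates the general idea of a software-defined storage pool that presents arrays from different vendors as a single resource; the vendor classes and their methods are placeholders, not real product interfaces.

```python
# Minimal sketch of software-defined storage: each vendor's array sits behind a
# common interface, and a pool treats them all as one resource. Hypothetical names.

from abc import ABC, abstractmethod


class StorageBackend(ABC):
    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...


class VendorAArray(StorageBackend):
    def __init__(self) -> None:
        self.blocks: dict[str, bytes] = {}

    def write(self, key: str, data: bytes) -> None:
        self.blocks[key] = data  # stand-in for vendor A's own protocol


class VendorBArray(StorageBackend):
    def __init__(self) -> None:
        self.blocks: dict[str, bytes] = {}

    def write(self, key: str, data: bytes) -> None:
        self.blocks[key] = data  # stand-in for vendor B's own protocol


class SoftwareDefinedPool:
    """Presents heterogeneous backends as a single storage pool."""

    def __init__(self, backends: list[StorageBackend]) -> None:
        self.backends = backends

    def write(self, key: str, data: bytes) -> None:
        # Simple placement rule: pick a backend based on the key's hash.
        backend = self.backends[hash(key) % len(self.backends)]
        backend.write(key, data)


pool = SoftwareDefinedPool([VendorAArray(), VendorBArray()])
pool.write("records/2024/report.csv", b"...")
```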

While cloud and virtual environments allow organizations to scale their resources as needed to meet demand, it’s critical that agencies avoid treating them as a data “dumping ground.” Instead, they should perform a data assessment to ensure they’re storing only data of real value.

DLT and Veritas, a market leader in data management, are two providers who work with agencies to gain an understanding of what data exists in their environment. “We can look at your storage and tell where data flows, who has access to it, how long it’s been in your storage environment, and whether it’s obsolete,” Nguyen said. “We then can set up retention plans and storage lifecycle policies, so you can move data from on premises to the cloud and vice versa.” Once the assessment is complete, an agency can reduce its storage needs by getting rid of the data it doesn’t need and determining the best storage option for its mission-critical data.
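
As a rough illustration of what such an assessment might automate, the following sketch classifies files into retention tiers by last-access age; the thresholds, tier names, and directory path are assumptions for the example, not DLT or Veritas policy.

```python
# Illustrative assessment pass: group files into hypothetical retention tiers
# based on how recently they were accessed.

import time
from pathlib import Path

HOT_DAYS = 90        # recently used data stays on premises
ARCHIVE_DAYS = 365   # older data becomes a candidate for cloud archive


def classify(path: Path) -> str:
    """Return a storage tier based on the file's last access time."""
    age_days = (time.time() - path.stat().st_atime) / 86400
    if age_days <= HOT_DAYS:
        return "on-premises"
    if age_days <= ARCHIVE_DAYS:
        return "cloud-archive"
    return "review-for-deletion"


# Example walk over a hypothetical shared drive.
for file in Path("/data/agency-share").rglob("*"):
    if file.is_file():
        print(file, classify(file))
```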

For organizations ready to move away from their legacy IT systems, Nguyen advised that the first step is to get to know your data. “Make sure you test it before you actually migrate so there’s no downtime, no corruption, no loss of data, and you know ahead of time that it’s going to work, before you actually go live,” said Nguyen.
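
One simple way to act on that advice is to verify a test copy before cutover. The sketch below assumes both the legacy source and the staging target are reachable as file paths (the paths shown are hypothetical) and compares checksums to catch missing or corrupted files.

```python
# Pre-migration check: hash every source file and confirm an identical copy
# exists in the staging target before going live.

import hashlib
from pathlib import Path


def sha256(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_copy(source_root: Path, target_root: Path) -> list[str]:
    """Return relative paths that are missing or differ after the test copy."""
    problems = []
    for src in source_root.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source_root)
        dst = target_root / rel
        if not dst.exists() or sha256(src) != sha256(dst):
            problems.append(str(rel))
    return problems


issues = verify_copy(Path("/data/legacy"), Path("/mnt/cloud-staging"))
print("test migration verified" if not issues else f"{len(issues)} files need attention")
```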

Today’s rapid pace of data growth provides agencies with unprecedented insight into the needs of the citizens they serve and the missions they need to meet. With more data, however, come new challenges around organizing and storing it all. By migrating from legacy IT systems to more streamlined virtual environments, and by implementing a data management strategy with accountability around their data, governments can generate the insights they need without exponentially increasing data storage and costs.
