Protecting Your Data in a Hybrid Cloud Environment

This blog post is an excerpt from GovLoop’s recent guide, Forecasting the Cloud: Eight Ways the Technology is Changing Government. To download the full guide, head here.

Today’s IT organizations are under incredible pressure to respond to a wide variety of business and citizen needs, and there is hope that moving toward a hybrid cloud delivery model will enable them to be more responsive. Ideally, a hybrid cloud solution can simultaneously reduce costs and give public sector IT more flexibility.

But at the same time, there is considerable concern about security and about maintaining stewardship of the data that lives in a hybrid cloud. To discuss best practices for maintaining visibility into your data when operating in a hybrid environment, GovLoop sat down with Alan Dare, District Manager, USPS Cloud, NetApp.

“Organizations need to feel confident that they can integrate with the public cloud and not introduce additional risk,” Dare said. “Many traditional datacenter environments enable these IT organizations to control every single aspect of their data, from performance and security to protection and all the governance associated with it.”

But as public sector IT reaches out to the public cloud to augment its portfolio of services, its data is becoming more mobile. That makes it critical for IT organizations to be able to control the data they store in a hybrid cloud setting, Dare explained.

“One of the things NetApp has created is a data fabric to help our customers realize the promise that the hybrid cloud brings, while also maintaining stewardship of the data,” Dare said. “We developed NetApp Private Storage (NPS), which is a way for storage to be owned by the customer, yet have the data exposed through hyperscalers. What it allows the customer to do is take advantage of the compute and software features of these hyperscalers. But the organization’s data doesn’t reside inside the public cloud. It actually sits beside the public cloud, and remains solely under the control of the parent organization.”

What this enables an organization to do, Dare said, is fully take advantage of the cost model that hyperscalers offer, while maintaining stewardship and control of its datasets.

For organizations looking to continue improving best practices around data storage and control, Dare had one main piece of advice.

“The first step to the cloud is that all organizations need to establish a data stewardship policy around their data to correctly identify what data can and cannot reside inside the public cloud, and under what restrictions data can be accessed within the public cloud,” Dare said.

Having a set policy in place before starting the journey to a hybrid cloud environment helps an organization’s IT staff identify candidate datasets and processes that can easily make the transition to the cloud, as well as those that cannot, or that require higher levels of security or review.
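As an illustration of the kind of stewardship policy Dare describes, the following Python sketch tags datasets with sensitivity tiers and returns a placement decision for each. The tier names, rules, and dataset names are hypothetical assumptions for illustration, not an actual agency policy or a NetApp feature.

```python
from dataclasses import dataclass

# Hypothetical sensitivity tiers and rules -- illustrative only,
# not a real agency policy or a NetApp capability.
CLOUD_ELIGIBLE = {"public", "internal"}
REQUIRES_REVIEW = {"sensitive"}          # extra review and encryption controls
NEVER_IN_PUBLIC_CLOUD = {"restricted"}   # must stay on premises

@dataclass
class Dataset:
    name: str
    sensitivity: str  # e.g. "public", "internal", "sensitive", "restricted"

def classify_for_cloud(ds: Dataset) -> str:
    """Return a placement decision for a dataset under the example policy."""
    if ds.sensitivity in NEVER_IN_PUBLIC_CLOUD:
        return "keep on premises"
    if ds.sensitivity in REQUIRES_REVIEW:
        return "cloud allowed after security review, encrypted at rest"
    if ds.sensitivity in CLOUD_ELIGIBLE:
        return "cloud candidate"
    return "unclassified -- review before migration"

if __name__ == "__main__":
    inventory = [
        Dataset("open-data-portal-exports", "public"),
        Dataset("citizen-case-records", "restricted"),
        Dataset("internal-hr-reports", "sensitive"),
    ]
    for ds in inventory:
        print(f"{ds.name}: {classify_for_cloud(ds)}")
```

Even a simple classification like this gives IT staff a defensible starting point for deciding which workloads move first and which stay behind.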

“Many organizations fail to create data policies before jumping to the cloud, which causes delays or a fractured cloud implementation,” Dare said. “An organization’s data policy should be the first step in contemplating a hybrid cloud solution.”

Finally, Dare said that another trend gaining ground in many public sector IT departments is using the cloud for disaster recovery and continuity of operations.

“The cloud can really help organizations that either have no disaster recovery backup plans or are already advanced in their strategy,” Dare said. “For organizations that have limited disaster recovery, the cloud offers the ability to provide offsite backup without a substantial capex investment. Datasets can be mirrored offsite and tested in the cloud.”

Dare went on to explain how NetApp can help with disaster recovery through its SteelStore appliance.

“We can drop SteelStore into a customer’s site, and without any significant changes, they can do a quick backup, and the device will transparently compress, deduplicate, and encrypt the data, and push it off into the cloud for them,” Dare said. “This means if you’ve got data that’s on premises in a physical NetApp device, we can mirror it into the cloud. So if you have a disaster, you can fail over to the cloud and have the same protocols and services that you would on site.”
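To make the compress-encrypt-upload workflow Dare describes more concrete, here is a minimal Python sketch of pushing a backup to S3-compatible object storage and reading it back. This is not NetApp’s or SteelStore’s implementation; deduplication is omitted, and the bucket name, key handling, and function names are assumptions for illustration only.

```python
import zlib

import boto3
from cryptography.fernet import Fernet

# Hypothetical bucket name -- replace with your own.
BUCKET = "example-dr-backups"
# In practice, the encryption key would live in a key vault, not in code.
KEY = Fernet.generate_key()
fernet = Fernet(KEY)
s3 = boto3.client("s3")

def backup_to_cloud(local_path: str, object_name: str) -> None:
    """Compress and encrypt a local file, then push it to object storage."""
    with open(local_path, "rb") as f:
        raw = f.read()
    compressed = zlib.compress(raw)          # shrink the payload before transfer
    encrypted = fernet.encrypt(compressed)   # encrypt before it leaves the site
    s3.put_object(Bucket=BUCKET, Key=object_name, Body=encrypted)

def restore_from_cloud(object_name: str, local_path: str) -> None:
    """Pull, decrypt, and decompress an object -- the failover read path."""
    obj = s3.get_object(Bucket=BUCKET, Key=object_name)
    decrypted = fernet.decrypt(obj["Body"].read())
    with open(local_path, "wb") as f:
        f.write(zlib.decompress(decrypted))
```

The point of the sketch is the ordering: data is compressed and encrypted on site, so only protected copies ever reach the cloud, and the same path can be run in reverse during a failover test.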

In short, NetApp is building a data fabric around the hybrid cloud to enable customers to move the data back and forth in any situation. By focusing on data stewardship and control of data, the public sector can succeed in a hybrid cloud environment.
