NOAA Takes Next Steps Towards Virtualized Data Centers

Originally posted on Federal Technology Insider:

The National Oceanic and Atmospheric Administration (NOAA) is laying the groundwork to take virtualization to the next level with plans to create a virtual data center that provisions compute and storage capabilities using a combination of private and public cloud computing resources.

The goal is coupled with NOAA’s objective to consolidate more than 100 data centers, in part to meet the government’s broader Federal Data Center Consolidation Initiative, but also as a necessary step to meet future demands for data.

To get there, “NOAA has embarked on a long-term plan to reduce over 150 designated data centers to a much smaller number of core facilities by 2018 that will make up a private cloud,” said NOAA Deputy CIO Zachary Goldstein.

That number does not include 121 data facilities located at Weather Forecasting Offices, or at satellite downlink stations around the world, which technically meet the Office of Management and Budget’s definition of a data center, but will remain apart from NOAA’s consolidation initiative for now. Goldstein noted that NOAA “can’t just go in and take those out (and consolidate them) because they have complex backup issues and have systems that tie them together.”

At the same time, NOAA is in the process of acquiring public cloud services as an initial step that will reduce the need for NOAA-dedicated capacity, Goldstein said. NOAA is looking to issue a cloud computing services award this fiscal year for NOAALink Public Cloud, which is expected to provide existing NOAA users with one or more “ready-for-use” cloud services landing pads, according to a recent presentation by Tim Howard, NOAA’s FDDCI project leader.

With these moves, NOAA is joining the forefront of agencies that are adopting a cloud service broker strategy, creating a virtual data center through a combination of private and public cloud services, including application and data hosting, and centralized provisioning. The approach will not only streamline provisioning but also make the true costs of enterprise computing more visible.

Virtualization will continue to play a pivotal role in NOAA’s efforts to provide enterprise computing, storage, archiving, and related infrastructure services, as well as in its efforts to virtualize legacy applications and services for its customers. While NOAA is perhaps best known for the National Weather Service, it also supports the National Environmental Satellite, Data, and Information Service, the National Marine Fisheries Service, the National Ocean Service, the Office of Oceanic and Atmospheric Research, and a vast network of users who depend on NOAA’s data. And the amount of data used for modeling and other work has grown enormously.

Collectively, the volume of data stored across NOAA’s data centers (excluding backup data) is expected to exceed 18,350 terabytes this year. That figure is expected to triple by 2018 and grow by another 50%, to more than 90,000 terabytes, by 2020, according to NOAA projections.

Beyond the data, however, NOAA must also contend with a vast range of legacy operating system instances – more than 15,120 by NOAA’s count. About 8.1% of them are classified as virtual operating systems, and NOAA currently averages 2.19 virtual operating systems per virtual host.
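As a quick sanity check, a back-of-the-envelope calculation using only the figures reported above gives the implied number of virtual OS instances and virtual hosts in NOAA’s inventory (rounded, since the article reports approximate percentages):

```python
# Figures as reported in the article; results are rough estimates only.
total_os = 15_120          # legacy operating system instances, by NOAA's count
virtual_share = 0.081      # ~8.1% classified as virtual operating systems
vms_per_host = 2.19        # average virtual OS instances per virtual host

virtual_os = total_os * virtual_share          # implied virtual OS instances
virtual_hosts = virtual_os / vms_per_host      # implied virtual hosts

print(f"Implied virtual OS instances: ~{virtual_os:,.0f}")   # ~1,225
print(f"Implied virtual hosts:        ~{virtual_hosts:,.0f}")  # ~559
```

In other words, only a small fraction of NOAA’s operating system footprint was virtualized at the time, which underscores Goldstein’s point that the consolidation effort has considerable runway.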

“A private cloud is more than virtualization,” however, cautioned Gartner analyst David Cleary, speaking at the Federal Cloud Computing Summit in Washington in May.

“Virtualization is an important enabler, but it is the higher-level services,” such as access management, service management, and service optimization, that IT departments really need to be thinking about as part of managing a cloud platform and its resources.

“You in IT,” said Cleary, referring to agency IT leaders, must recognize the emerging need to become “a cloud services broker to your stakeholders.” He added that IT departments will increasingly need to focus on managing hybrid clouds and on finding ways to extend security, management, and governance capabilities across the combination of private and public clouds that will inevitably handle agency IT workloads.

It’s a future in which NOAA has already gained experience.

NOAA was among the first federal agencies, for instance, to move all of its email networks to the cloud, consolidating 19 email systems and moving 25,000 email accounts to the cloud in December 2011.

NOAA’s Geophysical Fluid Dynamics Laboratory (GFDL), meanwhile, has been working with CSC and SGI to develop a tiered virtualization storage solution to address GFDL’s massive data storage requirements. GFDL, which models atmospheric, oceanic, and climate behavior, routinely moves 100 terabytes of data per day or about 30 petabytes of data per year, and is considered one of the world’s most active data repositories. With that volume expected to triple in the next three years, virtualization is expected to be a critical part of the agency’s efforts to keep up with future demands.
