Open source software discussion in Gov, to include: free, libre, open source software; creation of OSS by the government; open standards
Open Source Software and NIST
July 21, 2009 at 6:42 pm #76191
IMO an interesting blog posting where the author set out to address the cloud issue in the Federal arena and spent almost as much time discussing open source software, especially as it relates to the NIST mindset
END OF COMMENTARY
U.S. Agencies Think About Establishing Cloud Nodes
Posted by Charles Babcock, Jul 17, 2009 06:15 PM
Tim Grance, program manager for cyber and network security at the National Institute of Standards and Technology (NIST), says standards are essential to cloud computing. And among those standards must be additional standards for moving virtual machines from cloud to cloud, something we still lack.
Grance is one of the federal government's thought leaders on cloud computing. He and Peter Mell, chief scientist at NIST, together author papers on what cloud computing is about and what it requires. A number of federal agencies, such as the Defense Information Systems Agency, NASA and the General Services Administration, are beginning to think in terms of each owning a node in the cloud, built to a common standard, which would become part of a general cloud infrastructure--Uncle Sam's Cirrus or Cirrocumulus.
This is actually a brilliant idea, one that may anticipate where computing is going and provide a model for it ahead of what any private enterprise could do by itself. The Federal government spends $70-$80 billion on IT each year, says Grance. It should do so in a way that best serves its citizens. He is not advocating that the federal government impose cloud standards on the rest of the economy--quite the contrary.
Rather, he thinks federal agencies, such as NIST, must steer a debate over a minimal, reference cloud implementation, and let critics fight it out over the effectiveness of that implementation. If a better cloud architecture emerges, it will speed cloud adoption throughout the economy.
"One of our key metaphors around here is a reference implementation--create a common vocabulary around a reference implementation," he said. If the reference implementation proves itself and passes muster, then federal agencies could proceed, each with its own full implementation. The reference implementation would assure that what worked in one agency's cloud would probably work in another's, even if all the details weren't the same.
Somewhere down the road, the whole federal Cirrocumulus would be greater than the sum of its parts.
Grance declined to take too many sharply defined stands on what a reference implementation should look like, as if he had lots of experience with well-meaning government scientists colliding with the buzz saw of private commercial interests. It will be through a partnership of public and private interests that the federal government gets into cloud computing, although he acknowledged that $70-$80 billion of IT spending lining up behind the same standards and options helps ensure those alternatives get produced.
In addition, he doesn't advocate either an all-open-source cloud or a heavily proprietary model. "We embrace use of both proprietary and open source software. We try not to take a public stand on which is better," he said.
"We certainly support and appreciate the use of open source code," he said, but lest he veer too close to the edge of the road, he added, "We also think people have to make money."
"People should not be driven by ideology but by what is the best, straight-forward business case for doing things a certain way. Can I deliver a service better to citizens this way?" he said.
Within any set of emerging cloud standards there will have to be additional standards in the field of virtualization. Grance cited the DMTF's Open Virtualization Format, or OVF, as a first step in that direction. When a workload is cast in OVF, it can be sent over the Internet to any cloud running one of the major hypervisors. VMware's ESX Server, Microsoft's Hyper-V and Citrix Systems' XenServer all recognize OVF and convert an incoming workload into their own preferred file format to be run under their own hypervisor.
Establishing OVF was a big step forward by competing vendors, but it left open the question of how a user would extract a workload from a cloud, now in its preferred hypervisor's format, and ship it to another cloud, where a different hypervisor will refuse to recognize it. In other words, the vendors are not volunteering to convert the workload back into OVF or, better yet, a neutral runtime format. Why would they? That would make it easy for the customer to move to a competitor.
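To make the OVF discussion above concrete, here is a minimal sketch in Python of what the portable half of the picture looks like: an OVF descriptor is just XML that any party can parse to discover what files (disk images, etc.) a packaged workload carries. The descriptor below is an illustrative fragment only, not a complete or schema-valid OVF package (a real .ovf also has DiskSection, NetworkSection, and VirtualHardwareSection elements, and namespace-prefixed attributes); the element names follow the DMTF OVF 1.x envelope schema.

```python
import xml.etree.ElementTree as ET

# Namespace used by the DMTF OVF 1.x envelope schema.
OVF_NS = "http://schemas.dmtf.org/ovf/envelope/1"

# A stripped-down, illustrative OVF descriptor. A real descriptor also
# declares disks, networks, and virtual hardware for each VirtualSystem.
DESCRIPTOR = f"""\
<Envelope xmlns="{OVF_NS}">
  <References>
    <File href="demo-disk1.vmdk" id="file1"/>
  </References>
  <VirtualSystem id="demo-vm">
    <Name>demo-vm</Name>
  </VirtualSystem>
</Envelope>
"""

def list_referenced_files(descriptor_xml: str) -> list[str]:
    """Return the file references (e.g. disk images) named in an OVF envelope."""
    root = ET.fromstring(descriptor_xml)
    refs = root.findall(f"{{{OVF_NS}}}References/{{{OVF_NS}}}File")
    return [f.get("href") for f in refs]

print(list_referenced_files(DESCRIPTOR))  # → ['demo-disk1.vmdk']
```

Reading the envelope is the easy, standardized direction; the gap the article describes is the reverse trip, turning a running workload in a hypervisor's native format back into a neutral package like this one.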
Grance isn't saying the federal government will address this problem if commercial interests don't--you can almost hear that saw spinning in the background. But he does say, "The future of cloud computing is much brighter if we embrace a certain amount of interoperability, both through APIs and virtualization."
Ease of movement between clouds might create some limited vendor exposure, but it would also create the opportunity of speeding cloud adoption and use. Instead of worrying about encroachment, why not "grow the total pie?" he asks.
"One of our important charges is to enable portability. Users have got to be able to migrate that data around. At the same time (the setting of standards) has to be done in a constructive way in a dialogue between the government, business and customers," he said.
It's just as well the government can't dictate a set of cloud standards. Would it get it right the first time? Probably not.
But a reference implementation of a cloud node that the government plans to build as a first step toward a federal cloud--that would speak volumes about what's needed to succeed in future cloud computing. Such a reference implementation, one way or the other, is likely to include a way to both import and export virtual machines from the cloud.
Vendors with good ideas on how to do that should not be hiding their light under a bushel. If for some reason they feel compelled to hold back, there's little reason for NIST and other government agencies to do the same. In a free enterprise economy, users will be able to figure out that what's good enough for the Defense Information Systems Agency might be good enough for their information systems.