CTO Commentary on the Intel Sections of Senate Report 112-73: National Defense Authorization Act for FY2013

Congress has posted an online report with a long title of:


Section C of this report is titled:


This section is particularly good reading this year. I copy the report text below (from the Library of Congress Thomas site), and then add in a bit of in-line commentary. Look for my comments in the form of [CTO Note: ______]


Authority to provide geospatial intelligence support to security alliances and international and regional organizations (sec. 921)

The committee recommends a provision that would extend the authority of the National Geospatial Intelligence Agency to provide imagery intelligence and geospatial information support to security alliances and international organizations of which the United States is a member, and to regional organizations with defense or security components. This provision was requested by the administration.

Army Distributed Common Ground System (sec. 922)

The committee recommends a provision that would direct the Secretary of the Army to assign oversight of the DCGS-Army cloud acquisition effort to the Army’s Chief Information Officer (CIO)/G-6. The provision would require the CIO to conduct an audit of the program and provide an assessment and recommendations to the Secretary of the Army and Chief of Staff of the Army by December 1, 2012.

[CTO Note: There are some really good people in Army CIO, and no one in DCGS-A should fear an audit like this. ]

In mid-2010, the Deputy Chief of Staff for Intelligence for U.S. Forces-Afghanistan (USFOR-A), signed a Joint Urgent Operational Needs Statement (JUONS) requesting the rapid fielding to all echelons in Afghanistan of a commercially available analyst toolkit, which he had evaluated, that he believed would dramatically improve intelligence analysis in theater.

[CTO Note: This is a famous story to insiders. One of the greatest intelligence heroes in the community saw something that looked incredibly cool, but there were many others who held different views. I certainly don’t want to weigh in on that fight, but I will observe that honest patriots in uniform in the Army disagree strongly over what to do here. ]

The Army at the time was accelerating the development, with the National Security Agency, of a cloud-computing system for the Distributed Common Ground System (DCGS) that the Army G-2 believed would transform intelligence analysis capabilities. Up to this point, the Army had been seriously considering integrating the commercial toolset requested in the USFOR-A JUONS with the DCGS cloud system. The thought was that, while the Army cloud was rapidly maturing the ability to ingest, correlate, store, and distribute vast amounts of disparate data, the effort to produce new types of analyst tools for data query, manipulation, and display was lagging. Commercial analyst tools, in contrast, provided rich analyst support, but were perceived to lack the large-scale data handling ability of cloud architectures.

[CTO Note: I also heard this story, that the toolset picked did not handle large quantities of data and could not do analysis over data sizes needed. ]

Ultimately, however, the Army decided not to support the USFOR-A JUONS by acquiring the requested commercial toolset, despite its ability to complement the Army cloud program. Instead, the Army asserted that its own analyst tool development program would be ready as fast as the proposed commercial product deployment, and would provide equal capabilities. This decision generated much controversy within the Department of Defense, and concern in Congress, but the Army was given the opportunity to prove that it could deliver the promised capabilities. In the 2 years that have ensued, the Army periodically re-examined the option of integrating multiple commercial front-end analyst tools (such as Analyst Notebook, Palantir, Centrifuge, Semantica, etc.) into its cloud architecture, but has always elected to stick with its internal development.

[CTO Note: The US Army has some very hard things to do when it comes to IT, and few commercial tools will be up to the task, in my opinion. Those that are picked need to have maximum interoperability and need to scale beyond the sizes that commercial firms have their tools scaling to. And the Army needs things done for a reasonable price. By the way, a tool left off this list is Thetus and their Savanna tool. Also, Recorded Future has an awesome visualization capability that has just blown me away, and it now needs to be viewed as a candidate for this sort of capability, in my view.]

Meanwhile, the Marine Corps and even some Army units in Afghanistan proceeded to deploy commercial products. Overall, the feedback from these units and an independent assessment by the Deputy Secretary of Defense-chartered National Assessment Group has been very positive on these commercial products. Unfortunately, the Army cloud’s analyst support appears to continue to lag behind promised performance. In testimony to Congress in late 2011, the Army indicated that only 115 analysts in Afghanistan are using the Army’s DCGS cloud analyst tools, despite years of development and considerable expense.

[CTO Note: I love the Marine Corps. But I would never suggest that the IT they pick for their missions should be the IT picked for Army missions.]

The committee lacks confidence that the three groups trying to jointly manage the Army’s DCGS modernization–the G-2’s office, the Intelligence and Security Command, and the Intelligence and Information Warfare Directorate of the Communications-Electronics Research and Development Center–are going to deliver a fully capable, end-to-end system to support the warfighter on an acceptable schedule and cost.

[CTO Note: Maybe aligning organizational authorities would be the best approach, but maybe not. Reorganizing might just slow down fielding of capabilities. ]

Rationalization of cyber networks and cyber personnel of the Department of Defense (sec. 923)

The committee recommends a provision that would require network consolidation and re-design to free up personnel to achieve an appropriate balance between U.S. Cyber Command’s defensive and offensive mission capabilities. In the event that the rate at which personnel are freed up from network consolidation is insufficient, or if the personnel available are not able to meet the requirements for supporting Cyber Command’s offensive missions, the provision would require the Secretary of Defense to take appropriate action to provide qualified personnel in the required timeframe.

[CTO Note: Budgets are being reduced and reportedly force size is being reduced, so it makes sense that networks can be consolidated. Technology has advanced to the point where there are many secure technologies that can help get this done.]

General Alexander, the Commander of U.S. Cyber Command, in speeches, testimony to the committee, and within the Department of Defense (DOD) has declared that DOD networks are not defensible due to the proliferation of sub-networks, each with its own security barriers, which prevents visibility and control by commanders. Although the committee cannot substantiate the claim that there are `15,000′ such sub-networks, there is no dispute that there are far too many such enclaves with features that today hinder rather than promote information security.

[CTO Note: I haven’t counted the networks in DoD, but 15,000 sounds about right. It was around 10,000 when I was at JTF-CND in 1998.]

General Alexander’s testimony also confirmed that the personnel assigned to Cyber Command and its components are overwhelmingly allocated to network management and defense. A small percentage of the workforce attends to the Command’s offensive missions and responsibilities. General Alexander confirmed that this ratio reflects an imbalance in capabilities and must be rectified.

General Alexander and others in DOD agree that both issues could be at least partially rectified by dramatically reducing the number of separate network enclaves in the Department, which should yield significant manpower savings, and by re-training and re-assigning that manpower to support offensive missions.

[CTO Note: This is not a technical issue. This can be done and some savings generated.]

In the past, DOD sought to secure information and regulate access to information by controlling access to the network itself. DOD rules encouraged or even required organizations to erect access and security barriers as a condition for connecting to the backbone network. The result is a proliferation of `virtual private networks’ with firewalls and intrusion detection systems, and administrators and analysts to manage and protect them. Desktops and servers behind those barriers are hidden from view and from management. In addition to hampering the work of Cyber Command, this network balkanization makes it hard to share information, to collaborate, and to access common enterprise services.

Network rationalization and the use of identity- and attribute-based access controls should enable improved performance, better security, and more efficient use of personnel.
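
The shift described above can be sketched in code. The following is a minimal illustration (all names, clearance levels, and attributes are hypothetical, not any DOD schema) of an identity- and attribute-based access decision, where what a user may see depends on who they are rather than on which network enclave their request comes from:

```python
# Illustrative sketch (not DOD policy): an attribute-based access
# decision. Access depends on the user's identity attributes, not on
# which network enclave the request originates from.
from dataclasses import dataclass

# Hypothetical clearance levels in ascending order of privilege.
LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

@dataclass(frozen=True)
class User:
    name: str
    clearance: str                      # e.g. "SECRET"
    attributes: frozenset = frozenset() # e.g. {"intel", "afghanistan"}

def can_access(user: User, classification: str, required: set) -> bool:
    """Grant access when the user's clearance dominates the resource's
    classification and the user holds every required attribute."""
    return (LEVELS.index(user.clearance) >= LEVELS.index(classification)
            and required <= user.attributes)

analyst = User("analyst1", "SECRET", frozenset({"intel", "afghanistan"}))
print(can_access(analyst, "SECRET", {"intel"}))       # True
print(can_access(analyst, "TOP SECRET", {"intel"}))   # False: clearance too low
```

With controls like this enforced per request, a desktop no longer needs to hide behind an enclave firewall to be protected, which is exactly what makes it visible and manageable to commanders.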

Next-generation host-based cyber security system for the Department of Defense (sec. 924)

The committee recommends a provision that would require the Department of Defense (DOD) Chief Information Officer and the Under Secretary of Defense for Acquisition, Technology, and Logistics to develop a strategy to acquire next-generation host-based cybersecurity tools and capabilities, and provide that strategy to Congress in conjunction with the budget request for fiscal year 2015.

The DOD is now completing deployment of the Host Based Security System (HBSS) that provides cybersecurity capabilities for millions of endpoint or host computing devices across the Department. This deployment took a long time, cost significantly more than expected, and proved to be a complicated and very difficult undertaking.

This experience has instilled a `never again’ attitude among DOD’s cybersecurity leadership regarding enterprise-scale endpoint security solutions. Instead, DOD appears to be hoping that network-level security enhancements will not only overcome the weaknesses in the current HBSS, but enable the Department to meet future cybersecurity threats. The Commander of U.S. Cyber Command has also publicly characterized endpoint security for today’s `thick-client’ desktop, laptop, and mobile device computers as an exercise in futility on the grounds that it is impossible to coherently manage, monitor, and update millions of such devices distributed across the globe. In addition, the Commander of U.S. Cyber Command points out that the Command currently does not even have visibility or the ability to directly control those endpoints because (as noted elsewhere in this report) they are typically `hidden’ behind enclave firewalls and other security devices. The Commander argues publicly that `endpoint security’ will be viable only in a cloud environment where endpoint computing is virtualized and provisioning and control are centralized and fully automated.

The HBSS system itself is fundamentally based on anti-virus technology, which works only when the signature of an attack is already known. This requires that HBSS-protected computers store a large and ever-growing file of malware signatures. The communications load necessary to keep HBSS up-to-date in the field in some circumstances forces commanders with low communications capacity to choose between operating HBSS and performing their tactical missions, which results in HBSS being turned off.
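
A minimal sketch of why signature-based detection struggles: matching is exact, so any morphing of the payload yields an unknown signature that the stored file cannot catch. (The sample bytes and hashing scheme here are invented for illustration; HBSS itself is far more elaborate.)

```python
# Illustrative sketch: signature-based detection matches exact
# fingerprints of known malware, so a trivially morphed variant
# (here, one appended byte) evades the entire signature store.
import hashlib

signatures = set()  # stands in for the large, ever-growing signature file

def add_signature(sample: bytes) -> None:
    """Record the fingerprint of a known-bad sample."""
    signatures.add(hashlib.sha256(sample).hexdigest())

def is_known_malware(payload: bytes) -> bool:
    """Detection succeeds only on an exact fingerprint match."""
    return hashlib.sha256(payload).hexdigest() in signatures

malware = b"\x4d\x5a\x90\x00evil-payload"
add_signature(malware)

print(is_known_malware(malware))            # True: exact match
print(is_known_malware(malware + b"\x00"))  # False: one changed byte evades it
```

The behavior-based tools the committee mentions avoid this trap by watching what code does rather than what it hashes to, which also removes the need to push ever-larger signature files over constrained tactical links.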

The committee is concerned that these views are short-sighted from multiple perspectives. The reality of constant, rapid malware morphing and the growing use of encryption (for example, the Hypertext Transfer Protocol Secure) suggest that endpoint security solutions will remain essential. Further, industry is now rapidly developing and marketing endpoint security solutions that do not rely on signatures and potentially could vastly reduce the `overhead’ associated with HBSS. Some of these technologies enable `discovery’ of previously unseen cyber threats at the endpoint, which could transform host computers into a much improved sensor grid for the enterprise.

Moreover, as noted elsewhere in this report, DOD must soon rationalize its networks to reduce the number of segmented enclaves, which should extend visibility and easier provisioning and control to endpoints. Finally, although the Commander of U.S. Cyber Command is promoting a fast transition to a cloud-hosting environment, it may take a long time to replace millions and millions of desktops, laptops, mobile devices, and other distributed computers with thin-client devices served exclusively from the cloud.

This provision would require the strategy to address the limitations of the current HBSS system, and to include exploitation of emerging technologies for behavior-based threat detection, insider threat detection, data loss prevention, dynamic encryption of data-at-rest, remediation following infections, and continuous monitoring and configuration management. The provision encourages building on the integration framework featured in HBSS to enable `plug-and-play’ of tools into an open architecture. This capability would permit constant insertion of new, competitively developed tools, as well as tailored or non-standard deployments.

[CTO Note: This approach calls out for the smart application of solutions from four companies I advise: Cloudera, Triumfant, Invincea, Fixmo. The world needs these guys, especially DoD. ]

Improvements of security, quality, and competition in computer software procured by the Department of Defense (sec. 925)

The committee recommends a provision that would mandate multiple actions to improve the security and quality of computer software code, and enhance the ability of the Department of Defense to compete software maintenance and upgrades.

The Department of Defense (DOD) suffers from constant cyber attacks. A significant route for intruders is through vulnerabilities in the software that DOD paid contractors to develop, which often includes integration of custom software code with commercial software packages.

Cybersecurity in important respects begins with secure software. DOD’s software is riddled with common and easily preventable vulnerabilities. DOD then spends enormous resources trying to protect that code from being exploited, discovering when it is compromised, and dealing with the consequences of infections. If the code were written properly in the first place, defending DOD information systems would be much easier.

DOD needs to insist that the code it pays for be secure, and to implement concrete methods and means to achieve that objective. One method for improving code security is to use automated test tools to identify vulnerabilities and to remediate them in development. Most current software code vulnerability testing tools work only on source code. To ensure maximum access to source code for quality and security analysis, DOD needs to exercise its legal rights to technical data.

The government has other strong interests in exercising these data rights, including software re-use and competition for system maintenance, upgrade, and capability insertion. DOD has demonstrated dramatic savings by forcing open complex software-based systems to competition. But achieving success is difficult, and software re-use historically has not been very successful. The committee believes that modern approaches to software repositories, open architecture design, advanced tools for analyzing and understanding software code, improved software development standards and processes, and collaborative software development environments have the potential to significantly improve cybersecurity and increase competition.

This provision would require the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)), in coordination with the Chief Information Officer (CIO) to sponsor a major update and improvement to capability maturity models for development and acquisition. The goal of this effort would be to sharpen the best practices for software development and to provide significantly better means to measure whether organizations are actually implementing the processes and practices at the level expected for the maturity level at which they have been assessed.

The provision also would require USD(AT&L) and the CIO to solidify requirements for secure software development practices and standards. Chief among these is the need to require, rather than just `highly recommend,’ the use of static software analysis tools in the development process, and in evaluating the code in development and operational tests. These tools are widely available commercially, but are rarely used even though they have proven capabilities to detect a large percentage of the most commonly exploited vulnerabilities. It is true that these tools generate a large number of false positives in order to achieve the lowest possible rate of false negatives. But high false positive rates are not a significant problem in the development stage, when the developer can build and test incrementally. U.S. Transportation Command and the Military Health Service already have successful experience with this approach.
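
As a toy illustration of what static analysis does, the checker below walks a program’s syntax tree and flags calls to functions known to be dangerous. Commercial tools apply thousands of such rules plus data-flow analysis; this sketch names the technique, not any particular product, and its rule set is deliberately tiny.

```python
# Illustrative sketch of a static analysis rule: walk the abstract
# syntax tree of source code (without running it) and flag calls to
# eval/exec, a classic injection vector. Real tools ship thousands
# of such rules and trace how untrusted data flows into them.
import ast

DANGEROUS_CALLS = {"eval", "exec"}

def find_dangerous_calls(source: str) -> list:
    """Return (line, function-name) for each call to a flagged function."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in DANGEROUS_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

sample = "user_input = input()\nresult = eval(user_input)\n"
print(find_dangerous_calls(sample))   # [(2, 'eval')]
```

Because the scan runs on source in development, a finding like this costs minutes to fix; the same flaw found in fielded code costs an incident response.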

The Defense Information Systems Agency’s (DISA’s) Security Technical Implementation Guide (STIG) requires the use of coding standards, but does not specifically require secure coding standards, and the references listed in the STIG are best characterized as style guides rather than coding standards. This provision would require that the STIG be modified accordingly, and that appropriate guidance be developed for requiring the use of secure software coding plans in statements of work.

This provision would require the USD(AT&L) and the CIO to develop appropriate guidance for program managers to require evidence that software developers are actually conforming to secure coding standards, best-practice development models, and vulnerability testing requirements, and, in so doing, make appropriate use of software code assessment centers that already exist or could be created in government organizations, federally funded research and development centers (FFRDC), and contractor facilities.

This provision would require investigation of additional means to incentivize the development of secure, high-quality software through the imposition of liabilities, warranties, and liability protections. The committee is aware of commercial software companies that are now offering defect-free warranties for software code bases up to 600,000 lines of code. The professional judgment of the Software Engineering Institute, a FFRDC funded by DOD, is that writing high-quality, secure code should not cost any more than the vulnerable code the government is buying today. It simply requires competence and diligence.

Finally, the provision would require DOD to designate software repositories and collaborative software development environments, such as those existing today in DISA’s Forge.mil and at U.S. Transportation Command. The objective is to expand the use of such facilities to store code that the government owns or to which it has use rights; to enable program offices to use common development infrastructure instead of buying new equipment for every new program; to foster collaboration and the use of proven software development practices; and to enable program managers and prospective bidders to discover code for re-use, to assess its quality and security, or to research code characteristics and functionality to compete on upgrades, capability insertions, and maintenance.

The committee considers it critically important for DOD to provide these repositories’ users with access to advanced software reverse engineering and analysis tools and to static and dynamic vulnerability testing capabilities that work both on source code and code binaries. As noted previously, there are many capable commercial tools available for source code analysis. Industry is working on binary code analysis tools, but these are not yet mature or comprehensive in functionality. NSA has devoted much effort to software reverse engineering capabilities to support its signals intelligence mission that run the gamut from source code to machine code, and from functionality analysis to vulnerability detection. Depending on maturity, scalability, and ease of use, these National Security Agency (NSA) tools could be extremely useful to the DOD acquisition community to improve not only software security, but also software re-use and competition by enabling program managers and prospective bidders to better understand code.

The committee directs the USD(AT&L), the CIO, and the Director of NSA to carefully assess the tools that NSA has and is developing and determine what could be utilized in the DOD acquisition system. The Intelligence Advanced Research Projects Agency is also developing advanced machine code analysis capabilities that DOD should carefully track.

The importance of better automated code analysis to the cyber mission is clear. The Clinton administration’s Critical Infrastructure Protection Plan called for the development of advanced tools for code vulnerability analysis. In 2003, the NSA Information Assurance Director testified before the Subcommittee on Cybersecurity, Science, and Research and Development of the Select Committee on Homeland Security of the House of Representatives that `a significant cybersecurity improvement over the next decade will be found in enhancing our ability to find and eliminate malicious code in large software applications . . . There is little coordinated effort today to develop tools and techniques to examine effectively either source or executable software. I believe that this problem is significant enough to warrant a considerable effort.’

Yet DOD has never invested in tool development. The committee directs the USD(AT&L) to evaluate the state of the art in commercial and government tools for automated software code analysis and reverse engineering, and determine whether DOD should initiate a focused tool development program. The committee requests a briefing on this evaluation at the time of the submission of the budget request for fiscal year 2014.

[CTO Note: I endorse all the above, but we are missing a major point: DoD needs to update its software. Until it does, it has no business complaining about the code it is running. Don’t blame industry if you are running a 14-year-old operating system.]

Competition in connection with Department of Defense data link systems (sec. 926)

The committee recommends a provision that would require the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)):

(1) To develop an inventory of all data links in use and in development in the Department of Defense;

(2) To conduct a business case analysis of each data link program and make a determination whether there is adequate competition in development, maintenance, upgrade, and new procurement, and if not, whether the program should be opened up to competition;

(3) For each data link program that is identified for increased competition, to develop a plan that addresses how any policy, legal, programmatic, or technical barriers to competition will be overcome; and

(4) For each program where competition is determined to be inadvisable, to prepare a justification for that conclusion.

The committee expects the Under Secretary to develop a working definition of the term `data link systems’, consistent with existing understanding, for the purpose of identifying systems to be included in the inventory required by this provision.

In conducting the business case analyses, this provision would require the USD(AT&L) to solicit the views of industry on the merits and feasibility of opening up programs to competition. This provision would require the USD(AT&L) to provide a report to Congress in conjunction with the submission of the fiscal year 2015 budget request. Finally, the provision would require the Comptroller General to provide an assessment of the report to Congress.

Integration of critical signals intelligence capabilities (sec. 927)

The committee recommends a provision that would require the Director of the Intelligence, Surveillance, and Reconnaissance (ISR) Task Force to develop a plan to integrate multiple technical signals intelligence (SIGINT) capabilities together to satisfy requirements to detect, identify, track, and precisely locate communications equipment from airborne platforms. Multiple program offices in multiple organizations are working on this problem, and each of them has developed technology with separate industry partners that solve a piece–but only a piece–of the overall problem.

One of the programs provides full identification but inaccurate geolocation and no tracking. Another provides partial identification and precise location, but only sporadic tracking. A third approach provides tracking and precise location but no identification. The Defense Advanced Research Projects Agency and the Army are working on one approach with one vendor, another part of the Army is pursuing another with a second company, and the Air Force is exploiting technology developed by Massachusetts Institute of Technology Lincoln Laboratory.

ISR Task Force staff are aware of the need for an integrated capability, and understand that a joint technical or operational solution is needed if none of the alternatives can be readily modified to achieve the ability to detect and prosecute high-value targets. This SIGINT capability would be able to instantly cue an imaging sensor, such as a full-motion video or wide-area motion imagery camera, to enable persistent tracking.

The provision recommended by the committee would require the separate services and organizations sponsoring the different but complementary approaches to support the ISR Task Force Director in developing the plan. Finally, the provision reiterates the mission and role played by the ISR Task Force in identifying and overseeing ISR initiatives in support of deployed forces.

Collection and analysis of network flow data (sec. 928)

The committee recommends a provision that would require the Department of Defense (DOD) Chief Information Officer (CIO), in coordination with the Under Secretary of Defense for Intelligence (USDI), and the Under Secretary of Defense for Policy to take advantage of the research and development activities and capabilities of the Community Data Center (CDC) managed by the Defense Information Systems Agency (DISA) to enhance DOD’s capabilities to collect, analyze, and store so-called network flow data records. The purpose of the provision is to improve DOD’s capabilities to handle its own voluminous flow data records, and to potentially make this technology available for the defense of the country voluntarily through the Tier 1 Internet Service Providers (ISPs).

In a report to Congress required by section 934 of the Ike Skelton National Defense Authorization Act for Fiscal Year 2011 (Public Law 111-383), the Secretary of Defense stated that `the Department recognizes that deterring malicious actors from conducting cyber attacks is complicated by the difficulty of verifying the location from which an attack was launched and by the need to identify the attacker from among a wide variety and high number of potential actors.’ This attribution problem stems from lack of visibility into events and traffic taking place on the Internet.

In speeches and testimony to the committee, General Alexander, Commander of U.S. Cyber Command, has clearly stated that the government is not going to be monitoring the communications entering, leaving, or traversing the United States to detect cyber attacks. The committee concurs that such monitoring would be inappropriate and, in many cases, unlawful. However, General Alexander has also indicated clearly his conviction that, in the event of a nation-state attack on the country that Cyber Command is called upon to defeat, it is essential that Cyber Command be able to `see [the attack] in time to stop it.’ If Cyber Command cannot see the attack developing, `we’re not going to block it.’ General Alexander has indicated that the government will rely on the private sector–including the ISPs–to help provide that detection capability and warning.

The problem is that the ISPs currently do not collect the data necessary to reliably perform this type of early warning and attack assessment function. In traditional telephone service, the service providers collect and retain records of every call that is made–the `to/from,’ the time of the call, its duration, and the like. Under Federal Communications Commission rules, such call data records are comprehensively collected and retained for a long period.

In the case of the Internet, the analogue to the call data record is network flow data–for example, the `to/from’ of an email, the time it was sent, its size, and so forth. However, in contrast to telephony, the ISPs do not routinely or comprehensively collect or keep this net flow data, because there has never been a business reason for doing so. Over time, the ISPs have learned that such records are useful for cybersecurity, and have invested in dedicated equipment and specially developed analysis tools to capture and analyze them. But the amount of metadata for Internet traffic is staggering–orders of magnitude larger than telephony–and so even today the ISPs only gather records on a relatively small percentage of the traffic traversing their networks. What they do collect is immensely useful for cybersecurity, to be sure–this capability is a mainstay of ISP managed security services, used for identifying intrusions and compromised computers. But because the traffic records are only sampled, it is impossible to reliably and comprehensively detect threats, identify the origin of attacks, identify the command and control nodes controlling the operations, and provide early warning and characterization of major attacks in real time.
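
To make the analogy concrete, a network flow record and a simple aggregation over such records might look like the sketch below. Field names and addresses are illustrative (real collectors use formats such as NetFlow or IPFIX, and handle volumes far beyond an in-memory dictionary); the point is the kind of pattern comprehensive collection makes visible.

```python
# Illustrative sketch: a flow record is the Internet analogue of a
# telephone call data record -- who talked to whom, when, and how much.
# Aggregating by destination can surface a host contacted by many
# internal machines, a classic command-and-control signature.
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class FlowRecord:
    src: str        # source address
    dst: str        # destination address
    start: float    # start time, epoch seconds
    packets: int
    octets: int     # bytes transferred

def top_destinations(flows, n=3):
    """Rank destinations by how many distinct sources contacted them."""
    sources = {}
    for f in flows:
        sources.setdefault(f.dst, set()).add(f.src)
    counts = Counter({dst: len(srcs) for dst, srcs in sources.items()})
    return counts.most_common(n)

flows = [
    FlowRecord("10.0.0.1", "203.0.113.9", 0.0, 12, 4096),
    FlowRecord("10.0.0.2", "203.0.113.9", 1.0, 10, 2048),
    FlowRecord("10.0.0.3", "198.51.100.4", 2.0, 5, 512),
]
print(top_destinations(flows))  # [('203.0.113.9', 2), ('198.51.100.4', 1)]
```

The report’s complaint is precisely that, at ISP scale, records like these are only sampled, so an aggregation of this kind can never be relied on to be complete.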

Providing warning and situational awareness for defending DOD networks, federal and state government networks, and those of critical infrastructure will require substantially greater net flow data collection and analysis. It is incumbent on DOD to invest in technology to make it easier and less expensive to achieve the scale required for the private sector to potentially assist the government in defending the country.

DISA’s CDC is already the focus for DOD’s research on the comprehensive collection and analysis of flow data records on the traffic that enters and leaves the Department. The CDC hosts technologies developed and being investigated by DISA and USDI. The committee believes that other commercial companies have additional capabilities to apply to this problem.

Department of Defense use of National Security Agency cloud computing database and intelligence community cloud computing infrastructure and services (sec. 929)

The committee recommends a provision that would prohibit the use of the National Security Agency’s (NSA) Accumulo cloud computing database by other Department of Defense (DOD) components after September 30, 2013, unless the Chief Information Officer certifies that there are no viable commercial open source databases that have the security features of Accumulo, or that Accumulo itself has become a successful open source database project. The provision also would require that Department of Defense and intelligence community officials coordinate fully on the use by DOD components of cloud computing infrastructure and services offered by the intelligence community for purposes other than intelligence analysis to ensure consistency with the DOD information technology efficiencies initiative, data center and server consolidation plans, and cybersecurity plans and policies.

The committee applauds NSA’s decision to adopt open source architectures and software for most of its cloud computing development. However, the committee disagrees with NSA’s decision to develop its own cloud computing database–called Accumulo–rather than adapt an open source version of the Google BigTable product. The committee understands that NSA decided to build a government solution several years ago because the open source systems lacked security features that NSA considers essential. In hindsight, the committee believes it would have been smarter for NSA to have worked with industry and the open source community to add NSA’s security features to the open source database systems. Indeed, the committee believes that NSA should pursue that course now.

The downside of using a government-unique database is that, compared to open source products, far fewer developers will contribute to technology advances. Moreover, the applications, analytic tools, and functionality that the open source community and commercial industry are developing for cloud computing will not be compatible or interoperable with the NSA database, depriving the government of valuable innovation. The committee believes it is very likely that the rapidly expanding and innovating cloud computing industry will quickly pass NSA by, leaving it with lagging performance and higher costs. The consequences would not be confined to NSA, since the Army intelligence community has already adopted the NSA cloud architecture, and NSA is strongly urging the entire Department of Defense to do likewise. One of NSA’s arguments is that the security features of Accumulo are essential for the cybersecurity of DOD as a whole.

NSA is making an effort to heal its divergence from the open source community by proposing Accumulo as an open source project under the Apache Foundation as a competitor to existing Apache Foundation open source databases like HBase and Cassandra, which are widely used and supported in industry.

The committee believes it is important to ensure that there are options available if Accumulo does not in fact become a viable and widely supported open source project. The committee is aware of commercial interest in the cell-level data tagging security features that NSA built into Accumulo. These features should be as useful to industry clients who need to protect and control access to the data resident in cloud facilities as they are to the intelligence community and DOD.
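[CTO Note: The cell-level tagging the committee refers to is worth a concrete illustration. In Accumulo, each key/value cell carries a visibility expression that is evaluated against the reader's authorizations at scan time. The toy evaluator below is my own sketch of the idea–real Accumulo visibility expressions also support parentheses and arbitrary nesting, which this flat version does not. ]

```python
# Sketch of Accumulo-style cell-level visibility filtering. This toy
# evaluator handles only flat "&" (all labels required) and "|" (any
# one label suffices); real Accumulo expressions can nest.
def visible(expression: str, authorizations: set) -> bool:
    if not expression:          # empty visibility: readable by anyone
        return True
    if "&" in expression:
        return all(tok in authorizations for tok in expression.split("&"))
    return any(tok in authorizations for tok in expression.split("|"))

def scan(cells, authorizations):
    """Return only the cells whose visibility the caller is cleared for,
    mimicking the server-side filtering applied at read time."""
    return [(k, v) for k, v, vis in cells if visible(vis, authorizations)]

cells = [
    ("row1:loc",  "classified location", "SECRET&USA"),
    ("row1:note", "unclassified note",   ""),
    ("row2:loc",  "shared location",     "USA|GBR"),
]
print(scan(cells, {"USA"}))            # sees row1:note and row2:loc only
print(scan(cells, {"SECRET", "USA"}))  # sees all three cells
```

[CTO Note continued: It is exactly this per-cell access control that the committee suggests industry clients would want for multi-tenant cloud data, which is why it matters whether these features land in a widely supported open source project. ]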

If Accumulo is successful, or if the commercial open source community produces nothing comparable to Accumulo’s security features, this provision would permit Accumulo to be used in DOD outside of NSA. But if Accumulo is not an open source success, and if industry follows NSA’s security lead, the Department should use a commercial product. This provision would give NSA almost 2 years since it made the Accumulo open source proposal to Apache to succeed. The deadline in this provision is also sufficiently distant to enable DOD components and industry to plan accordingly.

Under the direction of the Director of National Intelligence, the Central Intelligence Agency (CIA) and NSA are planning to provide cloud infrastructure and software 'as a service' to all of the intelligence community. As noted, NSA is offering a government implementation of open source standards. CIA is offering a competitively awarded commercial solution. These cloud services would cover all manner of computing needs and capabilities, not just intelligence analysis. This provision is also intended to ensure that DOD components’ use of these cloud services is consistent with DOD information technology policies and plans. The committee notes that the software services that the CIA may offer to DOD customers could include analytic databases like HBase and Cassandra, and other commercial open source products, potentially providing an alternative to Accumulo.

[CTO Note: I don’t blame NSA at all for taking their approach. They have important missions and needed to innovate to get them done. But the committee raises some strategic issues, and I believe its recommendations are the right approach from a national perspective. I think with leadership the commercial community and open source community can address any issues found in the Apache Foundation code projects. ]

Electro-optical imagery (sec. 930)

The committee recommends a provision that would require the Secretary of Defense and the Director of National Intelligence to sustain through fiscal year 2013 the commercial imagery collection capacity planned under the Enhanced View program approved in the National Defense Authorization Act for Fiscal Year 2012 (Public Law 112-81).

This provision would require the Vice Chairman of the Joint Chiefs of Staff to conduct a comprehensive analysis of imagery requirements for the Department of Defense (DOD). The provision would also require a study of the potential role of commercial-class imagery in meeting the needs of the government. This provision would require the completion of these studies in time to inform decisions on the fiscal year 2014 budget and the fiscal year 2015 budget request.

This provision would require the Congressional Budget Office (CBO) Director to examine whether the administration’s proposed actions on commercial imagery are consistent with Presidential policy directives, the Federal Acquisition Regulation (FAR), and statute, all of which use similar language regarding the use of commercial products and capabilities. The President’s most recent National Space Policy, for example, directs the Executive Branch to 'purchase and use commercial space capabilities and services to the maximum practical extent when such capabilities and services are available in the marketplace and meet United States Government requirements [and] modify commercial space capabilities and services to meet government requirements when existing commercial capabilities and services do not fully meet these requirements and the potential modification represents a more cost-effective and timely acquisition approach for the government.'

Similarly, the FAR requires government agencies to 'determine if commercial items or, to the extent commercial items suitable to meet the agency’s needs are not available, nondevelopmental items are available that meet the agency’s requirements, could be modified to meet the agency’s requirements, or could meet the agency’s requirements if those requirements were modified to a reasonable extent.' Very similar language is contained in section 2377 of title 10, United States Code.

The committee has been unhappy with Executive Branch planning for imagery collection from space for years. A series of very expensive programmatic debacles by the National Reconnaissance Office (NRO) was followed by years of chaotic lurches in policy and program planning that reflected deep conflicts among the organizations responsible for operational requirements, acquisition oversight in DOD, oversight of intelligence in DOD, the imagery functional manager in the intelligence community, the NRO, and the Office of the Director of National Intelligence (ODNI).

Several years ago, the Vice Chairman of the Joint Chiefs of Staff and Chairman of the Joint Requirements Oversight Council stated that DOD required a more survivable wartime capability to collect wide-area imagery at moderate resolution, and advocated a larger constellation of commercial satellites to meet this requirement.

The Vice Chairman subsequently proposed that commercial satellites, with some performance modifications, be acquired as the primary means of collecting electro-optical imagery from space for DOD and the intelligence community, based on modeling and analysis conducted through the Joint Staff.

The Director of National Intelligence (DNI) rejected that proposal in favor of evolving legacy government-developed capabilities and attempting to rapidly develop new ones. This debate was carried to the White House before it was short-circuited by an agreement struck between the DNI and the Secretary of Defense. The Secretary of Defense agreed that DOD would not challenge the NRO’s plan to acquire new imaging capabilities. But the Secretary of Defense concluded that the DNI’s plan presented significant risks of gaps in capability and that prudence dictated an expansion of commercial imaging capability to meet DOD wartime requirements and to mitigate the risk of collection shortfalls.

This plan was put into effect at a vigorous pace. The government appealed to the commercial data provider companies to more than double their on-orbit collection capacity in a short time with the promise that the government would consume a large proportion of that capacity over a 10-year period. The commercial data providers, on the basis of this government commitment, proceeded towards adding the equivalent of two more satellites that would be available to meet government requirements.

Meanwhile, the Senate Select Committee on Intelligence (SSCI) continued to study the potential for commercial-class satellites to meet all of the government’s needs. The committee commissioned an evaluation by its scientific advisory body, building on the previous assessments of the former Vice Chairman of the Joint Chiefs of Staff, that concluded that enhanced commercial satellites could meet or exceed the overwhelming majority of electro-optical imaging requirements at less cost and risk, and that such an approach would provide greater resilience and survivability, and a broader, competitive industrial base.

The DNI then sponsored its own analysis, which concluded that a larger number of smaller commercial satellites would not be less expensive than the satellite capabilities the government planned to acquire. The Senate Committee on Armed Services directed an additional analysis by the Joint Staff and the Cost Assessment and Program Evaluation office to establish a new requirements baseline and the cost of meeting those requirements with alternative spacecraft designs in the classified annex to the National Defense Authorization Act for Fiscal Year 2012 (Public Law 112-81). DOD declined to conduct this analysis.

After deciding that commercial-class imaging satellites should not replace the satellites built under NRO program management, the ODNI and the Under Secretary of Defense for Intelligence (USDI) in late 2011 took the position that the commercial augmentation program approved by former Secretary of Defense Gates was not needed and should be terminated in order to save money. This proposal was not based on a new requirements assessment, or a risk assessment, but nonetheless was presented to Congress in the President’s budget request for fiscal year 2013. The Office of Management and Budget allowed the drastic cutback to commercial imagery to be presented to Congress with the fiscal year 2013 budget, but directed ODNI, USDI, and the Joint Staff to study whether this action was warranted.

This obviously reverses the proper order of events. Appropriate reexamination of the foundation for a major program established by the Secretary of Defense and the Vice Chairman of the Joint Chiefs of Staff should precede a decision to terminate it.

The need for care is underscored by the fact that the plan presented in the budget request would result in the elimination of one of the two competing commercial data provider companies from government procurement, and could result in the collapse of one of the companies or a merger or acquisition. In that eventuality, it appears that the government’s impetuous actions would be the determining cause: it was the government that encouraged the two companies to dramatically expand their collection capacity, including by borrowing money to accelerate satellite acquisitions; and when the government then just as suddenly announced that it no longer wanted this additional capacity, the stock price of both companies plummeted. While the government is not legally liable, the government’s wild swing in demand has exposed two healthy companies to financial risk, and, if one is forced to exit the business, will lead to a monopoly in the market and a narrowing of the government’s options for future procurements.

The committee believes that the status quo ante should be restored, as far as possible, until additional studies are conducted to allow both the Executive Branch and Congress to make informed decisions.

The committee believes that there is ample evidence that commercial imaging capabilities could today satisfy a large majority of military requirements, and, if modified for enhancements, could meet all but the most stressing requirements. In the case of the most stressing requirements, the question is whether the benefit of meeting them is justified by the marginal cost of doing so. As noted above, the ODNI study concluded that meeting the government’s needs for electro-optical imagery from space can be accomplished better and at somewhat less cost by a government acquisition of a different design. The committee believes that additional, independent analysis of this contention is essential.

The ODNI and USDI concluded that the government is acquiring capabilities to collect more imagery than is required, and recommended lopping off commercial acquisitions. In light of the pressures on the budget, and the policy expressed by former Secretary of Defense Gates to seek the '80 percent solution' rather than 'exquisite' ones, three questions need to be addressed:

(1) Is the originally planned collection capacity actually in excess of DOD’s wartime requirements?

(2) If yes, is the correct response to reduce commercial acquisitions or to scale back or eliminate the government-developed solution? and

(3) If no, should the original Enhanced View commercial imagery program be restored and sustained or should the government-developed solution be expanded?

The committee also directs the CBO Director to make use of performance modeling results and analyses already available from the Joint Staff, industry, and the SSCI; and cost modeling, existing cost estimates, and actuals from ongoing programs in government and industry. As necessary and appropriate, the Director should conduct new cost modeling and estimating to ensure that a variety of reasonable assumptions and acquisition approaches are considered to achieve confidence in cost comparisons.

This provision would authorize $125.0 million to sustain commercial imagery collection at roughly the level maintained throughout fiscal year 2012 and into the first quarter of fiscal year 2013 under existing Service Level Agreements.

Software licenses of the Department of Defense (sec. 931)

The Department of Defense’s (DOD) $37.0 billion information technology (IT) budget request constitutes almost half of the Federal Government’s overall IT budget for fiscal year 2013. Given the size of this investment, it is important that the Department rationalize current IT outlays and identify those investments that need to be integrated with other solutions or curtailed to ensure the most efficient use of resources. One area of duplication and inefficiency identified by the Government Accountability Office and by the Department is that of software licenses. In addition, the committee has recognized that application and software reduction is a key component of IT efficiencies, as referenced in section 2867 (b)(1)(A)(v) of the National Defense Authorization Act for Fiscal Year 2012 (Public Law 112-81).

The committee recommends a provision that requires the DOD Chief Information Officer (CIO) to conduct, within 180 days of the enactment of this Act, a Department-wide inventory of software licenses, examine license utilization rates, and assess the current and future Departmental need for software licenses. Based on the results of this assessment, the CIO shall establish a plan to align the number and type of software licenses with the needs of the Department. Upon completion of the inventory and assessment, the CIO or their designee shall brief the results to the congressional defense committees.
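[CTO Note: The utilization analysis the provision calls for is straightforward to sketch. The record format and the 70 percent threshold below are my own illustrative assumptions, not anything specified in the provision. ]

```python
from dataclasses import dataclass

# Hypothetical inventory record for the kind of Department-wide license
# roll-up the provision directs the CIO to perform.
@dataclass
class License:
    product: str
    component: str   # owning DOD component
    purchased: int   # seats purchased
    in_use: int      # seats actually deployed

def underutilized(inventory, threshold=0.70):
    """Flag (product, component) entries whose utilization rate falls
    below the threshold -- candidates for consolidation or non-renewal."""
    flags = []
    for lic in inventory:
        rate = lic.in_use / lic.purchased if lic.purchased else 0.0
        if rate < threshold:
            flags.append((lic.product, lic.component, round(rate, 2)))
    return flags

inventory = [
    License("OfficeSuite", "Army", 10_000, 9_500),
    License("CADTool", "Navy", 2_000, 600),
]
print(underutilized(inventory))  # [('CADTool', 'Navy', 0.3)]
```

[CTO Note continued: The hard part, as the committee acknowledges, is not this arithmetic but getting accurate purchased/in-use counts out of every component and major command in the first place. ]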

Immediately upon completion of the inventory and assessment, the CIO is directed to provide the data and findings to the Comptroller General of the United States. Not later than May 1, 2014, the Comptroller General will deliver to the congressional defense committees a review of the Department’s assessment and performance plan.

The committee recognizes that defining the scope of software license duplication throughout the military departments, components, and major commands will be challenging, and urges the CIO to apply the utmost diligence in the undertaking of this inventory and assessment. Lastly, the committee acknowledges that the CIO may need the ability to grant exceptions to certain instances of software, depending upon individual circumstances, such as classification.

[CTO Note: Wow! It is about time! What a great idea! ]

Defense Clandestine Service (sec. 932)

The committee recommends a provision that would prohibit the obligation of appropriated Military Intelligence Program (MIP) funds in fiscal year 2013 to increase the number of personnel conducting or supporting human intelligence within the Department of Defense (DOD) beyond the number as of April 20, 2012. This provision would also require the Office of Cost Assessment and Program Evaluation (CAPE) to provide an estimate of the total cost of the Defense Clandestine Service (DCS) to the congressional defense and intelligence committees. This cost estimate should look at the total costs of the DCS, including whether that cost is incurred in the MIP, in the National Intelligence Program, or in other non-intelligence funding for the Department of Defense (e.g. Major Force Program 11 funding for U.S. Special Operations Command (USSOCOM)). The estimate should include costs in the out years of the future-years defense program and beyond, especially those associated with closing existing personnel basing; creating new basing arrangements; and supporting overseas deployments.

The provision also would require the Under Secretary of Defense for Intelligence (USDI) to provide a report to the congressional defense and intelligence committees by February 1, 2013, that provides or explains:

where DOD case officers will be deployed or based and a schedule for those deployments;

certification that the prospective locations can and will accommodate these deployments;

the objectives established for each military service, USSOCOM, and the Defense Intelligence Agency (DIA) to improve career management for case officers, and the plans to achieve the objectives of the DCS; and

any Memoranda of Agreement or Understanding necessary to implement planned reforms with other departments and agencies and between DOD components.

The committee appreciates the fact that the USDI and the Director of the DIA, in initiating the DCS, intend to make reforms to the Defense Human Intelligence (HUMINT) Service to correct longstanding problems. These problems include inefficient utilization of personnel trained at significant expense to conduct clandestine HUMINT; poor or non-existent career management for trained HUMINT personnel; cover challenges; and unproductive deployment locations. Multiple studies since the end of the Cold War document these deficiencies, and they led the Commission on the Roles and Capabilities of the United States Intelligence Community, chaired by two former Secretaries of Defense, to recommend transferring to the Central Intelligence Agency (CIA) all responsibilities for the clandestine recruitment of human sources, utilizing military personnel on detail from the DOD as necessary.

The committee notes that President Bush authorized 50 percent growth in the CIA’s case officer workforce, which followed significant growth under President Clinton. Since 9/11, DOD’s case officer ranks have grown substantially as well. The committee is concerned that, despite this expansion and the winding down of two overseas conflicts that required large HUMINT resources, DOD believes that its needs are not being met.

The committee concludes that DOD needs to demonstrate that it can improve the management of clandestine HUMINT before undertaking any further expansion. Furthermore, if DOD is able to utilize existing resources much more effectively, the case could be made that investment in this area could decline, rather than remain steady or grow, to assist the Department in managing its fiscal and personnel challenges.

Authority for short-term extension of lease for aircraft supporting the Blue Devil intelligence, surveillance, and reconnaissance program (sec. 933)

The committee recommends a provision that would allow the Secretary of the Air Force to extend or renew the current lease of aircraft to support the Blue Devil intelligence, surveillance, and reconnaissance (ISR) program. Section 2401 of title 10, United States Code, limits such leases to 5 years. The lease for the Air Force Blue Devil ISR aircraft deployed first in Iraq and now in Afghanistan will expire in September 2013. This ISR system is extremely useful, having contributed directly to the take-down of large numbers of high-value targets in the Kandahar region. Theater commanders have requested that this asset be sustained indefinitely in theater. The committee is deeply concerned that, despite a year’s notice from the committee that plans had to be made to address this lease expiration, the Department of Defense has yet to decide on a definite course of action to sustain this capability.

The committee understands that the Air Force leadership and the ISR Task Force are seriously examining alternatives to comply with section 2401, including buying the existing Blue Devil aircraft, buying new C-12 aircraft, modifying existing Liberty aircraft, and modifying the Reaper unmanned aerial vehicles equipped with Gorgon Stare wide-area motion imagery systems.

Based on information that the Air Force provided to the committee, the most attractive option is to purchase additional C-12 aircraft. This option would deliver the needed capability the soonest. All other options under consideration take considerably more time, or would entail reducing deployed ISR support in order to modify existing aircraft, or would not satisfy requirements.

The committee directs the Secretary of the Air Force, in coordination with the Director of the ISR Task Force, to provide a recommendation on a way forward by September 1, 2012, along with a program schedule that would enable Congress to set an appropriate end date for the lease extension.

Sense of the Senate on potential security risks to Department of Defense networks (sec. 934)

The committee recommends a provision that would express the sense of the Senate regarding potential risks to the security of Department of Defense (DOD) networks from the incorporation of equipment and software from foreign sources, and the need for DOD authority and processes to mitigate such risks beyond those that already exist for covered National Security Systems acquired by DOD. The committee provided existing authority to address cybersecurity supply chain risks to DOD National Security Systems in section 806 of the National Defense Authorization Act for Fiscal Year 2011 (Public Law 111-383).

The provision acknowledges the difficulty involved in blocking sales of information technology systems and services due to concerns about cybersecurity while maintaining our commitment to free trade and fair and transparent competition.

[CTO Note: Let me know if you have any questions/comments/thoughts on this! ]

This post was first published at CTOvision.com.
