
Five practical applications of government big data

2013 may go down as the year of big data and data analytics. But according to a new report by IBM and the Partnership for Public Service, big data has been around since the 1970s.

The report, titled From Data to Decisions III – Lessons from Early Analytics Programs, showcases five government case studies where agencies have been using big data to make informed decisions for decades.

Brian Murrow is an associate partner in IBM’s Global Business Services and one of the authors of the report. He told Chris Dorobek on the DorobekINSIDER program that we can learn a lot from these early case studies.

“The purpose of the study was to go back decades, some of these things started in the late ’70s, and see what operational and programmatic lessons learned we can use from programs that have been going on for decades,” said Murrow.

Data can help you assess your return on investment?

“Being able to dive into the data to see how effective this program really is, what is the return on investment? Should we be continuing to do this program, or are there better programs that we ought to be focusing on?” said Murrow.
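To make that concrete: the return-on-investment question reduces to simple arithmetic once program costs and measured benefits are captured as data. Here is a minimal sketch in Python; the figures and program names are made up for illustration, and the report does not prescribe any particular formula.

```python
# Toy ROI comparison for two hypothetical programs.
# All figures are illustrative, not from the report.

def roi(benefit: float, cost: float) -> float:
    """Return on investment as a fraction: (benefit - cost) / cost."""
    return (benefit - cost) / cost

programs = {
    "program_a": {"cost": 25_000_000, "benefit": 40_000_000},
    "program_b": {"cost": 10_000_000, "benefit": 22_000_000},
}

for name, p in programs.items():
    print(f"{name}: ROI = {roi(p['benefit'], p['cost']):.0%}")
# program_a: ROI = 60%
# program_b: ROI = 120%  -> the stronger candidate to keep funding
```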

Big data enables collaboration?

“Big data by definition is big. It crosses intra-organizational boundaries. Some of the technologies allow you to fuse that data together to make sense of it when you are pulling it together and collaborating across multiple organizations. The more recent technologies, which are why big data has been getting more attention these days, enable you to do more with that data, like being able to collaborate and bring it all together.”
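One way to picture that kind of fusion is a join on a shared identifier. The Python sketch below is a toy, with invented agency names and fields: two organizations’ records are merged on a common case ID to produce the combined view Murrow describes.

```python
# Toy data fusion: combine records from two hypothetical agencies
# on a shared case identifier. Field names are invented for illustration.

agency_a = [  # e.g., inspection results
    {"case_id": 101, "inspection": "pass"},
    {"case_id": 102, "inspection": "fail"},
]
agency_b = [  # e.g., shipment metadata
    {"case_id": 101, "origin": "port_x"},
    {"case_id": 102, "origin": "port_y"},
]

# Index one source by key, then merge matching records.
by_id = {rec["case_id"]: rec for rec in agency_b}
fused = [{**rec, **by_id.get(rec["case_id"], {})} for rec in agency_a]

print(fused)
# [{'case_id': 101, 'inspection': 'pass', 'origin': 'port_x'},
#  {'case_id': 102, 'inspection': 'fail', 'origin': 'port_y'}]
```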

5 Practical Big Data Case Studies:

  • Social Welfare: The oldest case is the Famine Early Warning Systems Network (FEWS NET), developed by the U.S. Agency for International Development in 1986. The $25 million program helps optimize the distribution of up to $1.5 billion per year in USAID Food for Peace assistance.
  • Public Health: The Centers for Disease Control and Prevention (CDC)’s PulseNet consists of 87 public health laboratories across the nation and consolidates foodborne illness cases into a single source. PulseNet serves as an early detection system to help prevent epidemics.
  • Biometric Intelligence: Since 2003, U.S. armed forces have collected biometric information from non-U.S. citizens in Iraq and Afghanistan to identify enemy combatants and control access to restricted areas. The system, known as the Automated Biometric Identification System (ABIS), stores 4.4 million unique identities and has identified over 3,000 enemy combatants, added 190,000 identities to the watch list, and protected the welfare of the United States and its allies.
  • Intelligent Inspections: In 2007, the USDA initiated a program to make risk-based decisions about which shipping containers should be examined for plant-borne pests. The Agricultural Quarantine Activity System (AQAS) enables the Animal and Plant Health Inspection Service to better target its resources against a $136 billion problem.
  • Smarter Healthcare: Predicting the likelihood of hospitalization or death within 90 days, the Patient Care Assessment System (PCAS) calculates the Care Assessment Needs (CAN) Score. This score allows the Veterans Health Administration (VHA) to focus care teams and proactively care for their patients. The system collects 120 unique data elements for 5.25 million patients and is supported by an 80-terabyte corporate data warehouse. (A toy sketch of this kind of risk scoring follows this list.)
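To give a feel for the risk-scoring idea behind systems like PCAS: the report does not describe how the CAN Score is actually computed, so the sketch below is purely hypothetical, a simple logistic model over a few invented data elements standing in for the 120 the real system collects.

```python
# Hypothetical illustration of a patient risk score.
# The real CAN Score methodology is not described in the report;
# features and weights here are invented for demonstration only.

import math

# Invented weights for a handful of data elements a system like
# PCAS might collect (the real system uses 120 of them).
WEIGHTS = {"age_over_75": 1.2, "recent_admission": 2.0, "num_medications": 0.1}
BIAS = -4.0

def risk_probability(patient: dict) -> float:
    """Logistic model: probability of the adverse outcome."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

patient = {"age_over_75": 1, "recent_admission": 1, "num_medications": 8}
p = risk_probability(patient)
print(f"predicted 90-day risk: {p:.1%}")  # higher scores -> prioritize care team
```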

The year of big data?

“There are a couple of developments technologically that have occurred. When you think about the folks in these case studies that we have been following for a long period of time, folks that devote their lives to public policy and furthering a public-sector mission, they’ve been doing this stuff for a long time, and they’ve been in this business because of the mission, because they want to make a difference,” said Murrow. “So at one level that need has always been there. But it hasn’t always been easy. In the past couple of years some new technologies have come out that have made it a little easier for organizations, and for individuals within the organization, to become more effective with analytics. One is Hadoop. It is a MapReduce framework that allows large quantities of data to be divided up so you can analyze them much more quickly. You can analyze billions of records and conduct advanced analytics on them without the complex architecture that was previously required. It is a much more cost-effective way to do it.”
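For readers curious what that MapReduce pattern looks like, here is a single-machine toy in Python, not Hadoop itself: the data is split into partitions, each partition is analyzed independently (the map step), and the partial results are combined (the reduce step). Hadoop applies the same shape across a cluster, which is what makes billions of records tractable.

```python
# Toy single-machine illustration of the map/reduce pattern Hadoop uses.
# Real Hadoop distributes the partitions across many machines.

from functools import reduce

records = range(1, 1_000_001)  # stand-in for a large dataset

# "Split" the data into partitions, as a cluster would.
def partitions(data, size):
    data = list(data)
    for i in range(0, len(data), size):
        yield data[i:i + size]

# Map: analyze each partition independently (here, a simple sum).
partial_sums = [sum(chunk) for chunk in partitions(records, 100_000)]

# Reduce: combine the partial results into one answer.
total = reduce(lambda a, b: a + b, partial_sums)
print(total)  # 500000500000
```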

Recommendations?

  • Define business requirements
  • Plan to augment and iterate
  • Create a big data entry point
  • Identify gaps
  • Iterate again

“Be curious about how your organization can be more effective and achieve its goals better. That is thinking through the ROI question. And then just do it. I will tell you, it won’t be perfect the first time out of the gate, so you have to be willing to iterate. None of the case studies in the report happened overnight. They have been around for decades, triangulating in on the most optimal way to achieve their program priorities,” said Murrow.
