Big Data: Are You Ready?

We love buzzwords in government. Cloud, SOA, and now Big Data. There is certainly plenty of hype behind the buzz: “Keeping Afloat in a Sea of ‘Big Data’” (ITBusinessEdge), “The promise of Big Data” (Intelligent Utility), and “The challenge-and opportunity-of big data” are just a few of the headlines fueling it.

Big data is, and will continue to be, a major shift in the way we consume and process data. With big data, organizations are realizing the power of their data, and specifically how to bring it together, analyze it, and streamline it to make more informed decisions. Government has the opportunity to leverage big data to make specific, significant decisions that affect industry and citizens and move the country forward. That is big!

However, before we get there, organizations need to answer a basic question: where do we start, and how?

At the Big Data Summit for the Federal Government, held May 10th in Reston, VA, speakers Isidore Sobkowski (CIO, NYC Health & Human Services), Peter Doolan (GVP & Public Sector CTO, Oracle), and Matthew Mead (Sr. Systems Engineer, Cloudera) addressed an audience of government and industry participants, offering solutions and use cases to help organizations acquire, organize, and analyze big data alongside traditional enterprise data for immediate value.

The question was posed at the outset: what makes big data? The presenters explained that big data can fundamentally be broken down into four characteristics:

  • Volume: Machine-generated data is produced in much larger quantities than traditional data. A single jet engine can generate 10TB of data in 30 minutes. Smart meters and heavy industrial equipment like oil refineries and drilling rigs can generate petabytes of data every day.
  • Velocity: Machine-generated data streams (log files, network traffic data, etc.) tend to be deleted, or archived where they are never accessed, because organizations have no way to analyze the data at the rate it is being generated. Big data techniques can use this data to uncover valuable “operational intelligence” on system performance and suspicious activity early enough to act on the results (a small sketch follows this list).
  • Variety: OLTP data is relatively well described and has a fixed schema. In contrast, non-traditional data sets like tweets, blogs, wikis, images, video, audio and the like vary widely and exhibit a dizzying rate of change. As new services are added or new sensors are deployed, new data types are needed to capture the resultant information.
  • Value: Usually there is less economic value in a “Big Data” set than in an OLTP database of the same size. This does not mean there is no value, just that different techniques must be used to extract it.
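
To make the velocity point concrete, here is a minimal, hypothetical Python sketch (not drawn from the summit presentations) of the kind of analysis that becomes possible when streaming log data is examined as it arrives rather than archived unread. The log format, event names, and alert threshold are assumptions made purely for illustration.

```python
# Minimal illustration of the "velocity" point above: instead of archiving
# machine-generated log data unread, scan it as it streams in and surface
# simple operational intelligence (here, bursts of failed logins per host).
# The log format and threshold are hypothetical, chosen only for the example.
from collections import Counter

FAILED_LOGIN_THRESHOLD = 5  # flag a host after this many failures


def parse_line(line):
    """Parse a 'timestamp host event' record; return (host, event) or None."""
    parts = line.strip().split()
    if len(parts) != 3:
        return None
    _, host, event = parts
    return host, event


def monitor(lines):
    """Consume log lines as they arrive and yield alerts for noisy hosts."""
    failures = Counter()
    for line in lines:
        parsed = parse_line(line)
        if parsed is None:
            continue
        host, event = parsed
        if event == "LOGIN_FAIL":
            failures[host] += 1
            if failures[host] == FAILED_LOGIN_THRESHOLD:
                yield f"ALERT: {host} reached {failures[host]} failed logins"


if __name__ == "__main__":
    sample = [
        "2012-05-10T09:00:01 10.0.0.7 LOGIN_FAIL",
        "2012-05-10T09:00:02 10.0.0.7 LOGIN_FAIL",
        "2012-05-10T09:00:03 10.0.0.9 LOGIN_OK",
        "2012-05-10T09:00:04 10.0.0.7 LOGIN_FAIL",
        "2012-05-10T09:00:05 10.0.0.7 LOGIN_FAIL",
        "2012-05-10T09:00:06 10.0.0.7 LOGIN_FAIL",
    ]
    for alert in monitor(sample):
        print(alert)
```

The point is not the specific rule but the timing: the same records, analyzed as they stream in rather than after the fact, become actionable.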

Big data is expected to help organizations make better decisions. According to the presenters, big data follows a lifecycle that supports informed decision-making:

  • Phase 1 - Acquire: Making the most of big data means quickly acquiring a high volume of data generated in many different formats.
  • Phase 2 - Organize: A big data platform needs to process massive quantities of data in batch and in parallel, filtering, transforming, and sorting it before loading it into an enterprise data warehouse (see the sketch after this list).
  • Phase 3 - Analyze: Analyzing big data within the context of all your other enterprise data can reveal new insights that can have a significant impact on your program and delivery of services.
  • Phase 4 - Decide: After analyzing big data, you need to know what you got from the investment of time and resources dedicated to the earlier phases of the lifecycle. At this final stage, big data has been transformed into actionable insight.
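
The Organize phase carries most of the mechanical work, so a small sketch may help. The following is a hypothetical Python example, not taken from the summit material, of filtering, transforming, and sorting a raw batch before it is loaded into a warehouse; the smart-meter field names and CSV layout are invented for illustration, and a real pipeline would run this kind of logic in parallel across a cluster.

```python
# Hypothetical sketch of the "Organize" phase described above: take a raw
# batch of machine-generated records, filter out malformed rows, transform
# them into a consistent shape, and sort them before a bulk load into a
# warehouse table. Field names and the CSV layout are assumptions made only
# for illustration; production pipelines typically do this work in parallel.
import csv
import io

RAW_BATCH = """\
meter_id,reading_kwh,read_time
M-102,13.7,2012-05-10T00:15
M-102,bad_value,2012-05-10T00:30
M-017,9.2,2012-05-10T00:15
M-017,9.4,2012-05-10T00:30
"""


def organize(raw_text):
    """Filter, transform, and sort raw readings into load-ready rows."""
    rows = []
    for record in csv.DictReader(io.StringIO(raw_text)):
        try:
            kwh = float(record["reading_kwh"])  # filter: drop malformed readings
        except ValueError:
            continue
        rows.append({
            "meter_id": record["meter_id"],
            "kwh": round(kwh, 2),               # transform: normalize precision
            "read_time": record["read_time"],
        })
    # sort: cluster rows by meter and time so the warehouse load is ordered
    rows.sort(key=lambda r: (r["meter_id"], r["read_time"]))
    return rows


if __name__ == "__main__":
    for row in organize(RAW_BATCH):
        print(row)  # in practice these rows would be bulk-loaded, not printed
```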

To translate these phases into action and derive real value from big data, organizations need the right tools to capture and organize a wide variety of data types from different sources, and to analyze that data easily within the context of all their enterprise data. Peter Doolan introduced a platform for big data that addresses each phase of the lifecycle, allowing organizations to acquire, organize, and analyze all their data, structured and unstructured, to make the most informed decisions.

Doolan explained that Oracle’s big data strategy is centered on the idea that organizations can evolve their current enterprise data architecture to incorporate big data and deliver operational value, leveraging the proven reliability, flexibility, and performance of existing data warehouses and systems to address big data requirements. Oracle is uniquely qualified to combine everything needed to meet the big data challenge, including software and hardware, into one engineered system. Details of the framework and solutions are outlined in the Oracle Big Data for the Enterprise white paper and in the presentation referenced below, “Big Data: Are You Ready?”, delivered at the Big Data Summit.


Isidore Sobkowski, CIO, NYC Health & Human Services, provided a use case showing how the organization is using the Oracle big data platform to make a big impact on the quality of life of New York’s citizens. The organization’s program office, HHS-Connect, serves over 2 million citizens daily and administers over $20 billion in benefits and services annually. Its primary goal in managing big data is to take a holistic approach to clients and client data. Acquiring, organizing, and analyzing big data gives the organization the ability to collect information, report on it, and deliver, just in time, the right recommendation to the right person at the right time for the best possible outcome.

Big data helps the organization tackle questions it was never able to address before, delivering real, insightful information that significantly shapes how it serves citizens: “How can we better service a given population?” “What can I do for this person, at this moment, to achieve the best possible outcome?” “What is the impact of reduced funding on benefits and services?”

“Big data allows us to provide a 360-degree view of a client in order to effectuate the best possible outcome for our citizens,” said Sobkowski. Big data has changed how the organization looks at data and helps it pinpoint exactly where a problem is occurring, giving commissioners, analysts, and other employees the opportunity to drill down further into the information and make insightful decisions and recommendations. The big data initiative allows the organization to address clients’ needs holistically and deliver services to its constituents.

Check out all the presentations from the Big Data Summit for the Federal Government at May 10 Big Data Summit Presentations.
