Today, we live in a connected world, driven by massive streams of data and accelerated by the Internet of Things (IoT). By 2020, there will be 7 billion connected devices, with a potential economic value of $11 trillion.
Meanwhile, agencies are looking to use the data IoT is generating to meet citizen demands for a smooth user experience. But they don’t have proper systems in place to do so, and they face myriad challenges such as inadequate analytics platforms, data silos and repetitive, manual processes.
“We know that agencies don’t have a problem of having enough data,” Osborn said. “They’re generating tremendous amounts of data from various systems. But the challenge is normalizing that data so that they can make sense of it.”
The main obstacle for agencies is the multitude of individual, disparate systems that create data on specific subjects. These data silos make it incredibly difficult to get a holistic view because the systems don't talk to one another, leaving data analysis slow and manual.
In short, traditional data platforms cannot meet the demanding requirements of a government that must move fast and make data-driven decisions.
“We used to look at creating enterprise data warehouses in which we would gather the data in one place and attempt to normalize it in that fashion,” Osborn said. “But the sheer volume of data has now outstripped the ability to do that. In addition, once you aggregate the disparate data in different formats that have been created by a multitude of different technologies, there’s really no way to run any analytics on that. It must be normalized at some point.”