Is Machine Learning the Next Frontier in Data Analytics?

We’re living in an age where artificial intelligence (AI) literally lives in our pockets. It answers our questions, guides our navigation and manages our social media feeds. Now, the federal government is taking advantage of this technology to save money and produce better data analytics.

“AI is not really about the future anymore. It’s right here. Everything nowadays involves AI and machine learning,” explained Dongwha Kim, Director of Application Engineering at NewWave, an organization specializing in government healthcare IT.

At GovLoop’s Thursday online training session, “How Machine Learning Is Changing Data Analytics in Government,” Brian Dirking, Senior Director of Partner Marketing at unified analytics platform provider Databricks, joined Kim to share how new data analytics tactics are changing the way government does its job.

These new data analytics tactics include advanced analytics, AI, machine learning, and deep learning. While the federal government uses all four, Dirking and Kim focused on the benefits of AI and machine learning.

While AI and machine learning are often used interchangeably, there are important differences between the two. AI enables computers to mimic human behavior. Machine learning is a type of AI where computers are able to learn from data without being explicitly programmed to do so.
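To make that distinction concrete, here is a minimal, hypothetical Python sketch (not something presented in the session): rather than writing decision rules by hand, a model is fit to labeled examples and then applied to inputs it has never seen. The numbers below are invented purely for illustration.

```python
# Minimal illustration of "learning from data" with scikit-learn:
# no rules are programmed by hand; the model infers them from examples.
from sklearn.linear_model import LogisticRegression

# Toy training data (invented): monthly hours of equipment use vs. whether
# the equipment needed maintenance that month (0 = no, 1 = yes).
X = [[1], [2], [3], [10], [12], [15]]
y = [0, 0, 0, 1, 1, 1]

model = LogisticRegression().fit(X, y)

# The trained model now makes predictions for inputs it was never
# explicitly programmed to handle.
print(model.predict([[2], [11]]))  # expected output: [0 1]
```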

Data analytics tactics help agencies provide better services, protect against threats and create more efficient programs. While agencies could meet these goals with long hours and large teams of employees, automation is often cheaper and faster.

“By unifying data and AI together, we can now train systems to make decisions for us,” Dirking said.

This allows data analysts to spend their time providing solutions instead of manually sifting through data. For example, Dirking explained how extract, transform, load (ETL) processes can simplify analysis of population health data. An ETL process takes data from multiple sources and combines it into one system so that data in different formats can be used for the same task.
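As a rough sketch of that ETL pattern (again, not code from the session), the example below assumes two population health extracts arrive in different formats, one CSV and one JSON, and standardizes them into a single table with pandas. The file names and column names are hypothetical.

```python
# Hypothetical ETL sketch with pandas: extract from two differently
# formatted sources, transform to a common schema, load into one dataset.
import pandas as pd

# Extract: one source exports CSV, another exports JSON (file names invented).
clinic_visits = pd.read_csv("state_clinic_visits.csv")    # patient_id, visit_date, dx_code
hospital_claims = pd.read_json("hospital_claims.json")    # PatientID, ServiceDate, Diagnosis

# Transform: rename columns so both sources share one schema,
# and parse dates into a single consistent type.
hospital_claims = hospital_claims.rename(columns={
    "PatientID": "patient_id",
    "ServiceDate": "visit_date",
    "Diagnosis": "dx_code",
})
clinic_visits["visit_date"] = pd.to_datetime(clinic_visits["visit_date"])
hospital_claims["visit_date"] = pd.to_datetime(hospital_claims["visit_date"])

# Load: combine into one table ready for analysis.
combined = pd.concat([clinic_visits, hospital_claims], ignore_index=True)
combined.to_parquet("population_health_combined.parquet")
```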

ETL and other data processes once took days or even weeks to move that important information from one agency to another, Kim explained. Now, it can happen in just an hour.

Faster transmission and automated simplification of data through these processes mean that analysts can begin working toward solutions much sooner. In the case of population health data, these analytics methods could provide faster, more thorough information about Medicare fraud and disease outbreaks, giving experts more time to focus on providing guidance for remediation.
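One generic way such analytics can surface potential Medicare fraud is anomaly detection over billing data. The sketch below uses scikit-learn’s IsolationForest on invented provider figures; it illustrates the general idea only and is not a description of any specific method the speakers discussed.

```python
# Hypothetical anomaly-detection sketch: flag unusual provider billing
# patterns for human review (all figures invented for illustration).
from sklearn.ensemble import IsolationForest

# Each row: [claims_per_month, average_billed_amount] for one provider.
providers = [
    [110, 240], [95, 210], [120, 260], [105, 230],
    [100, 225], [480, 1900],   # the last provider bills far outside the norm
]

detector = IsolationForest(contamination=0.2, random_state=0).fit(providers)

# predict() returns -1 for likely outliers worth an analyst's attention
# and 1 for providers whose billing looks typical.
print(detector.predict(providers))
```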

Of course, making the shift to one of these data analytics tactics is expensive. Dirking cited the federal government’s Big Data Research and Development Initiative to demonstrate why it’s a worthwhile investment. The initiative allocated $200 million toward data-driven automation efforts, but once the systems were up and running, the program freed up 266 million hours of work, ultimately saving the government $9.6 billion.

The importance of useful data for government decision-making is obvious, but the era of lengthy, hand-done data scrubbing is over. Now, data aggregation and analysis can be done faster, cheaper and more efficiently with data analytics tools.

Photo credit: Unsplash
