When the Health and Human Services Department (HHS) was identified as a new federal center for shared services, Michael Peckham, the department’s ReInvent Grants Management Lead, knew that the designation signified a change of mindset in government. The administration’s order to establish Quality Service Management Offices (QSMOs) that would lead shared services partnerships across different agencies was a major step for the cross-agency priority goal of eliminating duplication.
Federal Chief Information Officer Suzette Kent revealed HHS as the QSMO for grants management at the end of April, meaning that HHS would be responsible for procuring and managing solutions that could be hosted as intergovernmental resources for grants. Sharing solutions is part of a wider federal governmental focus to tear down barriers and promote communication – something that is especially important, and regularly absent, when it comes to data.
“There are a lot more people concerned about the standards that are in place,” Peckham said. Peckham was one of two panelists on Thursday’s GovLoop online training, “Simplify and Scale Data Pipelines.” Also featured on the webinar was Alon Lebenthal, Senior Solutions Marketing Manager of Digital Business Automation, Cloud and Big Data for BMC.
Big data, as it’s called, has taken over government and forced agencies to piece together policies and practices that will allow them to manage all of their incoming information securely. At best, government data is a tremendous resource that offers insights, saves money and eliminates waste. But when the right policies aren’t in place, that same data can leave agencies vulnerable to attacks and overwhelmed by sheer volume.
Agencies are looking for ways to standardize and process all of their incoming data, something that the federal government has earmarked as a national focus following the release of the Federal Data Strategy. The Federal Data Strategy identified sharing data across federal agencies and using data to power insights as priorities. Still, many agencies are struggling with standardizing terms, formats and usages.
“Even in the QSMO, they realize that we have 426 standard terms that are just waiting to be finalized,” Peckham said. “But having a definition around a term that has yet to be finalized really doesn’t help us much. We have to put the schema around that.”
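Standardizing terms in practice usually means mapping each agency’s local field names onto the shared vocabulary and enforcing a consistent format, which is what Peckham means by putting “the schema around” a term. The sketch below illustrates the idea; the term names, formats and crosswalk are invented for illustration and are not the actual QSMO data standards.

```python
# Hypothetical sketch of enforcing a shared schema over agency data.
# STANDARD_TERMS and AGENCY_CROSSWALK are illustrative assumptions,
# not real federal grants-management standards.

STANDARD_TERMS = {
    "award_amount": float,                # dollars
    "period_of_performance_start": str,   # ISO 8601 date string
    "recipient_uei": str,                 # unique entity identifier
}

# Each agency may use its own field names; a crosswalk maps them
# to the shared vocabulary before type coercion.
AGENCY_CROSSWALK = {
    "amt": "award_amount",
    "start_dt": "period_of_performance_start",
    "uei": "recipient_uei",
}

def standardize(record: dict) -> dict:
    """Rename agency-specific fields and coerce types per the schema."""
    out = {}
    for field, value in record.items():
        term = AGENCY_CROSSWALK.get(field, field)
        if term in STANDARD_TERMS:
            out[term] = STANDARD_TERMS[term](value)
    return out

print(standardize({"amt": "250000", "uei": "ABC123DEF456"}))
```

Until a term’s definition is finalized, entries like these stay in flux — the crosswalk is exactly the piece that breaks when two agencies finalize different definitions for the same term.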
Finding the right solution to capitalize on the power of big data has been trying for agencies. Organizations usually interact in four ways with their data. They ingest data from multiple sources, store it, process it and then deliver insights. Federal agencies, however, are faced with additional challenges of data security and transparency, as mandated by laws and regulations.
The data pipeline, or the movement of data through this lifecycle of usage, can be automated to improve its accessibility and compliance.
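The four stages described above — ingest, store, process, deliver — can be sketched as a small automated pipeline with an audit trail attached to each stage for compliance. This is a minimal illustration in plain Python; the stage functions, the in-memory audit log and the sample records are assumptions for the example, not any real product’s API.

```python
# Minimal sketch of an automated data pipeline: ingest -> process ->
# deliver, with every stage recorded in an audit log so the pipeline's
# activity can be reviewed for compliance. All names are illustrative.
import json
from datetime import datetime, timezone

AUDIT_LOG = []

def audited(stage):
    """Decorator: log each stage run (stage name, timestamp, record count)."""
    def wrap(fn):
        def inner(data):
            AUDIT_LOG.append({
                "stage": stage,
                "at": datetime.now(timezone.utc).isoformat(),
                "records": len(data),
            })
            return fn(data)
        return inner
    return wrap

@audited("ingest")
def ingest(raw_lines):
    # Parse newline-delimited JSON from some upstream source.
    return [json.loads(line) for line in raw_lines]

@audited("process")
def process(records):
    # Example transformation: drop incomplete records.
    return [r for r in records if "id" in r and "value" in r]

@audited("deliver")
def deliver(records):
    # Produce a simple insight from the cleaned data.
    return {"count": len(records), "total": sum(r["value"] for r in records)}

raw = ['{"id": 1, "value": 10}', '{"id": 2, "value": 5}', '{"value": 3}']
insights = deliver(process(ingest(raw)))
print(insights)        # {'count': 2, 'total': 15}
print(len(AUDIT_LOG))  # 3 -- one entry per stage
```

Because the audit hook wraps every stage, compliance checks and governmentwide policies can be layered on at one point rather than scattered across each transformation.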
“Big data is not an island, and this ensures that big data does not become another silo of systems,” Lebenthal said.
Automating the data pipeline ensures that organizations’ data is not wasted. Compliance is enforced throughout, and governmentwide policies can be implemented during the process. Furthermore, data can be shared across organizations – improving financial performance, speeding up processes and supporting decision-making.
BMC’s Control-M, Lebenthal said, helps developers quickly push from coding to production and allows for the easy updating of systems. All activity can be audited as well.
On the training, Lebenthal and Peckham emphasized that in building out data-based applications and capabilities, adopting organizational mindsets of Agile and user-centered design is fundamental. These approaches, which aim for fast iteration and user-friendliness, play into the cultural side of data management. Listening to stakeholders while developing a responsive product advances the government’s usage of data and creates opportunities for the public sector to innovate in real time.
“I had a lot of ideas and a lot of things I wanted to do, but I think that we all know that the successes of the Ubers and Amazons today is that they focused on user-centered design,” Peckham said.