Astronomical amounts of data flow through government agencies every day, bringing with them security, storage and analytics requirements that can cause agencies to stumble. Limited resources add to the difficulty, making it crucial for agencies to find a solution that manages all of their data efficiently.
Fortunately, new technology is making progress possible. With security and efficiency at the forefront, agencies have begun adopting data governance techniques and tools that keep them on top of their immense data stores.
To discuss the importance of protecting data, GovLoop spoke with Guy Cavallo, Deputy CIO, SBA, and Darin Pendergraft, VP of Product Marketing, STEALTHbits, during Thursday’s online training.
To begin, Cavallo noted that an overload of individual security products puts holes in data protection efforts. Consolidation for efficient management is a crucial consideration when facing this issue.
“We’re measuring our data from our routers, our endpoints, from everything in our enterprise, so we’re shrinking this down,” Cavallo said. “Wherever it is, on-premise or in the cloud, we want to bring that data back and be able to deal with it in one place. At SBA, we really believe we have it now with less than 10 products pulled together.”
He mentioned the single-pane-of-glass mentality that encourages agencies to condense all of their tools into an easy-to-manage system. This increases security through strong analytic tools.
“What we’ve been able to do is use a tool that takes all of our log data,” Cavallo said. “Not only routers and servers, but endpoints are all feeding in. Where is the SBA traffic going, who are the top users, what are the things they’re doing? One thing I constantly look at is where our traffic is going.”
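The kind of analysis Cavallo describes — pulling log data from routers, servers and endpoints into one place and asking who the top users are and where traffic is going — boils down to aggregating over unified log records. A minimal sketch (the record fields and sample data are hypothetical, not SBA's actual log format):

```python
from collections import Counter

# Hypothetical log records, each tagged with its source system,
# the user involved and the destination of the traffic.
log_records = [
    {"source": "router",   "user": "alice", "destination": "example.com"},
    {"source": "endpoint", "user": "bob",   "destination": "example.com"},
    {"source": "server",   "user": "alice", "destination": "files.internal"},
    {"source": "endpoint", "user": "alice", "destination": "example.com"},
]

def top_talkers(records, key, n=3):
    """Count the most frequent values of `key` across all log records."""
    return Counter(r[key] for r in records).most_common(n)

print(top_talkers(log_records, "user"))         # who are the top users?
print(top_talkers(log_records, "destination"))  # where is the traffic going?
```

In practice this aggregation happens inside a log-analytics platform rather than a script, but the questions asked of the data are the same.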
To put it simply, everything is being tracked, from what data is being used, how often, in what way and more. For an example of how this factors into security, Cavallo mentioned that if a login occurs in a different country than usual, the risk is identified and either confirmed or denied as a threat in a matter of seconds.
Beyond login anomalies, strong data governance will also detect improbable or impossible travel, suspicious activity, mass sharing, deleting or downloading, and more.
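The impossible-travel check mentioned above has a simple core: if two logins on the same account imply a travel speed no flight could achieve, raise an alert. A sketch, assuming a hypothetical speed threshold and sample login records:

```python
from math import radians, sin, cos, asin, sqrt
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

MAX_PLAUSIBLE_KMH = 900  # roughly airliner cruising speed; an assumed threshold

def impossible_travel(login_a, login_b):
    """Flag two logins whose implied travel speed exceeds the threshold."""
    dist_km = haversine_km(login_a["lat"], login_a["lon"],
                           login_b["lat"], login_b["lon"])
    hours = abs((login_b["time"] - login_a["time"]).total_seconds()) / 3600
    if hours == 0:
        return dist_km > 0  # two places at the same instant
    return dist_km / hours > MAX_PLAUSIBLE_KMH

# Same account logs in from Washington, D.C., then from Sydney an hour later.
dc = {"lat": 38.9, "lon": -77.0, "time": datetime(2019, 1, 10, 9, 0)}
sydney = {"lat": -33.9, "lon": 151.2, "time": datetime(2019, 1, 10, 10, 0)}
print(impossible_travel(dc, sydney))  # True: ~15,000 km in one hour
```

Commercial governance tools layer risk scoring and confirmation workflows on top, but this distance-over-time comparison is the heart of the alert.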
“We went from no visibility, where we couldn’t tell what was running on our networks, to now being able to see everything,” Cavallo said. “We are able to provide superior protection.”
To break down the structure of data access governance, Pendergraft noted five steps in the process: discover, collect and analyze, monitor, restructure and govern.
The first step, discovery, gives a strong idea of an agency’s data footprint. Collecting, analyzing and monitoring that data provides a strong view into what an agency’s data looks like and how it is used.
The last two steps, restructure and govern, are where security controls are added. Restructuring lets agencies restrict data access in line with the principle of least privilege, which ensures that people have access only to the data they need — no more and no less. Governance is an ongoing process that keeps security, compliance and operational standards in check over the long run.
“Figuring out who’s got access to data and if that data is sensitive or proprietary is important,” Pendergraft said. “When we talk about data access governance, what we’re trying to do is understand the data that’s out there and make access to that data exclusive to the people who need it. Overprovisioning users can lead to a lot of problems, so you want to make sure they have the right level of access.”
Looking to 2019 and the future of data governance, Cavallo noted the need to “continue to enhance our use of these tools and capabilities and continue to educate our users on the risks of cybersecurity,” particularly emphasizing the human side of the process.
“This is something we’re using today,” Cavallo said. “This isn’t the promised future; we have it now.”