
Improve Resilience in the Era of More

Recent events have been a whirlwind, but agencies aren’t out of the storm just yet. Even after the COVID-19 pandemic passes, communities will still find themselves in tumultuous times.

That’s where resilience kicks in.

“It all comes down to an agency’s most critical asset: its data,” said Barry Levine, Director of Federal Health Care and Civilian Strategic Programs at Veritas, which specializes in enterprise data solutions.

Merriam-Webster defines resilience as the “ability to recover from or adjust easily to misfortune or change.” And believe it or not, data largely determines organizational resilience. If agencies have the information at hand to make decisions, they can successfully anticipate and respond to challenges.

Agencies need a plan for resilience – specifically, a data strategy. After the Federal Data Strategy launched in 2019, many federal, state and local agencies took it as inspiration to develop their own. In the ongoing COVID-19 climate, having a data strategy has become even more critical. When drafting a data strategy focused on resilience, it’s key to keep in mind:

1. VISIBILITY

Visibility makes security, analysis and compliance easier. The goal, Levine said, is a single-pane-of-glass view so that agencies can explore all of their data repositories at once.

For some agencies, a majority of their data is dark – meaning they don’t know what it is.

“Step one, get visibility into your data,” Levine said. “Understand what data’s out there and where it resides. Find out if it’s useful.”

By gaining visibility into data, storage and backup infrastructure, agencies take control of data-associated risks and their IT infrastructure. No longer are they in the dark.
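As a rough sketch of what a first pass at data visibility can look like (this is not Veritas tooling; the repository path and the two-year “dark data” threshold are assumptions for illustration), a short script can walk a file store and flag data nobody has touched in years:

```python
import time
from pathlib import Path

# Files untouched for this long are flagged as potentially "dark" data.
# The two-year threshold is an illustrative assumption, not a standard.
DARK_AGE_SECONDS = 2 * 365 * 24 * 3600

def inventory(root: str):
    """Walk a data repository and record location, size and last access."""
    now = time.time()
    records = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        stat = path.stat()
        records.append({
            "path": str(path),
            "size_bytes": stat.st_size,
            "last_accessed": stat.st_atime,
            "possibly_dark": (now - stat.st_atime) > DARK_AGE_SECONDS,
        })
    return records

if __name__ == "__main__":
    report = inventory("/data/shared")  # hypothetical repository path
    dark = [r for r in report if r["possibly_dark"]]
    print(f"{len(report)} files scanned, {len(dark)} flagged as possibly dark")
```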

2. AVAILABILITY

If data isn’t available, it’s useless. Agencies can’t protect it, use it or analyze it.

Agencies with highly visible and available data don’t just unlock insights – they find information faster and are more resilient. But just as important as using valuable data is getting rid of unimportant data, which does nothing but drive up costs and add risk, especially in the cloud.

“Having a long-term retention strategy before it moves to the cloud could save an agency millions upon millions of dollars,” Levine said.
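A back-of-the-envelope comparison shows where those savings come from. The data volume, storage rates and retention share below are illustrative assumptions, not real cloud pricing or agency figures:

```python
# Compare keeping everything in hot cloud storage vs. tiering or expiring
# data that is past its retention period. All figures are assumptions.

DATA_PB = 5                          # total data planned for the cloud
HOT_RATE_PER_GB_MONTH = 0.023        # assumed hot-storage price
ARCHIVE_RATE_PER_GB_MONTH = 0.002    # assumed archive-tier price
PAST_RETENTION_SHARE = 0.6           # assumed share past its retention period

gb = DATA_PB * 1024 * 1024
everything_hot = gb * HOT_RATE_PER_GB_MONTH * 12

with_retention_plan = 12 * (
    gb * (1 - PAST_RETENTION_SHARE) * HOT_RATE_PER_GB_MONTH
    + gb * PAST_RETENTION_SHARE * ARCHIVE_RATE_PER_GB_MONTH
)

print(f"Everything hot:       ${everything_hot:,.0f} per year")
print(f"With retention plan:  ${with_retention_plan:,.0f} per year")
print(f"Annual difference:    ${everything_hot - with_retention_plan:,.0f}")
```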

Veritas has a classification engine that alerts agencies to security or compliance risks for their data. It can also assign different policies for various datasets.
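The article doesn’t detail how that engine works internally, so the sketch below is only a generic illustration of rules-based classification; the patterns, policy names and thresholds are hypothetical:

```python
import re

# Illustrative patterns; real classification engines use far richer rules.
CLASSIFIERS = {
    "pii": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),             # SSN-like pattern
    "financial": re.compile(r"\b\d{4}(?:[ -]\d{4}){3}\b"),   # card-like pattern
}

# Hypothetical policies mapping a classification to handling rules.
POLICIES = {
    "pii": {"retention_years": 7, "encrypt": True, "alert": True},
    "financial": {"retention_years": 10, "encrypt": True, "alert": True},
    "general": {"retention_years": 3, "encrypt": False, "alert": False},
}

def classify(text: str) -> str:
    """Return the first matching classification, or 'general'."""
    for label, pattern in CLASSIFIERS.items():
        if pattern.search(text):
            return label
    return "general"

def assign_policy(text: str) -> dict:
    """Pick a policy for a piece of data and flag it if it carries risk."""
    label = classify(text)
    policy = POLICIES[label]
    if policy["alert"]:
        print(f"ALERT: {label} data detected, applying policy {policy}")
    return policy

assign_policy("Applicant SSN: 123-45-6789")  # triggers the PII policy
```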

3. PROTECTION

Cybercriminals attack governments more during crises. Nowadays, preparedness requires going beyond simple backup and recovery to enterprise data protection, Levine said.

Traditional backup and recovery solutions can fail in several ways. First, the most recent backup may be several days old, meaning any newer data is lost if a major application goes down. Second, a full recovery can take weeks, if it can be done at all. For many services, such as housing applications and tax payments, “weeks” is not a reasonable timeframe.
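Those two failure modes map to the standard measures of recovery point objective (how much recent data can be lost) and recovery time objective (how long restoration can take). The check below, with made-up targets and timestamps, sketches how an agency might verify a backup plan against both:

```python
from datetime import datetime, timedelta

# Illustrative targets; actual objectives vary by agency and application.
RPO = timedelta(hours=24)   # maximum acceptable data loss
RTO = timedelta(hours=8)    # maximum acceptable time to restore service

def meets_objectives(last_backup: datetime, estimated_restore: timedelta,
                     now: datetime) -> dict:
    """Report whether the most recent backup satisfies the RPO and RTO."""
    return {
        "rpo_ok": (now - last_backup) <= RPO,
        "rto_ok": estimated_restore <= RTO,
    }

# Example: a backup from three days ago that takes two weeks to restore
# fails both checks, which is the scenario described above.
result = meets_objectives(
    last_backup=datetime(2020, 9, 1, 2, 0),
    estimated_restore=timedelta(weeks=2),
    now=datetime(2020, 9, 4, 9, 0),
)
print(result)  # {'rpo_ok': False, 'rto_ok': False}
```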

Enterprise data protection speeds up recovery and safeguards backups across systems, which is especially important with many agencies placing data in the cloud. Agencies can even test their systems without going offline.

“Start working on a unified enterprise data protection strategy,” Levine said.

This article is an excerpt from GovLoop’s recent guide, “Your Data in the Year of Everything Else: A GovLoop Guide.” Download the full guide here.
