
Suffering From Data Downtime? You’re Not Alone

Today, data has become a new natural resource, powering many of the services we rely on. Think about your smartphone: every app and service runs on data. From restaurant and movie reviews to GPS directions and weather alerts, authoritative data, often made public by government agencies, helps provide these services.

With access to data now mission critical, government agencies must reduce data downtime to improve efficiency. That’s why Symantec and MeriTalk recently collaborated on the report, “The Drive to Thrive: Ensuring the Agile Data Center.” The study surveyed federal IT professionals, 80 percent of whom identified data center reliability as the top priority for their agency. To discuss the report further, GovLoop spoke with Skip Farmer, Solutions Architect at Symantec, and Tom Tittermary, Systems Engineering Manager at Symantec.

“Today, the amount of data growth is just incredible,” said Farmer. As government systems rely more and more on data to meet mission needs, agencies must consider how to make data more accessible and how to eliminate downtime.

The report noted that providing real-time access to data could save employees 17 hours per week, or 816 hours per year – which equals $32.5 billion in annual savings. The report also found that when downtime occurs, 42 percent of field workers are temporarily unable to do their jobs.

“Expectations are higher from end users. There’s an expectation, based on all the infrastructure, that data will be available exactly when needed and there won’t be any challenges to access it,” said Farmer. So when challenges do arise, employees find other ways to reach the data, often creating security risks or fostering “shadow IT” (the use of applications not sanctioned by the agency’s IT department). The study found that one in three respondents admitted to using personal devices to access information, and one in four look for workarounds, such as Google apps.

The report also found that 70 percent of respondents had experienced downtime of 30 minutes or more in the previous month, and 90 percent of those believed the downtime hurt efficiency and productivity. Yet even though agencies understand the risks, only 19 percent of respondents believe their organization is fully prepared to meet its most critical uptime and failover service-level agreements (SLAs), and 31 percent of agencies have a continuity of operations (COOP) plan that is either inefficient or nonexistent.

Tittermary offered an example of the challenges IT administrators often face in reducing downtime. “There was an interesting scenario with a three-tier application,” he said. “There was a web front end, applications in the middle, and a database back end – pretty standard stuff. The agency had some communication issues between the tiers, and each individual tier actually had a pretty solid idea of how to provide availability for that tier.”

The IT administrators for each tier could monitor their own tier and give their team the necessary tools. What the team lacked was the ability to see the solution end to end. “They didn’t have anything for if the database went down. Since it’s one application, there are three separate buckets with their own systems and personnel methodologies. But the database is not a separate thing from the web tier and the app tier. So if something happens to the database and a reboot or a restart has to happen, there’s an effect on the other tiers as well,” said Tittermary.

The client had no coordination across the tiers because it lacked a solution that could map the application across all three.

“We had to make sure that things were brought down and brought up in the right order, and that if there was an outage in one tier, the appropriate steps were taken in the other tiers to maintain availability for the application’s users,” continued Tittermary. “So, with our Storage Foundation suite, we had our clustering software in every individual tier, which helped manage those interactions for the whole application.”
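The ordering problem Tittermary describes can be sketched in a few lines. The Python below is a minimal illustration only, not Symantec’s clustering software: the tier names and start/stop hooks are invented, but the underlying idea is the same one a cluster manager expresses through service-group dependencies, where a tier’s back ends come up before it and come down after it.

```python
# Minimal sketch of dependency-ordered startup/shutdown for a
# three-tier application (web -> app -> database). Illustrative only.
from dataclasses import dataclass, field


@dataclass
class Tier:
    name: str
    depends_on: list["Tier"] = field(default_factory=list)
    running: bool = False

    def start(self) -> None:
        # Bring dependencies up first, so a tier never starts
        # against an unavailable back end.
        for dep in self.depends_on:
            if not dep.running:
                dep.start()
        print(f"starting {self.name}")
        self.running = True

    def stop(self) -> None:
        print(f"stopping {self.name}")
        self.running = False


def stop_stack(tiers: list[Tier]) -> None:
    # Stop in the reverse of startup order: outermost tier first,
    # database last, so no tier is left pointing at a dead back end.
    for tier in tiers:
        if tier.running:
            tier.stop()


if __name__ == "__main__":
    database = Tier("database")
    app = Tier("app", depends_on=[database])
    web = Tier("web", depends_on=[app])

    web.start()                       # starts database, then app, then web
    stop_stack([web, app, database])  # stops web, then app, then database
```

A real clustering product would also detect the database outage and trigger the restart automatically; the point here is simply that one coordinator must own the ordering across all three tiers, rather than each tier managing itself in isolation.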

This case study is one example of how Symantec is helping agencies’ data centers become more reliable and give users access to data on demand. “[Symantec] can help you figure out what data really should stay in your core datacenter, and what data you can utilize another datacenter for, by creating a risk profile of the data,” said Tittermary.
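As a rough illustration of the risk-profile idea, the sketch below scores a dataset on a few attributes and suggests a placement. The attributes, weights, and threshold are hypothetical, invented for this example rather than drawn from any Symantec tool; an agency would substitute its own classification criteria.

```python
# Hypothetical data "risk profile": higher scores suggest keeping the
# data in the core data center. Weights and threshold are illustrative.

def risk_score(contains_pii: bool, mission_critical: bool,
               recovery_window_hours: float) -> int:
    score = 0
    if contains_pii:
        score += 2  # sensitive records raise the stakes of moving the data
    if mission_critical:
        score += 2  # downtime here halts field work
    if recovery_window_hours < 1:
        score += 1  # tight failover SLAs favor the core site
    return score


def suggest_placement(score: int) -> str:
    return "core data center" if score >= 3 else "secondary data center"


if __name__ == "__main__":
    score = risk_score(contains_pii=True, mission_critical=True,
                       recovery_window_hours=0.5)
    print(suggest_placement(score))  # -> core data center
```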

