Hurdling Over Agencies’ Network Technology Issues

Mobility, advanced analytics, cloud computing, network function virtualization, and the Internet of Things – these are some of today’s top tech trends. And government is finally catching up to them. Many agencies are implementing these technologies to improve internal efficiencies, reduce costs, and improve engagement between the public and government.

But with these tech and IT modernizations come network challenges in cybersecurity, mobility, bandwidth, network consolidation, and cloud application performance.

In a recent GovLoop survey, 51% of our users said security is driving their network modernization, and 35% said they are working on their network upgrade right now.

So how does your agency improve network security and flexibility, while still lowering costs?

In GovLoop’s recent online training, Advancing Your Network to Keep up with New Technology Demands, topic experts Stephen Wallo, Brocade’s Federal Chief Solutions Architect, and Steve Ressler, Founder and President of GovLoop, discussed issues agencies are facing with their network technology – and what can be done to fix them.

First up? The hurdles. Several problems with agencies’ data and current technology are causing concerns about network performance:

  • Enterprise servers hold pent-up performance. Many agency systems are out of balance: server hardware is now capable of doing much more than it could in the past, to the point where storage is starting to become the bottleneck. As servers grow more capable, more applications are added and the rest of the infrastructure gets overrun.
  • New databases demand new levels of storage performance. Just because you have a faster server doesn’t mean the applications can fully perform. Many of the big applications process transactions faster but are also more performance sensitive, so problems can occur if they don’t get the data in the manner they expect.
  • Virtualization is mainstream. This affects servers and causes an “input/output (I/O) blender”: when multiple virtualized workloads share the same server, their individually orderly I/O streams get interleaved and hit the storage as effectively random I/O. Without proper storage, your server can’t function properly. Because it is being asked to do so many different things at once, it spends much of its time waiting on a struggling storage system to find information. Once the information is found and the data starts flowing, the server is fine; it’s the lookup on the storage system itself that causes problems and kicks the I/O blender in. (A minimal sketch of this effect follows the list.)
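
To make the I/O blender concrete, here is a minimal, purely illustrative Python sketch (not from the training itself): each virtual machine writes sequentially, but the hypervisor interleaves the streams, so the shared storage array sees what looks like random I/O.

    import random

    # Each VM issues a nice, sequential stream of block writes.
    def vm_stream(vm_id, start_block, count):
        return [(vm_id, start_block + i) for i in range(count)]

    streams = [vm_stream(vm, vm * 10_000, 8) for vm in range(4)]

    # The hypervisor services whichever VM has work ready, so the shared
    # storage array sees the four streams interleaved, not in order.
    blended = []
    while any(streams):
        s = random.choice([s for s in streams if s])
        blended.append(s.pop(0))

    for vm_id, block in blended:
        print(f"storage sees VM{vm_id} -> block {block}")

Each VM’s writes jump between block addresses thousands of positions apart once they are blended together, which is exactly the access pattern that slows a conventional storage system down.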

But there is a way forward. One way agencies can gain insight into what is happening in their network and tech environment is through analytics. Having one platform that manages all of your tech programs can optimize your data. This platform can improve:

  • Automation. This allows you to perform bulk operations across multiple devices at once. You can schedule jobs, push upgrades, or look at your data to see whether it’s flowing where it’s supposed to be and where you need to change things. (A rough sketch of this and the monitoring idea follows the list.)
  • End-to-end monitoring. This allows you to obtain performance statistics. You can move your data around and see what’s working for your agency and what’s not. It also gives you the ability to see if there’s something rogue in your environment. Analytics also allow you to determine if a data breach is occurring and where it’s coming from.
  • Considerable savings. Monitoring and managing your devices and hardware from a central location provides visibility into the virtual infrastructure, enables administrators to manage their storage network resources more effectively, and minimizes product training.
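
As a rough illustration of the bulk-operation and monitoring ideas above, a central management platform lets one script touch every device instead of logging into each switch by hand. The endpoints and field names below are hypothetical, not any vendor’s actual API:

    import requests  # assumes the platform exposes a simple REST API (hypothetical)

    PLATFORM = "https://netmgmt.example.gov/api"  # hypothetical management server

    def list_devices():
        return requests.get(f"{PLATFORM}/devices", timeout=10).json()

    def schedule_upgrade(device_ids, image, when):
        # One bulk call instead of a manual upgrade on every switch.
        return requests.post(
            f"{PLATFORM}/upgrades",
            json={"devices": device_ids, "image": image, "start": when},
            timeout=10,
        )

    def flag_rogue_flows(threshold_mbps=500):
        # End-to-end monitoring: pull per-flow statistics and surface
        # any heavy flow that isn't tied to a known application.
        flows = requests.get(f"{PLATFORM}/flows", timeout=10).json()
        return [f for f in flows if f["mbps"] > threshold_mbps and not f["known_app"]]

    devices = list_devices()
    schedule_upgrade([d["id"] for d in devices], image="nos-7.4.1", when="2015-11-01T02:00Z")
    print(flag_rogue_flows())

The point is the shape of the workflow, not the specific calls: one platform, one inventory, and the same data used for both automation and anomaly detection.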

For more information on advancing your network to keep up with technology’s demands, view the training on-demand here.
