How Analytics Are Transforming NV

One of the things that IT organizations often fail to appreciate about network virtualization (NV) is how much visibility it can provide into the overall IT environment. Network overlays typically expose a set of northbound application programming interfaces (APIs) through which analytics applications can pull far more data than was available when operating at the hardware level alone.
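To make that concrete, here is a minimal sketch of what an analytics application consuming such a northbound API might look like: it polls a hypothetical overlay controller for per-tenant flow statistics. The controller URL, endpoint path, field names and token are illustrative assumptions, not any particular vendor's interface.

```python
# Hypothetical example: polling an overlay controller's northbound REST API
# for per-tenant flow statistics. Endpoint, fields and token are placeholders.
import requests

CONTROLLER = "https://overlay-controller.example.com"  # assumed controller URL
TOKEN = "replace-with-a-real-token"

def fetch_flow_stats(tenant_id: str) -> list[dict]:
    """Return raw flow records for one tenant from the northbound API."""
    resp = requests.get(
        f"{CONTROLLER}/api/v1/tenants/{tenant_id}/flows",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("flows", [])

if __name__ == "__main__":
    for flow in fetch_flow_stats("tenant-42"):
        print(flow.get("src"), "->", flow.get("dst"), flow.get("bytes"))
```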

That visibility is at the heart of how analytics are transforming NV. Gaining access to that data is more critical than ever because of the convergence of application performance management (APM) and network performance management (NPM). IT organizations need to be able to correlate data all the way down to specific transactions to build a comprehensive picture of what any end user is experiencing at any given time. The challenge is that everything from the characteristics of the endpoint to the quality of the network connection can degrade the end user experience.

More Data Contributes to Analytics in NV

To better correlate all the data being generated by applications, the network, the IT infrastructure and the endpoints attached to the network, IT organizations are now collecting more big data than ever. The expectation is that applying analytics to all that machine data can establish what the normal application experience looks like, surface what actually needs to be fixed, and then identify any future deviations from that norm in real time.
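The core of that idea can be sketched in a few lines: learn a baseline from recent observations and flag anything that strays too far from it. The window size and threshold below are arbitrary illustrations, not a recommendation.

```python
# Minimal sketch of "learn the norm, flag deviations": keep a rolling baseline
# of application response times and flag samples that drift too far from it.
from collections import deque
from statistics import mean, stdev

class BaselineMonitor:
    def __init__(self, window: int = 100, threshold: float = 3.0):
        self.samples = deque(maxlen=window)   # recent "normal" observations
        self.threshold = threshold            # std-devs that count as abnormal

    def observe(self, response_ms: float) -> bool:
        """Record a sample; return True if it deviates from the learned norm."""
        is_anomaly = False
        if len(self.samples) >= 30:           # wait for a minimal baseline
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(response_ms - mu) > self.threshold * sigma:
                is_anomaly = True
        self.samples.append(response_ms)
        return is_anomaly

monitor = BaselineMonitor()
for latency in [52, 55, 49, 51, 50] * 10 + [240]:
    if monitor.observe(latency):
        print(f"Deviation from baseline: {latency} ms")
```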

For this reason, NV and network functions virtualization (NFV) vendors are increasingly targeting data and analytics in their products. Riverbed Technology recently moved to acquire Aternity to extend its monitoring portfolio out to the endpoint. Nik Koutsoukos, senior director of product marketing for SteelCentral monitoring tools at Riverbed, says the agent software created by Aternity will now be able to feed data back to the rest of the Riverbed SteelCentral portfolio of APM and NPM tools.

VMware, Cisco, Others Pursuing Analytics in NV

In a similar vein, VMware in June moved to acquire Arkin Net, which includes in its portfolio an analytics tool for tracking network flows across data centers that have installed VMware NSX virtualization software.

Yet another example of a networking vendor trying to turn data into actionable intelligence is Pluribus Networks, which recently extended VCFcenter, the network performance and monitoring tools it provides as an application on top of its software-defined networking (SDN) platform, by adding support for a Big Data repository.

Mark Harris, vice president of marketing for Pluribus Networks, says that repository will make it possible to correlate IT events occurring on the network against specific business processes.

Harris says it’s even conceivable that many organizations will get their first taste of SDN by deploying a more sophisticated approach to network monitoring.

The Tetration analytics initiative launched by Cisco is another move in this vein. It makes use of an implementation of Hadoop and the Apache Spark in-memory computing framework to correlate data from all across the data center, including the network virtualization layer. The goal is not just to provide better analytics: Cisco wants to use that data to proactively recommend how best to deploy and configure both the networks and the systems attached to them, based on the characteristics of the applications running on top of them.
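As a rough illustration of that general pattern, and not a description of Tetration's actual pipeline, the following Spark sketch joins invented network flow records with invented application metrics to correlate traffic volume and response time per host, minute by minute.

```python
# Illustrative Spark job: correlate network flow telemetry with application
# metrics per host per minute. All paths, schemas and column names are
# assumptions made up for this sketch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("flow-app-correlation").getOrCreate()

# Flow records exported by the NV layer (assumed schema: src_ip, dst_ip, bytes, ts).
flows = spark.read.json("hdfs:///telemetry/flows/")
# Application metrics from the servers (assumed schema: host_ip, app, latency_ms, ts).
app_metrics = spark.read.json("hdfs:///telemetry/app/")

# Aggregate inbound traffic per host per minute.
flow_agg = (flows
    .withColumn("minute", F.date_trunc("minute", F.to_timestamp("ts")))
    .groupBy("dst_ip", "minute")
    .agg(F.sum("bytes").alias("bytes_in")))

# Average application latency per host per minute.
app_agg = (app_metrics
    .withColumn("minute", F.date_trunc("minute", F.to_timestamp("ts")))
    .groupBy("host_ip", "app", "minute")
    .agg(F.avg("latency_ms").alias("avg_latency_ms")))

# Join the two views: which hosts saw heavy traffic and slow responses in the same minute?
correlated = app_agg.join(
    flow_agg,
    (app_agg.host_ip == flow_agg.dst_ip) & (app_agg.minute == flow_agg.minute),
).select(app_agg.app, app_agg.host_ip, app_agg.minute,
         flow_agg.bytes_in, app_agg.avg_latency_ms)

correlated.orderBy(F.desc("bytes_in")).show()
```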

Krista Macomber, a senior analyst with Technology Business Research (TBR), says that, given the demands for increased IT agility, it’s clear organizations need more visibility than ever into network bottlenecks. In fact, Macomber notes, it becomes a lot simpler to embrace a broad range of software-defined infrastructure (SDI) initiatives when the network is no longer viewed as an inhibitor of I/O performance.

How Analytics Are Transforming NV With More Data

Clearly, the implications of being able to holistically correlate data drawn from all the layers of a distributed computing environment are profound. Not only will companies be able to better optimize the flow of data moving across the network, but in many instances they will gain visibility into how any given business process is being consumed. That, in turn, will let the business know what needs to be augmented from an IT perspective or, just as likely, retired because the analytics make it clear it is barely being used.

The dirty little secret of IT is that most IT departments don’t really know which applications and services depend on which IT infrastructure at any given moment. Because the network by definition touches everything in the enterprise, NV could provide a unique opportunity to create a common layer for collecting big data, yielding a level of intelligence that in most cases will be unprecedented.
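One simple way to picture how network-level data can expose those dependencies is a toy mapping built from observed flows between services; the service names and flows below are invented for illustration, while real records would come from the NV layer's telemetry.

```python
# Toy dependency map: derive which services talk to which from observed flows.
from collections import defaultdict

# (source service, destination service) pairs observed on the network (invented).
observed_flows = [
    ("web-frontend", "orders-api"),
    ("orders-api", "postgres"),
    ("orders-api", "payments-api"),
    ("web-frontend", "auth-api"),
]

dependencies = defaultdict(set)
for src, dst in observed_flows:
    dependencies[src].add(dst)

for service, deps in sorted(dependencies.items()):
    print(f"{service} depends on: {', '.join(sorted(deps))}")
```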