Little progress has been made toward developing network virtualization (NV) standards, and none appears likely anytime soon.
The Open Networking User Group (ONUG) has been trying to spearhead an Open Interoperable Control Plane (OICP) initiative. The initiative aims to address interoperability issues not only among NV implementations, but also between network overlays and the physical underlays they run on, and between on-premises and cloud implementations of NV software.
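To make the overlay/underlay distinction concrete: most NV overlays today encapsulate tenant traffic in VXLAN (RFC 7348), tunneling virtual Layer 2 frames over the physical IP underlay. The sketch below, a hypothetical illustration not drawn from the article, builds the 8-byte VXLAN header to show what actually crosses the underlay, and why encapsulation choices like this are exactly where interoperability questions arise.

```python
import struct

VXLAN_UDP_PORT = 4789  # IANA-assigned destination port for VXLAN (RFC 7348)

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header: flags word with the valid-VNI
    bit set, followed by a 24-bit VXLAN Network Identifier (VNI)."""
    if not 0 <= vni < 2**24:
        raise ValueError("VNI must fit in 24 bits")
    flags = 0x08000000  # 'I' flag: VNI field is valid; other bits reserved
    return struct.pack("!II", flags, vni << 8)  # VNI occupies the top 3 bytes

# Each overlay frame pays roughly 50 bytes of encapsulation overhead
# (outer Ethernet + IP + UDP + this header) on the physical underlay.
hdr = vxlan_header(5000)
```

Every vendor's overlay must agree on details like this header layout and the UDP port for traffic to interoperate, which is why the underlay/overlay boundary figures so prominently in the OICP discussion.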
But ONUG founder and vice chairman Nick Lippis says that effort has been stymied by a lack of participation from the vendor community.
“Nobody on the vendor side is willing to work on this,” says Lippis.
Engineering Network Virtualization Standards
ONUG has managed to publish a white paper outlining the standards issues that need to be addressed. But any actual engineering work will have to be left either to the vendors or to some other standards body willing to take on the project.
“It usually takes four or five years for a standard to finally be developed,” says Shenoy.
That’s a problem for enterprise IT organizations, which usually wait for standards to be in place before implementing any technology on a larger scale.
Carlos Matos, director of global network architecture for Fidelity Investments, who also serves as chair of the OICP committee created by ONUG, says the absence of standards will limit investments across the entire NV category.
“We don’t tend to like to make investments on a large scale unless there are standards in place,” says Matos.
But Weaveworks CEO Alexis Richardson says that standards only matter to a segment of the overall market. He says developers and organizations that tend to adopt open source software will be the driving forces behind NV adoption.
“There are really three market segments,” says Richardson. “Larger enterprise organizations are the ones that will wait for the standards.”
Richardson says the most relevant NV interoperability work being conducted now is being done by the Internet Engineering Task Force (IETF), but he concedes that it will be years before anything approaching a standard emerges.
Seeking Network Virtualization Standards Leverage
In the meantime, vendors are clearly jockeying for leverage before any substantive network virtualization standards conversations take place. Those with less market share often pool their influence to try to create an open standard. After several years of wrangling, the various parties usually come to some agreement. How relevant that standard is after the market has already voted with its wallet in favor of one vendor or another is debatable.
In the case of NV technologies, however, enterprise IT organizations may not have the time for all those niceties to play out. Most of them are under pressure to create more agile IT environments, and the least agile part of those environments is clearly the physical network.