This article is underwritten by VMware. The underwriter of this article helps fund its creation but it has no control over the specific content of the article.
A conflict is growing between DevOps teams and IT departments over access to IT infrastructure for experimenting with applications.
Instead of waiting for internal IT departments to provision IT infrastructure for application development, developers are starting to access application programming interfaces (APIs) through public cloud services.
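That self-service pattern can be sketched in a few lines. The helper below is purely illustrative: the function name and parameters are hypothetical, loosely shaped like a cloud provider's launch-instance API call (for example, EC2's RunInstances), and no real cloud service is contacted.

```python
# Illustrative sketch of self-service provisioning: instead of filing a
# ticket with IT, a developer assembles an API request for short-lived
# experiment machines. Names here are hypothetical, loosely modeled on
# the shape of an EC2 RunInstances call; nothing is actually launched.

def build_launch_request(instance_type, count=1, tags=None):
    """Assemble the parameters a developer would submit to a cloud
    provisioning API to get machines for a few hours of experimentation."""
    if count < 1:
        raise ValueError("must request at least one instance")
    return {
        "InstanceType": instance_type,
        "MinCount": count,
        "MaxCount": count,
        # Tagging experiment machines makes them easy to find and
        # terminate later, avoiding the resource-hoarding problem
        # the article describes.
        "TagSpecifications": [{
            "ResourceType": "instance",
            "Tags": [{"Key": k, "Value": v} for k, v in (tags or {}).items()],
        }],
    }

request = build_launch_request(
    "x1e.32xlarge",  # a large-memory instance type, for illustration
    tags={"purpose": "experiment", "ttl-hours": "4"},
)
print(request["InstanceType"])
```

The point of the sketch is the workflow, not the specific parameters: the request is assembled and submitted programmatically in minutes, and the tags make the temporary machines easy to reclaim afterward.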
At the Amazon Web Services Summit in New York this month, Adrian Cockcroft, VP of cloud architecture for AWS, said that the single biggest reason organizations opt for the public cloud today is speed. In the past, developers who requested local infrastructure resources would either be told they were unavailable or would only receive a limited amount of time to use them.
Not surprisingly, developers didn’t want to give up those resources when they were not using them. Instead, they held on to as much infrastructure as possible, resulting in wasted resources. Now experts say the time spent provisioning hardware is eliminated because developers can get access to 4TB machines running on a public cloud in a few minutes. Cockcroft added that the same agility will soon be possible on 8TB, 16TB, and even 32TB systems.
Plus, most internal IT organizations are unable to provide developers with access to a machine just to experiment with some code for a few hours.
This conflict between DevOps departments and IT is just one example of the shifting of workloads to the cloud. And this trend has not been lost on vendors such as VMware. Not only is VMware moving aggressively to make its software available on public clouds such as AWS and IBM Cloud, it’s also turning local data centers into programmable resources via multiple software-defined data center (SDDC) initiatives.
Dirk Hohndel, vice president and chief open source officer for VMware, said that VMware is reaching out to the developer community. Hohndel noted that because very little open source software is written with an enterprise-class data center deployment in mind, VMware has been putting development resources toward making the company’s software more accessible to developers. At the same time, the company is making numerous contributions to open source technologies such as OpenStack and container projects. These can run either on VMware or on the lightweight Photon Linux distribution, which VMware is infusing with technologies such as VSAN storage virtualization and NSX network virtualization software.
Brad Casemore, IDC analyst, said winning over the hearts and minds of developers is crucial for VMware. There’s been a clear shift in power toward developers, who generally prefer open source software simply because they can download it without having to speak to a salesperson or fill out a purchase order, said Casemore. The shift to Docker containers, Casemore added, is only the latest manifestation of how decisions made by developers are requiring IT operations teams to adjust to a new reality in which virtual machines may not wind up being the default operating environment.
Regardless of what platform is used, however, the war for control of the cloud is just beginning. Steven Horwitz, CEO of Racemi, a provider of cloud migration software, noted that only 20 percent of the IT market has made the shift to the cloud. Most of what is running in the cloud today would be classified as a greenfield application. The next big wave will be brownfield applications that IT operations teams are lifting into the cloud. Those applications will wind up running on a mix of virtual machines and bare metal servers depending on the biases of the IT operations teams making those decisions, Horwitz said.
Of course, it’s possible many of those application workloads will be moved back and forth between public and private clouds, especially once they are containerized. The challenge now will be fostering some level of agreement between developers and IT operations teams about how those workloads get provisioned, deployed, and managed.