Cloud platforms have established themselves across the enterprise space as an important part of doing business. Whether private, public, or hybrid, many large-scale enterprises have plans in place to take advantage of cloud deployments.
As is always the case, however, new technologies have emerged that promise finer-grained control over established processes. In the cloud, two of the latest are containers and serverless computing.
Both take the established concept behind virtual machines (VMs) to the next level, promising greater efficiency and lower operating expense. Containers are further along the evolutionary scale in terms of development and adoption, but serverless computing is beginning to garner significant buzz for the same reasons.
Case for Containers
Containers can save cloud resources by virtualizing at the operating system level, packaging an application and all its dependencies into an isolated filesystem abstracted from the underlying infrastructure.
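That packaging idea can be sketched with a container build file. The base image, file names, and commands below are illustrative assumptions, not details from any deployment discussed here:

```dockerfile
# Hypothetical build file: the application and its dependencies are
# layered onto a minimal base image, so the resulting container runs
# in its own filesystem, isolated from the host it lands on.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

Because the image carries everything the application needs, the same artifact can move between hosts or clouds without depending on what is installed underneath.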
There is momentum building for containers, with enterprises focusing on using microservices architecture as a platform for managing large, distributed applications with agility and flexibility.
A 451 Research report from earlier this year forecasts the container ecosystem to grow from $762 million in revenues last year to nearly $2.7 billion in 2020. Forrester Research estimates that 31 percent of all enterprise IT organizations have already deployed containers.
One of the more striking takeaways from the survey was the increased use of containers, which surged from just 8 percent in 2016 to 45 percent this year. Of the 55 percent of respondents not currently using containers, 45 percent said they expect to make the move in the next year.
“That will set the stage towards the future, where containers become the new dominant application platform to be managed, instead of virtual machines,” the report noted.
Rob Szumski, product manager at CoreOS, explained that containers can have a lifespan as short as five minutes. He said this means their drain on cloud resources can be much more controlled than that of VMs, which can often “stick around for months.”
However, timing and familiarity remain challenges for container adoption.
Dustin Kirkland, VP of product at Canonical, said containers are still relatively new in the eyes of enterprise customers who have only recently come to understand the benefits of VMs.
“It’s a boring challenge, but containers just need more time in front of enterprises,” Kirkland said. “New technology needs time to be adopted. Even though it seems like Docker sort of came out of nowhere to become this big, new thing, it really has taken some time for adoption.”
Jay Lyman, principal analyst for cloud management and containers at 451 Research, said enterprises are also burdened with legacy platforms that are likely to require more time and care in transplanting into a container environment.
“That’s going to take some time,” Lyman said. “We will probably see containers over time replacing VMs as the cloud platform of choice, but there will also be plenty of containers running on top of VMs.”
Space for Serverless
With containers viewed as a more streamlined version of VMs, serverless computing is seen as a trimmer iteration of containers. While containers are seen as ideal for applications that might need several minutes to run, serverless computing can handle applications that need only seconds.
Serverless computing is designed to reduce the amount of overhead associated with offering services in the cloud. This includes the ability for a cloud provider to dynamically manage server resources.
By reducing potential waste and taking advantage of functions’ small resource footprint, serverless computing also enables pricing models that typically meter access in tenths of a second.
Serverless computing has been around for some time, with Canonical’s Kirkland noting he used to dabble in early versions of serverless in college to run ticket-purchasing applications.
More recently, serverless computing gained traction following the launch by Amazon Web Services (AWS) of its Lambda platform. Lambda allows an organization to pay only for the compute time consumed – there is no charge when the code is not running.
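In Lambda’s model, the unit of deployment is a single function the platform invokes per request. The handler below is a minimal sketch; the event fields and response shape are illustrative assumptions rather than details from the article:

```python
# Minimal AWS Lambda-style handler in Python. The platform calls the
# configured function once per request and bills only for the time it
# actually runs; there is no server for the developer to manage.
import json

def lambda_handler(event, context):
    # "event" carries the request payload; "context" exposes runtime
    # metadata (unused in this sketch).
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

When no requests arrive, no handler runs and nothing is billed, which is the core of the pay-per-use appeal described above.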
In a recent report, 451 Research noted serverless compute pricing is typically based on three parameters: script duration, or how long the code is used; the number of requests; and the memory required for the function.
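The three parameters combine into a simple cost model. The sketch below shows the arithmetic; the per-GB-second and per-request rates are hypothetical placeholders, not actual provider prices:

```python
# Illustrative serverless cost model built from the three pricing
# parameters 451 Research describes: script duration, request count,
# and memory allocated to the function.
PRICE_PER_GB_SECOND = 0.0000166667   # assumed compute rate
PRICE_PER_MILLION_REQUESTS = 0.20    # assumed request rate

def monthly_cost(requests, avg_duration_s, memory_mb):
    """Estimate monthly spend for one function under the assumed rates."""
    # Compute charge scales with duration times allocated memory.
    gb_seconds = requests * avg_duration_s * (memory_mb / 1024)
    compute = gb_seconds * PRICE_PER_GB_SECOND
    # Request charge scales only with invocation count.
    request_fees = (requests / 1_000_000) * PRICE_PER_MILLION_REQUESTS
    return compute + request_fees

# 10 million invocations, 200 ms each, at 128 MB:
cost = monthly_cost(10_000_000, 0.2, 128)  # roughly $6.17 at these rates
```

The contrast with a VM is that all three inputs fall to zero when the function is idle, whereas a provisioned VM accrues charges whether or not it is serving requests.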
The report favorably compared the total cost of ownership (TCO) of serverless computing to that of VMs. Owen Rogers, research director at the analyst firm, explained VMs needed to be up and running before a function request was placed, thus an enterprise would “need to pay for that, and there is an element of waste when capacity is not being used.”
“With serverless, we don’t have this problem,” Rogers said. “It scales instantly with a request. You just need to configure the request with the function and no other concerns.”
Analysts have noted that these potential cost savings have garnered attention from enterprises.
“There is a lot of interest in serverless as it’s in tune with consuming and paying for exactly what I need and not any more,” said Clifford Grossner, senior research director and advisor at IHS Markit.
Serverless computing is also seen as potentially more secure than VMs and containers, an important consideration for risk-averse enterprises.
Guy Podjarny, CEO and founder of security firm Snyk, broke down the pros and cons of going serverless in a recent blog post.
On the pro side, Podjarny cited the absence of unpatched servers, and thus of vulnerable binaries; that denial-of-service attacks become billing issues; and that serverless “immutability eliminates compromised servers.”
“Serverless moves the responsibility for server management from the application owner to the platform provider,” Podjarny said. “These pesky servers are notoriously hard to secure, but the experts managing the platforms handle it quite well.”
On the flip side, Podjarny said concerns include greater difficulty in monitoring security; the potential for a larger attack surface due to the increased flexibility of serverless; and challenges in securing third-party services and data in transit.
While the serverless computing hype is strong, the time horizon for enterprise adoption is viewed as further out than that of containers.
“Serverless is a very interesting and appealing option for enterprises, but still probably two to three years away on the enterprise IT radar,” Lyman said.
A Place for Both?
With containers and serverless computing occupying somewhat similar positions in terms of cloud utilization, could there be room for both?
Some have offered up the notion that containers and serverless are likely to coexist, with serverless functions perhaps running on top of more mature container deployments. This would echo current ideas around running containers in established VM environments.
Marc Woolward, CTO at cloud security provider vArmour, said serverless is a “really good application for containers.”
“Deploying a serverless app on top of containers can be powerful,” Woolward said.
Kirkland cited a number of vendors already running serverless computing deployments on top of the Kubernetes container orchestrator, including Galactic Fog and Bitnami.
“Once the Kubernetes is up and running, you can add an app to schedule these functions as a service on top of Kubernetes,” Kirkland explained. “I like that those are additional solutions on top of a simple Kubernetes base as opposed to integrated into the Kubernetes itself. It keeps Kubernetes simple.”
With containers and serverless computing still evolving, it would seem their final place in the ecosystem remains unwritten.