IBM launched quasi-support for Knative as an “experimental managed add-on” to its Cloud Kubernetes Service. The move shows growing confidence in the commercialization of the Kubernetes-based serverless architecture.
The support will allow organizations to deploy Knative in a Kubernetes cluster using a one-click install via the Cloud Kubernetes Service user interface. The deployment model also works for the Istio service mesh platform.
Doug Davies, offering manager for Knative at IBM, did warn in a blog post that the implementation is still in the “experimental stage since the project is continuing to mature and evolve.” But he added that as the maturation occurs, “IBM Cloud will provide even deeper integration with the rest of the IBM Cloud platform.”
Jason McGee, vice president and CTO for IBM’s Cloud Platform, told attendees at the recent KubeCon + CloudNativeCon North America 2018 event in Seattle that Knative was an important project in unifying the dozens of serverless platforms that have flooded the market.
“That fragmentation, I think, holds us all back from being able to really leverage functions as part of the design of our applications,” McGee said during his keynote. “I think Knative is an important catalyst for helping us come together to bring functions and applications into our common cloud native stack in a way that will allow us to move forward and collaborate together on this common platform.”
He added that Knative also teaches Kubernetes how to deal with building and serving applications and functions, which makes it an important piece in the cloud-native landscape.
IBM was one of the initial developers of Knative, which launched last July. Other companies involved in that development included Google, Pivotal, SAP, and Red Hat.
Knative was developed as a way to provide an open source set of components that allow for the building and deployment of container-based serverless applications that can be transported between cloud providers. It’s specifically focused on orchestrating source-to-container builds; routing and managing traffic during deployment; auto-scaling workloads; and binding services to event ecosystems.
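To make that concrete, a Knative Serving deployment typically boils down to a single Kubernetes-style manifest. The sketch below is a minimal example assuming the current `serving.knative.dev/v1` API and Knative’s public `helloworld-go` sample image; the exact schema has shifted across the project’s early releases (the 0.x releases described in this article used an alpha API), so treat the field names as illustrative rather than definitive.

```yaml
# Minimal Knative Service: Knative builds the routing, revisioning,
# and scale-to-zero autoscaling around this single resource.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello            # placeholder name
spec:
  template:
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go  # any HTTP container works here
          env:
            - name: TARGET
              value: "Knative"
```

Applied with `kubectl apply -f`, this one resource stands in for the Deployment, Service, Ingress, and autoscaler objects a developer would otherwise wire together by hand, which is the portability argument the project makes.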
The platform aims to free serverless development from platforms that are tied to a single cloud provider. These include hosted services such as Amazon Web Services (AWS) Lambda, Microsoft Azure Functions, and Google Cloud Functions.
Knative’s continued maturation, as Davies mentioned, should start to accelerate in the near term. The platform maintainers last month announced a schedule that will see new releases based on a six-week cycle.
“As we move to a more predictable release schedule based on [a] six-week cadence, Knative releases will now be smaller and more frequent,” explained Mark Chmarny, who is the technical program manager for serverless at Google Cloud, in a blog post. “We did this to enable a tighter feedback loop with our users and allow for smoother course corrections as we continue to learn from our growing number of users.”
The Knative community unveiled the 0.3 release last month, which was about three months after the previous release.
A number of vendors have jumped on the Knative bandwagon. The most obvious were those that participated in its initial development. Pivotal, for instance, launched its Pivotal Functions Service (PFS) that uses components from the Knative project to manage deployment and operation of serverless functions across private and public cloud providers.
Knative has also attracted others to the ecosystem that see an advantage in using Kubernetes as a way to manage serverless deployments. As an example, startup TriggerMesh launched late last year with a serverless management platform that runs on top of Knative. That platform has already been tapped by GitLab to power its serverless product.