Serverless computing may still be years away from mainstream adoption, but potential security advantages of the platform could help speed the process.
Serverless computing architectures are designed to reduce the amount of overhead associated with offering services in the cloud. This includes the ability for a cloud provider to dynamically manage server resources.
As part of their dynamic nature, serverless architectures are often described as function-as-a-service (FaaS), or a more nimble version of containers. Serverless applications are designed to activate almost instantly and can be given a finite lifespan that cuts off all activity once a function completes.
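In practice, a serverless function is little more than a small handler the platform invokes per request and then tears down. A minimal sketch in the style of an AWS Lambda handler (the event fields and greeting logic here are illustrative, not tied to any one provider's API):

```python
import json

def handler(event, context):
    """Entry point invoked by the platform for each request.

    The function spins up on demand, runs to completion, and the
    platform reclaims its resources -- there is no long-lived server
    for an attacker to target between invocations.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The application owner writes only this handler; provisioning, patching, and scaling the machine underneath is the platform's job.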
Owen Rogers, research director at 451 Research, explained that compared with traditional virtual machines (VMs) that need to be up and running before a function request is placed, serverless allows for more immediate, fine-grained control.
“It scales instantly with a request,” Rogers said. “You just need to configure the request with the function and no other concerns.”
This agility is viewed as a benefit in terms of security.
Guy Podjarny, CEO and founder of security firm Snyk, broke down the pros and cons of going serverless in a recent blog post.
On the pro side, Podjarny cited the absence of unpatched servers, and with them of vulnerable binaries; that denial-of-service attacks become billing issues; and that serverless “immutability eliminates compromised servers.”
“Serverless moves the responsibility for server management from the application owner to the platform provider,” Podjarny said. “These pesky servers are notoriously hard to secure, but the experts managing the platforms handle it quite well.”
Serverless platform providers include Amazon Web Services (AWS), Microsoft Azure, IBM OpenWhisk, and Google Cloud Platform.
On the flip side, Podjarny said concerns include greater difficulty in security monitoring; a potentially larger attack surface due to serverless’s increased flexibility; and challenges in securing third-party services and data in transit.
Timing Is Everything
Rich Sharples, senior director for product management at Red Hat’s JBoss Middleware division, noted that the short shelf life of a serverless application is a boon for security. Serverless functions are generally tasked with a single job that should take only a short amount of time to complete.
“The way serverless is executed makes certain kinds of exploits difficult,” Sharples explained. “You can optimize a function so that if it does not finish running in five seconds it gets killed. This removes potential attacks from sophisticated exploits that take hours or days to evolve.”
Sharples also said an application programming interface (API) gateway can handle many of the security tasks for serverless deployments, exposing a serverless function without the traditional overhead of managing an API directly.
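One way to picture the gateway's role is as a thin front door that authenticates and routes requests before any function runs. A minimal Python sketch (the token check, route table, and response shape are invented for illustration, not any vendor's gateway API):

```python
def gateway(routes, valid_tokens):
    """Return a dispatcher that authenticates, routes, then invokes.

    Auth and routing live in the gateway, not in each function --
    the kind of security task Sharples describes offloading.
    """
    def dispatch(path, token, payload):
        if token not in valid_tokens:
            return {"status": 401, "body": "unauthorized"}
        fn = routes.get(path)
        if fn is None:
            return {"status": 404, "body": "no such function"}
        return {"status": 200, "body": fn(payload)}
    return dispatch
```

Each serverless function behind the gateway can then stay a plain, single-purpose handler with no auth logic of its own.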
As with many cloud security challenges, Sharples placed the basic responsibility on programmers to follow good practices.
“Serverless solves a lot of problems and doesn’t create many new problems,” Sharples said. “But, what it does not get rid of is that you can still deploy bad code. Developers still need to make sure they are putting in good code or all security advantages go out the window.”