Linux containers virtualize the operating system, partitioning it into isolated compartments that run container-based applications or workloads. This lets code be packaged into small, easily portable units that can run anywhere Linux is running.
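As a rough sketch of what that packaging looks like in practice (the base image, script name, and tag below are hypothetical, not drawn from any specific product):

```dockerfile
# Minimal Dockerfile sketch: package a small script into a portable image.
# The script hello.sh and the alpine base image are illustrative choices.
FROM alpine:3.19

# Copy the application into the image and make it executable.
COPY hello.sh /hello.sh
RUN chmod +x /hello.sh

# The command the container runs when started.
CMD ["/hello.sh"]
```

An image built from a file like this (e.g. with `docker build -t hello .`) can then be started with `docker run hello` on any host with a compatible container runtime, which is the portability the article describes.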
The main appeal of containers, according to users of the approach, is that they speed up building, testing, and deploying applications. Containers also allow applications to be distributed across cloud environments and moved between them easily, which has made them a key component of the DevOps movement.
Container technology has been around for a while, but the arrival of Docker broadened its appeal by adding a robust platform, an image registry, and integrated management tools, which greatly spurred adoption. Docker's users have been drawn to containers for agile development in the cloud. In addition, hundreds of millions of dollars have been invested in dozens of container technology startups building orchestration, management, networking, and security tools for the container ecosystem.
There are some downsides to container technology: It is largely limited to Linux environments and is relatively new compared with other enterprise technologies. Containers also require specific expertise and their own security safeguards.
This webinar, hosted by Ingo Fuchs, Chief Technologist of Cloud and DevOps at NetApp, will address how organizations can adapt to the changing cloud landscape. Enterprises today need to innovate faster and embrace digital transformation to achieve the business outcomes they want. By leveraging hybrid multi-cloud as a platform for running applications, enabling DevOps, and using containers, they can become more agile, positioning themselves to disrupt rather than be disrupted.