Docker is an open source software development platform. Its main benefit is packaging applications in containers, making them portable to any system running a Linux or Windows operating system (OS); a Windows machine can run Linux containers by using a virtual machine (VM). Container technology has been around for a while, but momentum and hype around Docker’s approach have pushed containers to the forefront. While Docker is a major player in the field, it is only one form of container technology. Read this SDxCentral article to learn how Docker containers work.
Docker Containers: Another Form of Virtualization
Think of a container as another form of virtualization. VMs, one such form, allow a single piece of hardware to host multiple operating systems as software; they are added to the host machine so that its hardware can be shared among different users while appearing as separate servers or machines. Containers instead virtualize the OS, splitting it into isolated compartments in which containerized applications run.
This approach lets code be packaged into smaller, easily transportable pieces that can run anywhere Linux or Windows is running. It is a way to make applications even more distributed and to strip them down into specific functions.
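As an illustration of that packaging model, a container image is typically defined in a Dockerfile. The minimal sketch below is an assumption for this article, not an example from it: the base image, file names, and start command are all illustrative. It bundles one small Python application with only what it needs to run.

```dockerfile
# Illustrative Dockerfile: package one small app with only the bare minimum.
# The base image, file names, and start command are assumptions for this sketch.
FROM python:3.12-slim

WORKDIR /app

# Install only the application's declared dependencies.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself.
COPY app.py .

# The container runs this single process when it starts.
CMD ["python", "app.py"]
```

Because the resulting image carries only the runtime and the app, the same image runs unchanged on any host with a compatible container runtime.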
A comparison of how containers and virtual machines are organized. Source: Docker
Docker debuted in March 2013, when its creator, Solomon Hykes, released the code as open source. The company that supports and develops the code is simply known as Docker; it received $40 million in venture capital funding from Sequoia Capital in September 2014. The platform consists of Docker Engine, a runtime and software packaging tool, and Docker Hub, a cloud service for sharing applications.
Both the Docker open source container and the company’s approach are appealing, especially for cloud applications and general development. This is partly because a container only has the bare minimum software required to run an application, making it an efficient approach to running applications.
The company’s approach also speeds up application development and testing.
Development is faster because multiple teams can work concurrently on small parts of an app that live in different containers. Testing is safer because containers can serve as sandboxes for testing services without affecting the larger system. The lightweight nature of containers also improves application portability: they are an efficient, fast way to move pieces of software around in the cloud.
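A sketch of that sandbox workflow, using standard Docker CLI commands (the image name and ports are illustrative assumptions, not from the article): a service can be built and exercised in a throwaway container without touching the host system.

```shell
# Build an image from the Dockerfile in the current directory.
# The tag "my-service" is an illustrative name.
docker build -t my-service .

# Run it as a disposable sandbox: --rm removes the container on exit,
# and -p maps host port 8080 to the container's port 80.
docker run --rm -p 8080:80 my-service

# Once the container exits, nothing is left running on the host.
```

The same commands work on any machine with Docker installed, which is what makes containers convenient for testing and for moving software between environments.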
Portability and Scalability
Container technology allows for a much larger scale of applications in virtualized environments, because of the efficiencies of virtualizing the OS. In DevOps and testing, the applications can be built and tested much more quickly.
One downside of open source container technology is that it is limited to Linux and Windows environments. Containers also demand a high level of expertise: when multiple teams work on small parts of an app, the container-based architecture becomes complex. Container-based apps scale in and out quickly, making it hard for traditional network and endpoint controls to keep pace and properly secure the containers. Containers pose a security risk as well: they are a new attack surface in general, and their APIs and control planes in particular expose application internals.
Updated May 2019