How many times have you tried installing an application, only to fail because of mismatched dependency versions? Dependencies are cumbersome enough on their own, but juggling different version requirements on a single machine can be a show-stopper. Nowadays virtual machines (VMs) are widely used to solve this problem: each application runs inside an emulated operating system (OS), in its own contained scope, performing only the desired task. Running a full OS for every single application may not be a problem on production servers, where cost is less likely to be an issue, but for developers working on a local machine with limited hardware resources it is often not feasible. And even when a VM isolates an application, you still have to install and configure each environment and the applications it depends on.

[Figure: virtual machines vs. containers]
This is where Docker saves the day. Docker is an open source containerization platform; instead of emulating an OS for every single application, Docker uses the host's kernel. This means we only need one OS (the host), no matter how many applications we run concurrently. Docker handles setting up the environment, installing dependencies, and running the actual application, which eliminates inconsistencies whether the application runs in a local, staging, or production environment. Setting up a new environment takes minutes: an image is used to create a container, and each application is isolated in its own container, which reduces interference between applications while still providing a simple way to share network and storage.
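As a quick taste of how little setup a container needs, here is a sketch (it assumes Docker is installed and uses the official `nginx` image from Docker Hub):

```shell
# Pull the official nginx image and start a container from it.
# -d runs it in the background, -p maps host port 8080 to container port 80.
docker run -d -p 8080:80 --name web nginx

# The container is now serving on http://localhost:8080.
# Stop and remove it when done:
docker stop web
docker rm web
```

No web server was installed on the host; everything the application needs lives inside the container.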

Okay, you may be asking yourself, what is an image or a container? To simplify this, we can make an analogy with object-oriented programming: think of an image as a class and a container as an instance of that class. An image can be described as a set of instructions resulting in an immutable state. As a basic example, let's create an image running a Node.js application. We do this by first creating a file (Dockerfile) with the following instructions:

FROM node:latest

WORKDIR /app

COPY package.json /app
RUN npm install
COPY . /app

CMD ["node", "index.js"]

So, what does this do? On the first line, we specify that we want to use the official node image as a base image, which has Node.js installed and configured. We then set the container's working directory and copy the package.json file from our host into the container. Next, we run a shell command that installs the dependencies listed in package.json. We then copy our application files into the app directory in the container. Finally, we specify the command that should run when a container is started from this image.
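Building the image and starting a container from it looks like this (the image name `my-node-app` is just an example):

```shell
# Build the image from the Dockerfile in the current directory
# and tag it so we can refer to it by name.
docker build -t my-node-app .

# Start a container from the image; the CMD in the Dockerfile
# ("node index.js") is what runs inside it. --rm removes the
# container once it exits.
docker run --rm my-node-app
```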

In the example, we are using a base image pulled from the Docker Hub registry. The base image itself is built on top of other images. Each instruction specified in the Dockerfile results in a layer in the image, so an image can consist of a great many layers. Different images may also share layers, saving storage and bandwidth.
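You can inspect these layers yourself; assuming the image from the earlier example was built as `my-node-app` (any local image name works):

```shell
# List the layers of an image, newest first; each row roughly
# corresponds to one Dockerfile instruction and shows its size.
docker history my-node-app

# The base image's layers also appear under node:latest, which is
# why shared layers only need to be stored and downloaded once.
docker history node:latest
```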

Docker offers its own registry service, Docker Hub, where you can find images for most open source applications as well as publish your own. It is also easy to host your own registry: simply use the official registry image provided on Docker Hub, which sets this up for you. And if that is not enough to blow your mind, it is even possible to run Docker inside a Docker container!
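Running a private registry from the official image looks roughly like this (a sketch; `registry:2` is the official registry image, and the `my-node-app` name is just an example):

```shell
# Start a local registry on port 5000 using the official image.
docker run -d -p 5000:5000 --name registry registry:2

# Tag a local image with the registry's address and push it.
docker tag my-node-app localhost:5000/my-node-app
docker push localhost:5000/my-node-app

# Any host that can reach the registry can now pull the image:
docker pull localhost:5000/my-node-app
```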

After playing around with Docker for a couple of weeks, I can definitely see the benefits and potential of using it. With large companies like Spotify, eBay, and Uber using Docker today to handle hundreds of applications in their microservices architectures and continuous integration pipelines, Docker is most likely not just hype. Docker may not always be the best fit for a project; however, in my opinion it is certainly worth looking into and evaluating.

Before I had any practical experience, Docker felt very abstract and hard to grasp. I hope that you've gained some basic understanding of how Docker works, and maybe the urge to try it out yourself.