Docker Containers Ease Cloud and IoT Implementation

Open-source Docker leverages Linux resource isolation features to deliver lightweight, efficient, and repeatable software delivery on complex infrastructure.

Three items are key in IoT: a thing, a cloud, and the internet connecting them. Within this model, the cloud is not merely “someone else’s computer”; it is a complex collection of interconnected servers with variable hardware infrastructure, software stacks, and other elements that can be combined in millions of ways if you count all possible settings. IT staff love Docker because it removes much of the agony of setting up or bolting software components onto any infrastructure. Docker eliminates the need to become a pseudo-expert in various packages and hardware settings, and it isn’t limited to clouds. Docker, an open source project, makes it easier to get software and hardware working together in an efficient, repeatable pattern so that developers don’t have to deal with the complexities of servers and storage.[i] Like Linux itself, Docker lets developers innovate without reinventing the wheel, and that is the crux of why Docker is so successful.

Create Precisely Defined Containers
Docker is a concise tool that “containerizes” a process or application and isolates it from other applications; a containerized app runs anywhere. Docker simplifies getting an environment up and running on your machine, provides a container system for code, and offers a very consistent way to get that code running on a specific setup. Linux containers have been around for years, but many treat a container as a kind of tiny server. Docker began as a tool for creating containers for convenient, reliable, repeatable software delivery, and it is meant to be as easy as possible to use. Docker gets around the fact that software stacks today run on different frameworks, switch between toolchains of different languages, and run on increasingly complex and diverse hardware infrastructure. Docker also allows people to share containers, which has become practical as a critical mass of people who want to share and reuse containers pull them from Docker Hub. The result is a new means of delivering software that is lighter than a virtual machine (VM) yet sidesteps the complexity of setting up the run environment for the recipient. As Docker’s own documentation states, “When an app is ‘dockerized,’ that complexity is pushed into containers that are easily built, shared, and run. Setting up to work on a new codebase no longer has to mean hours spent installing software and figuring out setup procedures. Code that ships with Docker files is simpler; dependencies are pulled as neatly packaged Docker images and anyone with Docker and an editor installed can build and debug the app in minutes.”[ii]
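The “Docker file” the quote mentions is a short, plain-text recipe from which an image is built. The sketch below is a minimal, hypothetical example for a small Python service; the base image, file names, and port are illustrative assumptions, not details from the article:

```dockerfile
# Hypothetical Dockerfile: containerize a small Python service.
FROM python:3.9-slim

WORKDIR /app

# Dependencies are declared once and baked into the image,
# so every recipient runs exactly the same versions.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

# The container runs a single, isolated process.
CMD ["python", "app.py"]
```

Anyone with Docker installed could then build and run the app with `docker build -t myapp .` followed by `docker run myapp`, without installing Python or any of the app’s dependencies on the host.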

Figure 1: Left: Virtual machines (VMs) each run a resource-intensive OS and establish a configuration entanglement. Right: Containers can share a kernel. Only the executable and its package dependencies are in a container image. Processes run natively and can be managed individually with no configuration entanglements.


Put Simply, How Does Docker Work?
A good analogy for Docker is the shipping industry. In the olden days, individual items were packed on wooden ships and individually unloaded by stevedores at the port of destination. As shipping got more complicated, modular, standardized shipping containers were introduced. Product is placed in a container, sealed, put on the ship, and sent off. A global infrastructure has adapted to this container, including ships, portside loading and unloading equipment, train flatcars, tractor-trailer trucks, and even airplanes; all are meant to accept the same container regardless of location, company, or mode of transportation. With Docker, software developers can hand off their containers to others to handle; others who are experts in the infrastructure, surrounding software, and tools. The developer need be concerned only with the inside of the box. Docker provides a consistent way to get code up and running on the developer’s machine, and the same container can then move forward to staging, production, or various testing scenarios.

Figure 2: The entire container market (including containers, virtualization, Private PaaS and others) is expected to hit $2.688 billion by 2020. (Source: 451 Research)


Docker is valued at more than $1 billion, and the application container market is expected to reach $2.7 billion by 2020, according to 451 Research.[iii] Docker was born in 2013 at dotCloud, a platform-as-a-service company, out of the necessity to replicate fully portable environments between infrastructures so that dotCloud could help customers. Sharing and assisting others has become more complicated due to multiple hardware platforms, appliances, and other variables such as different versions of software. Something that works on your machine may not work on another’s. A bundle of software typically has a file extension that indicates a specific environment for unpacking and executing it. However, if you ship that software package, you’re not sure what is on the other end. You don’t know the version or settings of the recipient’s environment; all you know for certain is that the software package worked on your machine. Increasingly complex software stacks make it likely that the recipient’s hardware, operating system, and other environment variables do not match everything that your software package used on your end. To solve this problem, you might have to ship everything surrounding the application, too, down to the build, the system configuration, and the exact versions of the application and its libraries.

One option is to re-create your entire environment with a virtual machine so that the recipient gets the exact environment in which the software succeeds. But virtual machine images are very large, often several gigabytes or more, and if you’re testing a stack on 12 different virtual machines representing 12 specific environments, you’re going to need more than one personal computer to deploy them all. Virtual machines are terrific, but they carry a lot of baggage and overhead that can weigh down performance.

Docker is an open collaboration, with all parties involved in creating a specific image for a specific stack and hardware infrastructure. Docker takes advantage of how Linux has evolved: executing processes in a sandbox (a resource isolation feature) can be achieved concisely because Docker works with low-level primitives in the Linux kernel, namely namespaces and control groups (cgroups). As far as a containerized process knows, it is the only process talking to the Linux kernel, and it is cleanly isolated. This means that you can deploy applications without worrying about conflicting dependencies between them: one application can run a specific library version while another application runs a different version of the same library, without conflict.
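These kernel primitives are visible from user space on any Linux machine, with or without Docker. Each process’s namespace memberships appear as symlinks under `/proc`; a containerized process simply points at different namespace IDs than the host’s processes do. A small illustrative sketch (the function name is ours, not part of any Docker API):

```python
import os

def namespace_ids(pid="self"):
    """Return the Linux namespace IDs a process belongs to.

    Each entry under /proc/<pid>/ns is a symlink whose target names the
    namespace, e.g. 'pid:[4026531836]'. Two processes whose symlinks share
    a target share that namespace; Docker gives a container fresh ones.
    """
    ns_dir = f"/proc/{pid}/ns"
    return {name: os.readlink(os.path.join(ns_dir, name))
            for name in sorted(os.listdir(ns_dir))}

ids = namespace_ids()
print(ids)  # e.g. {'mnt': 'mnt:[4026531840]', 'pid': 'pid:[4026531836]', ...}
```

Comparing this output for a process inside a container against one outside it shows different IDs for the `pid`, `mnt`, and `net` entries, which is precisely the isolation described above.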

Docker for the Embedded Developer
You can install Docker by downloading it as a static binary and installing it. Docker initially starts as a daemon. After the daemon is running, the client is run, and commands typed into the client are passed to the Docker daemon. Docker fully supports 64-bit x86. The Intel Joule development kit, well suited to IoT applications, provides a Docker container and uses the Docker Toolkit to facilitate it. Another company incorporates Docker into its operating system: Resin.io, a company that “makes it simple to deploy, update, and maintain code running on remote devices,”[iv] uses Docker and claims support for the full Raspberry Pi family, BeagleBone Black, Intel Edison, Intel NUC, Odroid C1+ and XU4, BeagleBone Green Wireless (beta), and several other embedded development boards. According to the site, “ResinOS is an operating system optimised for running Docker containers on embedded devices, with an emphasis on reliability over long periods of operation…the core insight behind ResinOS is that Linux Containers offer, for the first time, a practical path to using virtualisation on embedded devices.”[v]
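On boards like these, the containerization workflow is the same as on a server; typically only the base image changes to match the CPU architecture. A hypothetical sketch for a 32-bit ARM board such as a Raspberry Pi (the base image and file names are illustrative assumptions, not taken from ResinOS documentation):

```dockerfile
# Hypothetical Dockerfile for a Raspberry Pi-class ARM board.
# Only the base image differs from an x86 build; the application
# and its dependency list are unchanged.
FROM arm32v7/python:3.9-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY sensor_app.py .

CMD ["python", "sensor_app.py"]
```

Because the image carries the app and its dependencies, fleet updates reduce to shipping a new image rather than re-provisioning each device by hand.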



Docker is another stepping stone in navigating the next wave of increasingly complex technology. Docker is open source and owes a great deal to the open source community that participates in creating Docker containers; however, Docker also has an Enterprise version with over 400 paying customers. The Docker site includes case studies of Docker use at Uber, MetLife, Expedia, General Electric, Cornell, eBay, ADP, PayPal, and several others. As for the embedded world, lightweight containers can make test and development faster and, by isolating applications, can add to security as VMs have done for computers.

Lynnette Reese is Editor-in-Chief, Embedded Intel Solutions and Embedded Systems Engineering, and has been working in various roles as an electrical engineer for over two decades. She is interested in open source software and hardware, the maker movement, and in increasing the number of women working in STEM so she has a greater chance of talking about something other than football at the water cooler.

[i] Levy, Ari. “One of Tech’s Most Ambitious Open Source Projects Puts a Software Veteran at the Helm.” CNBC. CNBC, 02 May 2017. Web. 31 May 2017.

[ii] “What Is Docker.” Docker. N.p., 15 Apr. 2017. Web. 31 May 2017.

[iii] Buckley, Kaitlin. “451 Research: Application Containers Will Be a $2.7bn Market by 2020.” PRWeb. PRWeb, 10 Jan. 2017. Web. 31 May 2017.

[iv] N.p., n.d. Web. 31 May 2017.

[v] N.p., n.d. Web. 31 May 2017.
