Docker and why you should run containers on your home automation hub

Running software inside isolated containers is incredibly powerful. Facebook and Google take containers to the extreme, firing up 2 billion containers every week! For a home automation hub that is a little overkill, but the benefits of containerization apply equally to the many microservices required in a complex home automation system.

My home automation hub is a Raspberry Pi 3 (Model B), running a number of containers:

  • Home Assistant (central home automation hub, the intelligent heart of the setup)
  • Mosquitto MQTT Broker (allowing smart devices to communicate on the network)
  • Snips (On-device voice control)
  • MongoDB (Database server)
  • Leanote (Note taking application server)
  • Spotify Button Backend (my own Node.js/Express server, more on that in a future post)
  • HA Tool (A Home Assistant development tool written in AngularJS; check out the demo)

This is the result of a recent restructuring of my home network architecture: an attempt to simplify the setup, making it easier to maintain, whilst also reducing power consumption and leveraging Docker.

If you wish to benefit from the advantages Docker has to offer, check out my installation guide for Raspberry Pi (it includes the installation of Docker Compose).

The following sections discuss the advantages of running containerized applications using Docker.

  • Reducing the number of physical devices
  • Making efficient use of available hardware resources
  • Separation of Concerns and Isolation between containers
  • Easy software updates
  • Ease of deployment

Docker Terminology

Docker uses a few terms with specific meanings:

Dockerfile: The blueprint used to generate an image. Dockerfiles follow a syntax that allows you to specify a base image, install dependencies, expose ports, and set the command to run inside the container.
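
As an illustration, here is what a minimal Dockerfile might look like for a small Node.js service such as my Spotify Button Backend (the file names and port are illustrative):

```dockerfile
# Base image: the official Node.js runtime (Alpine variant keeps it small)
FROM node:alpine

# Install dependencies first, so this layer is cached between builds
WORKDIR /app
COPY package.json .
RUN npm install

# Copy in the application source
COPY . .

# Document the TCP port the service listens on
EXPOSE 3000

# Command to run inside the container
CMD ["node", "server.js"]
```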

Image: Once a Dockerfile is executed using the docker build command, an image is generated. A new image layer is created for each instruction in the Dockerfile. This enables layer caching, which comes in useful when your Dockerfile installs dependencies that do not change between builds.

Container: A container is the ‘live’ version of an image. We can spin up containers using the docker run <image> command and manage them using docker stop <container name> and docker start <container name>.
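
Putting the three terms together, a typical workflow looks like this (the image and container names are examples):

```sh
# Build an image from the Dockerfile in the current directory
docker build -t spotify-button-backend .

# Spin up a container from the image, giving it a name for easy management
docker run -d --name spotify-button spotify-button-backend

# Stop and restart the container by name
docker stop spotify-button
docker start spotify-button
```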

Reducing the number of physical devices

The reason I looked into Docker and containerization in the first place was to reduce the number of physical computing devices on my network. Each device used to have a single purpose, from Raspberry Pis running an Owncloud instance to Arduino microcontrollers acting as an RF MQTT gateway. When running multiple devices, each device requires its own power supply, or leeches power via a USB port on another device. This duplication wastes lots of electricity, as power bricks/transformers get warm whether under load or not.

The other problem was the need to manage and maintain each separate device. This comes with the inconvenience of having to flash firmware to Arduinos or having to maintain the same Samba/SSH setup on multiple Raspberry Pis running the same operating system. The disadvantages are summarized below:

  • Certain services must exist on each device (SSH access, file share) in order to access and maintain the device. Installing and maintaining these services adds additional overhead and configuration duplication.
  • Maintaining each device, updating software packages and keeping it running smoothly with the rest of the system is time consuming.
  • Power consumption, not just by the device but also by the power adapter generating heat. These devices are plugged in and running 24/7, so it pays off to think about reducing power consumption.
  • Aesthetic reasons: space constraints, ugly wires, using up power points, difficult to hide

Docker removes these complications because all services that previously required a dedicated device can be virtualized on one physical machine. This machine runs one operating system (Raspbian), one SSH server and exactly one Samba file share, reducing the overhead of duplicated admin services. Maintaining Docker containers is simple too, as I will cover later. This central Docker hub runs on a Raspberry Pi hidden away behind books on a shelf, along with the modem.

Making efficient use of available hardware resources

Running a number of physical devices cannot make efficient use of the available hardware. Say we are running three devices whose processors sit at 10% utilization, with RAM at a similarly low usage rate. From a cost efficiency perspective [1], it is better to run a single hardware device at 90% utilization than multiple devices at low utilization. If the hardware has the capability to run multiple containers, then it is better to make efficient use of the available hardware resources.
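
On the hub itself, a quick way to check how well the containers share the hardware is docker stats, which prints per-container CPU and memory usage:

```sh
# One-off snapshot of resource usage across all running containers
docker stats --no-stream
```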

Separation of Concerns and Isolation between containers

The reason containerization reduces maintenance cost (or time required to maintain the setup) is because:

  1. Separation of Concerns: Containers have a clearly defined purpose. They provide one service, whether that is a database server or a home automation hub. If we require multiple services, we simply start multiple containers.
  2. Isolation: Containers can be treated as black boxes. The internal logic is not relevant to the user [2].

Docker containers have clearly defined external interfaces which allow them to communicate with the “outside world”. These interfaces come in the form of file shares and TCP ports exposed to the host machine. It is this restrictive interface and the isolation of the container contents which make Docker containers so powerful. We can treat a container as a black box; all we effectively care about is the service it claims to provide [3].
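
As a concrete example, the entire external interface of an MQTT broker container boils down to one TCP port and a couple of mounted directories (the host paths here are illustrative):

```sh
# Expose the standard MQTT port 1883 on the host and mount
# host directories for configuration and persistent data
docker run -d --name mqtt \
  -p 1883:1883 \
  -v /opt/mosquitto/config:/mosquitto/config \
  -v /opt/mosquitto/data:/mosquitto/data \
  eclipse-mosquitto
```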

Easy software updates

Since each container is completely isolated from the host OS, updating the software within a container is as simple as downloading a new image. If your container is based on the image’s :latest tag, pulling the image again fetches the newest release, and recreating the container puts it into service. This eliminates the problems associated with traditional software upgrades, which include version conflicts, resolving compile errors and installing new dependencies. A Docker image is a completely self-contained installation, including all dependencies and tools required to run it. All this comes pre-packaged, ready for download and use.
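
With Docker Compose (covered below), updating the whole stack comes down to two commands:

```sh
# Fetch the newest images referenced in docker-compose.yml
docker-compose pull

# Recreate only those containers whose image has changed
docker-compose up -d
```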

Ease of deployment

Gone are the days of installing databases, development packages, web servers, cache stores and supporting services manually. That approach resulted in a single OS that required lots of effort to set up, with services tightly integrated into the operating system and files spread all over the directory tree. Maintaining or replicating such a setup after a hardware failure took days.

Docker makes deployment of new software as easy as running a docker run command, followed by the name of the service to run. This speeds up deployment considerably. Migrating a Docker hub to a new machine should be simple as well: simply run your docker-compose files on the new machine and your Docker containers are recreated with the correct container links and dependencies.
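
As a sketch, a docker-compose.yml covering two of the services on my hub might look like this (the image tags, ports and volume paths are illustrative):

```yaml
version: "3"

services:
  homeassistant:
    image: homeassistant/home-assistant
    ports:
      - "8123:8123"        # Home Assistant web interface
    volumes:
      - /opt/homeassistant:/config
    depends_on:
      - mqtt
    restart: unless-stopped

  mqtt:
    image: eclipse-mosquitto
    ports:
      - "1883:1883"        # MQTT
    volumes:
      - /opt/mosquitto/config:/mosquitto/config
    restart: unless-stopped
```

A single docker-compose up -d on the new machine pulls these images and starts both containers in the right order.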

Conclusion

We explored the various benefits of containerization in this post: how it reduces not only the time required to maintain a home automation setup, but also the cost of running it through more effective hardware utilization. I would be interested to hear what services you are running inside containers. Please share in the comments.

  1. I haven’t done the actual maths, but it makes logical sense. The relationship between resource utilization and power usage may not be as assumed in this post, but it will do for the sake of the argument.

  2. Whether the user is a person accessing a website served by a containerized web server, or another computer accessing a database container, is not relevant.

  3. I say “claims to provide” because it is important to pay attention to which Docker images you download. For the same reasons outlined in this section, it is easy for people to package additional software with their Docker image. Such software performs extra-curricular tasks such as mining digital currencies or routing internet traffic through your home network. These infected Docker images effectively use your computing power to perform tasks you did not sign up for. Always use official images, or images you trust, for your Docker projects.