Setting up SSL encryption using Reverse Proxy on Raspberry Pi

As the post title suggests, this is going to be a major headache from start to finish. This post is hopefully an improvement on other tutorials and will make the process of implementing a containerized reverse proxy on a Raspberry Pi easier for you.

Setting up a free domain name, Dynamic DNS (DDNS) and Port Forwarding to your Raspberry Pi

If you are thinking about running a website or blog on your Raspberry Pi, there is a real need to make your device accessible from the internet. This post shows how to set up a free, custom domain name for your router along with all the stuff needed to make it work.

The issue with this is that your internet modem is assigned an arbitrary IP address, and this address can change at any given moment. The traditional way to allow remote access to your modem is a static IP address purchased through your ISP. This can be expensive and is impossible in some cases.


Before we start, make sure you have the following:

  • An account on Dynu, a free Dynamic DNS provider with lots of configuration options
  • A free .tk domain name (make sure you sign up for an account, as we need admin access later)
  • TCP port forwarding enabled on your ISP account, so that external traffic can reach your modem
  • A service running on your Raspberry Pi (or other network device) that you want to make available outside of your home network


  1. Forward an external modem port to the Raspberry Pi
  2. Set up Dynamic DNS between your modem and the DDNS provider (Dynu)
  3. Link your custom domain name to Dynu's name servers
  4. (Optional) Map subdomains to ports

Step 1: Set up port forwarding on your modem

The instructions for this vary between modems. The objective is to forward a modem port to your Raspberry Pi port. For web servers this is usually port 8080, however it can vary depending on the application you want to make externally available. A quick Google search should tell you how to achieve this for your particular modem.

Step 2: Set up Dynamic DNS between your modem and the DDNS provider (Dynu)

Since your modem’s IP address changes arbitrarily, we need a mechanism to map a static domain name to your ever-changing IP address.

Step 2.1: Create a new DDNS service on Dynu

  1. Click on DDNS Services in the Control Panel
  2. Click the Add button in the top right
  3. Enter your domain name under “Option 2” and click Add

Your DDNS service has now been created, with your current IP address prefilled. The screen shown below allows you to configure the service.

Control Panel for new DDNS service on Dynu

Step 2.2: Update IP address automatically from modem

Most modems these days have the ability to send IP address updates to DDNS providers, notifying them every time the modem is assigned a new IP by the ISP. Again, the instructions for this are modem specific. Dynu’s help section includes instructions for a few common modem brands. If yours is not listed, googling “setting up dynamic DNS on <modem>” should tell you how to do it.

You will need your custom domain name, your Dynu username and password, and the IP address update link (supplied by Dynu).
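If your modem turns out not to support DDNS updates, a small script run from the Pi itself via cron can do the job instead. This is a sketch only: the hostname, username and password are placeholders, and the update endpoint and parameter names are my assumption of Dynu's IP update API, so verify them against Dynu's own documentation before relying on this.

```shell
# Fallback DDNS updater run from the Pi (e.g. via cron) for modems
# without a built-in DDNS client. All credentials are placeholders,
# and the endpoint/parameters are assumptions based on Dynu's IP
# update API -- double-check them in Dynu's documentation.
DDNS_HOST="example.tk"     # your custom domain name
DDNS_USER="myuser"         # your Dynu username
DDNS_PASS="mypassword"     # your Dynu password

URL="https://api.dynu.com/nic/update?hostname=${DDNS_HOST}&username=${DDNS_USER}&password=${DDNS_PASS}"
echo "$URL"
# Uncomment to perform the actual update:
# curl -fsS "$URL"
```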

Step 3: Link custom domain name to Dynu name servers

Now that we have established a reliable connection between our DDNS provider and our modem, which gives us a static address to access our home network, we need to link our domain name to Dynu’s DNS servers1. Doing this is relatively simple. All we need to do is tell our domain name provider to use Dynu’s name servers (listed here).

  1. Log on to your domain provider account
  2. Go to the Domains section and edit your domain
  3. Select custom name servers and enter all of Dynu’s listed name servers into the empty text fields
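Once the name server change has propagated (which can take up to a day or two), you can verify the delegation from your Pi. A minimal sketch, with example.tk standing in for your own domain; it only prints the command to run, since the result depends on your live DNS setup:

```shell
# Build the DNS delegation check; "example.tk" is a placeholder domain.
DOMAIN=example.tk
CMD="dig +short NS $DOMAIN"
echo "$CMD"
# Running the printed command should list Dynu's name servers
# once the delegation is live.
```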

Step 4: Map subdomain to specific port

If you have multiple applications running on your Raspberry Pi, each on a different port, then you will benefit from mapping a subdomain to each of those ports. Say you have a notebook server running on port 5055. Rather than having to specify the port manually when accessing your domain (<your domain>:5055), we can map port 5055 to a custom subdomain such as notebook.<your domain>. This way each of your services gets its own subdomain.

Control Panel for new DDNS service on Dynu

We can achieve this by clicking on the Web Redirect link, which takes us to the following page:

Add new Web redirect

Under Node name you can specify the subdomain to use for the application running on your server. Make sure you select “Port Forwarding” as the redirect type. Leave Hostname or IP empty and enter the application port to redirect to. You can optionally check the Mask/cloak URL option to hide any query strings from view. This results in the browser displaying only the hostname in the address bar (e.g. notebook.<your domain> rather than notebook.<your domain>?param=value). These query parameters are application specific. Cloaking URLs is highly undesirable if you want the URL to be bookmarkable by users.


Troubleshooting

  • Make sure your IP address is updated correctly. If your IP changes and your router’s DDNS details are incorrect, the DDNS provider will not be notified of the change. This results in your domain forwarding traffic to an outdated IP address. Check on Dynu whether the listed IP is correct.
  • If you are experiencing downtime for unknown reasons, but your setup works perfectly at other times, it is possible this is the result of DDoS-style traffic. You are especially prone to this if your website is indexed by search engines, since the crawlers themselves might be bombarding your server with requests. In this case you need to adjust the crawler/indexing settings.


You should now be able to access the services on your Raspberry Pi using a static domain name with subdomains for each service forwarded to a specific port on your Pi. This is a very neat setup. If you have any problems please get in touch through the comments section.

A note on security: it is important to password-protect your applications and use SSL encryption wherever possible. I will cover the installation of fail2ban and certbot in a future tutorial2, which will make the setup more secure.

  1. Dynu will convert your custom domain name to the IP address currently assigned to your modem. This bridges the gap of the “unknown IP address”. ↩︎

  2. fail2ban helps block attackers by analysing traffic logs and banning offending IP addresses. certbot allows the automatic retrieval and validation of SSL certificates, allowing you to serve content via HTTPS. ↩︎

Automating Docker Volume Backups

Backing up production databases regularly is very important. I am self-hosting Leanote, an open-source note-taking application server, and that required some kind of automated daily backup.

Docker Background

Docker stores volumes in the /var/lib/docker/volumes directory. The naming convention for docker-compose volumes is <directory name>_<volume name>, where <directory name> is the name of the directory containing the docker-compose file and <volume name> is the volume name as specified in the docker-compose file. On Linux-based systems, each volume directory is directly accessible from a root account. This makes for a simple backup process.
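To make the naming convention concrete, here is a small sketch. The project directory leanote and volume name data are example names (matching the Leanote setup used below); note that local volumes additionally keep their files in a _data subdirectory:

```shell
# Construct the host path of a named docker-compose volume.
# "leanote" (compose directory) and "data" (volume name) are examples.
PROJECT=leanote
VOLUME=data
echo "/var/lib/docker/volumes/${PROJECT}_${VOLUME}/_data"
```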

Automating Docker Volume Backups (Basic Backup to Git)

To back up a volume, we can simply compress the volume directory using tar and then commit the archive file to version control1.

To set up automated backups of your important data:

  1. Copy and paste the script below into a new file (e.g. <script name>.sh) in your ~/backups directory.
  2. Run git init inside your backups directory (and set up a remote link for external backups).
  3. Run crontab -e and append the following line: 0 1 * * * /bin/bash /home/pi/backups/<script name>.sh — this runs our backup script daily at 1am.
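For reference, the complete crontab entry might look like the fragment below. This is a sketch: the MAILTO line is optional (it controls where cron mails the script output) and the script name is a placeholder for whatever you called your backup script.

```shell
# m h  dom mon dow   command
MAILTO=pi
0 1 * * * /bin/bash /home/pi/backups/<script name>.sh
```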

Backup Script

#Purpose: Backup docker container/s
#Version 1.0

items=(mongo)                           #space separated list of descriptive names. Used in file names only.
vol_names=(leanote_data)                #space separated list of volume names. Same order as items array.

DESDIR=/home/pi/backups                 # backup directory (also the git repository)

TIME=$(date +%m-%d-%y-%H-%M-%S)

cd "$DESDIR" || exit 1                  # git commands must run inside the repository

for i in "${!items[@]}"; do
  ITEM=${items[$i]}
  SRCDIR=/var/lib/docker/volumes/${vol_names[$i]}   # volume directory to back up
  DIR=$DESDIR/$ITEM-$TIME.tar.gz                    # destination archive
  echo "[$i]: Backing up $ITEM (Volume: ${vol_names[$i]}) -------------------------- "
  echo "     Source:      $SRCDIR"
  echo "     Destination: $DIR"
  sudo tar -cpzf "$DIR" "$SRCDIR"
  echo "Content Listing (and integrity test):"
  tar -tzf "$DIR"
  git add "$DIR"
  git commit -m "$ITEM backup $TIME"
done

# Push all commits at the end
git push

This script compresses a given volume, moves the resulting archive into the backups directory and commits that file to version control. You can use this same script to back up multiple volumes by adding more elements to the items and vol_names arrays.

Congratulations! You can now rest assured that your data is backed up automatically. To confirm backups work, check your git repository or your local mail server. Cron mails any output written to STDOUT to the user executing the script (e.g. pi@raspberrypi). If your cron logs show mail delivery errors, then you need to install postfix.

Access Cron emails using the mutt command (install if unavailable). Mutt provides a simple way to check the script outputs and confirm it is working as expected.

Do not stop here! Try this script in a non-production environment and restore a backup of some test data (see next section).

Docker volume backups to external hard drive and AWS

See the following modified script to back up to a hard drive location and AWS instead. You can set up a lifecycle rule to automatically delete backups older than 30 days. Some sort of lifecycle rule is required so as not to exceed the free usage limits.

The script stops all containers using the specified volume before taking a backup. Run this script periodically using cron during the night.

#Purpose: Backup docker container/s
#Version 1.1

items=(gitlab_data gitlab_db prometheus_data grafana_data)                            #space separated list of descriptive names. Used in file names only.
vol_names=(gitlab_data gitlab_db prometheus_prometheus_data prometheus_grafana_data)  #space separated list of volume names. Same order as items array.

DESDIR=/mnt/IMATION/backups/ubuntu      # backup directory

TIME=$(date +%Y-%m-%d-%H-%M-%S)

pushd $DESDIR

for i in "${!items[@]}"; do
  ITEM=${items[$i]}
  FILENAME=$TIME                        # archive name; volume-backup appends .tar.bz2
  echo "[$i]: Backing up $ITEM (Volume: ${vol_names[$i]}) -------------------------- "
  CONT=$(docker ps -a --filter volume=${vol_names[$i]} --format "{{.Names}}")
  echo "Stopping container ${CONT} using ${vol_names[$i]}"
  docker stop ${CONT}
  docker run -v ${vol_names[$i]}:/volume -v $DESDIR/$ITEM:/backup --rm loomchild/volume-backup backup $ITEM-$FILENAME
  echo "Starting container ${CONT} using ${vol_names[$i]}"
  docker start ${CONT}
done

echo "The following files will be removed (older than 7 days)"
find . -type f -name '*.bz2' -mtime +7 -print
find . -type f -name '*.bz2' -mtime +7 -exec rm {} \;

popd

# Sync the backup directory to S3 (uncomment once the bucket is set up)
#/home/daniel/.local/bin/aws s3 sync $DESDIR s3://bucket-name --delete
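The 30-day lifecycle rule mentioned above can also be created from the command line instead of the S3 console. A sketch, assuming a placeholder bucket name; check the aws s3api documentation for your CLI version before applying it:

```shell
# Write an S3 lifecycle rule that expires objects after 30 days.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-old-backups",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    }
  ]
}
EOF
echo "lifecycle rule written to lifecycle.json"
# Apply it to the (placeholder) backup bucket:
# aws s3api put-bucket-lifecycle-configuration \
#   --bucket bucket-name --lifecycle-configuration file://lifecycle.json
```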

Restoring a Volume Backup

Before you relax and let your backup script do its work, it is important to convince yourself that the resulting archive not only contains the correct files, but also that Docker picks them up correctly when they are extracted and moved back into the /var/lib/docker/volumes directory.

Run the backup script, and then use the commands below. We can extract the archive using the sudo tar -zxvf <archive> command. This reproduces the same directory structure where the files were originally located; in our case, var/lib/docker/volumes/<volume name>. To restore the volume, move the <volume name> directory into /var/lib/docker/volumes.

# extract the archive (recreates var/lib/docker/volumes/<volume name>)
sudo tar -zxvf <archive>
# cd into the extracted directory tree
cd var/lib/docker/volumes
# move the volume back into place
sudo mv <volume name>/ /var/lib/docker/volumes
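Finally, to convince yourself that Docker actually sees the restored volume, you can inspect it through Docker itself. A sketch using the leanote_data example volume from earlier; the block only prints the commands, which you then run as root:

```shell
# Sanity checks for a restored volume. "leanote_data" is the example
# volume name used earlier in this post.
VOLUME=leanote_data
echo "docker volume inspect $VOLUME"
echo "docker run --rm -v $VOLUME:/volume alpine ls /volume"
# The second command should list the restored files.
```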
  1. This is ok, in my opinion, for small databases up to a few megabytes in size. For larger backups, a remote FTP share would be more appropriate. ↩︎

Docker and why you should run containers on your home automation hub

Running software inside isolated containers is incredibly powerful. Facebook and Google use containers to the extreme; Google alone reportedly fires up some two billion containers every week! For a home automation hub, that is a little overkill, however, the benefits of containerisation are equally applicable to the many microservices required in a complex home automation system.

Install Leanote on Docker (and automatic backups)

Leanote is an open-source Evernote alternative with markdown support, making it incredibly useful for programmers. While the notetaking application itself is as feature-rich as you’d hope and an excellent, free alternative, there are honestly no good things to say about its installation experience. This post shows how to (easily) run Leanote in Docker on a Raspberry Pi (all images provided)!

Making Home Assistant speak to RF devices (433MHz) via MQTT Gateway

Adding an RF gateway to your Home Assistant setup makes your setup incredibly versatile because it opens up a multitude of low-cost automation opportunities. This project enables you to interface your Home Assistant setup with RF devices, allowing you to add cheap RF sensors to your home automation setup!

The Real Motivation behind Home Automation

When it comes to my home automation setup, there are a few reactions I get. Most people enthusiastically engage and ask questions, showing genuine interest. Others politely—and rather awkwardly—ignore the fact that my table lamp turned itself on for no apparent reason. Once the initial hurdle of introducing people to home automation is overcome though, I always get the same reaction.

Useful Dev Tools

Posted 2017-12-02 in development.

Tools that I’ve found useful and can recommend.

| Thing | What it does |
| --- | --- |
| MQTTfx | Desktop app that allows you to connect to your MQTT server and listen to messages as well as send them. |
| Notepad++ | Quickly open and edit text files from the Windows context menu. |
| WinMerge | Compare files and directories and merge changes as required. A life saver when handling version-controlled resources. |