Linux cloud backup setup

The requirement is to go from my almost non-existent/manual backup "strategy" to something simple and robust that makes use of cloud storage I already have. I have very few local files that I couldn't do without or couldn't recreate, but I realise it would make life easier to have a few folders kept consistent across my computers and occasionally backed up off-site.
Rough plan is below:

Sync of key data folders requirement:

  • Sync of key data folders between laptop and desktop over an always-on VPN connection.
  • Initiate from the laptop as it will be offline more than the desktop.
  • Running hourly on a cron job should be more than enough.
  • Pull the list of folders from a text file so it's easy to add/remove as required (a quick sketch of the file is just below).
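
The folder list itself is just one folder per line, something like the below (these names are made up and I'm assuming paths relative to the home directory, adjust to suit):

Documents
Projects
Pictures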

Encrypted archive weekly backup requirement:

  • Create encrypted archives of key folders in a local backup folder weekly.
  • Initiate from the desktop as it will most likely be up, plus it has the storage.
  • Pull the list of folders from a text file so it's easy to add/remove.

Sync of encrypted archives to cloud storage requirement:

  • Sync the backup folder of encrypted archives to cloud storage (1 TB available).
  • Run this after the archives are finished in the previous stage.

Unison sync:

I wanted a light, terminal-based client that required little or no setup. Unison looks like it will fit the bill, although time will tell. It's available in the Ubuntu repo, so apt install unison and you're good to go. I already have a WireGuard connection between the devices and key-based SSH authentication set up, which meant my backup script was as simple as:

#!/bin/bash
# create env variable for the cron task to point to my ssh key
if [ -z "$SSH_AUTH_SOCK" ]
then
    export SSH_AUTH_SOCK=/run/user/1000/keyring/ssh
fi
# run sync between folders listed in backupFolders.conf over WireGuard interface
while read -r syncFolder; do
  unison -auto -batch "$syncFolder" "ssh://192.168.1.111/$syncFolder"
done </home/path/to/folder/backupFolders.conf
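
With the script saved somewhere sensible, running it hourly is just a crontab entry along these lines (the script name and path here are placeholders):

# crontab entry on the laptop - run the folder sync at the top of every hour
0 * * * * /home/path/to/folder/syncBackup.sh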

Encrypted archive:

For the archives I decided to use tar and gpg to give me a secure encrypted archive that I can upload to cloud storage later. My hope is that I will rarely need these archives, as I should have content mirrored across my machines, but you never know.

For simplicity I'm just going to reuse the same list of backup folders. Encryption is an added layer of security that means I don't need to worry about the files sitting in the cloud. I'm embedding the passphrase in the script, since anyone with access to the machine already has access to the source folders anyway.

#!/bin/bash
# run archive and encrypt of folders listed in backupFolders.conf
# to extract and decrypt later: gpg -d backup.tar.gz.gpg | tar -xvzf -
while read -r archFolder; do
  # use the last dir in the path as the archive filename
  filename=$(basename "$archFolder")
  tar -czf - "$archFolder" | gpg --batch --yes -c --passphrase BIGLONGSTRING > "/home/path/to/target/$filename.tar.gz.gpg"
done </home/path/to/folder/backupFolders.conf
# sync archives to cloud
rclone sync /home/path/to/target rcloneProfile:cloudTargetFolder
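
Scheduling the archive/upload script weekly is another crontab entry, and restoring a single archive is just the pipeline in reverse; roughly like the below (the time, script name and archive name are placeholders):

# crontab entry on the desktop - run the weekly archive and upload early Sunday morning
0 3 * * 0 /home/path/to/folder/archiveBackup.sh

# restore one archive into a scratch directory to check it
mkdir -p /tmp/restore
gpg -d /home/path/to/target/Documents.tar.gz.gpg | tar -xzf - -C /tmp/restore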

Rclone sync to cloud storage:

After a bit of searching I decided on Rclone for my sync-to-cloud option. It supports O365 and is terminal based, so it ticked the key boxes. The config wizard is very straightforward, so after installing with sudo apt install rclone and running through rclone config, my sync-to-cloud command is below; I just added it to the end of the archive script above:

rclone sync /home/path/to/target rcloneProfile:cloudTargetFolder
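
For a quick sanity check after a run, rclone can list what ended up in the cloud or compare it against the local archives (same remote and folder names as above):

# list the files in the cloud target with size and modification time
rclone lsl rcloneProfile:cloudTargetFolder
# compare the local archive folder against the cloud copy
rclone check /home/path/to/target rcloneProfile:cloudTargetFolder
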
  • This is in place now; I will update this doc with any changes/updates/issues.
  • I recently installed Fedora 34 on my laptop, which introduced some issues: 1. Unison was not in the repo, and 2. a bigger issue, Unison is apparently pretty fussy when it comes to syncing between versions. A quick solution was to remove Unison from my Ubuntu machine, download the static binary from the GitHub releases page, and copy it to /usr/local/bin on both machines (rough sketch below).
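
Roughly what that looked like; the version number and archive name below are placeholders only, use whatever the current release on the GitHub releases page is called:

# download the static Linux build from https://github.com/bcpierce00/unison/releases
# (vX.Y.Z and the archive name are placeholders, and the layout inside may vary)
wget https://github.com/bcpierce00/unison/releases/download/vX.Y.Z/unison-vX.Y.Z-static-linux.tar.gz
tar -xzf unison-vX.Y.Z-static-linux.tar.gz
# copy the extracted unison binary into the path (repeat on both machines)
sudo cp bin/unison /usr/local/bin/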