Automated Backups via Rclone (Cloud Storage: GDrive, S3, Dropbox, FTP, +More)

Automated Tarball Backups via Rclone

Goal: Automatically compress select directories and upload the tarball to remote storage via rclone.

Sharing my small script that I finalized, which may come in handy for others. Prior to doing automated backups, I typically backed up important data manually, as needed. This has come back to bite me in the ass on a few occasions, either from not having the latest backups or not having what I really needed in my latest backup.

The gist of the script is that you make two files, the backup-include.txt list and the backup-exclude.txt list. Everything in the include list (one directory per line) is compressed and added to the tarball. Everything in the exclude list is blocked from being added to the tarball. An entry in the exclusion list is only necessary if the parent directory is in the include list but you don't want everything in that directory to be backed up. For example, if I have "/home/mason/hostballs-files" as a line in the inclusion file, but I don't want to back up Jarland's cat videos, then I'd add "/home/mason/hostballs-files/jars-kitty-vids" to the exclusion file. Thus, the entirety of the hostballs-files directory will be in the tarball, with the exception of Jarland's cat collection.
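To make that concrete, here's roughly what the two files could look like for that example (the /etc line is just an extra placeholder; use your own paths):

backup-include.txt:
/etc
/home/mason/hostballs-files

backup-exclude.txt:
/home/mason/hostballs-files/jars-kitty-vids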

How to Set Up/Run

  1. Install rclone on your server
  2. Configure an rclone remote endpoint (rclone config)
    2a. (recommended, but optional) Set up an encrypted endpoint using the previously added remote. Be sure to either record and save your password and salt strings or save a copy of the generated rclone.conf file
  3. Create a backups directory (not within one of the directories to be included in the backups) (sudo mkdir /backups)
  4. Save the script below as backup.sh (sudo vi /backups/backup.sh)
  5. (recommended, but optional) Move the rclone.conf file into your backup directory (sudo mv ~/.config/rclone/rclone.conf /backups/rclone.conf)
  6. Create and add whichever directories are to be included/excluded to the respective files.
    6a. sudo touch /backups/backup-include.txt /backups/backup-exclude.txt
    6b. Add all directories to be included in the backup to backup-include.txt
    6c. Add all directories to be excluded (whose parent directory is within the inclusion list) from the backup to backup-exclude.txt
  7. Edit the script with your information (path to rclone config file, rclone remote name, backup path within remote)
  8. Change permissions so only root has access to your rclone conf file and can run backups (sudo chown -R root:root /backups/ && sudo chmod -R 700 /backups/)
  9. (recommended, but optional) Give the script a test run to make sure everything runs correctly and check remote storage to make sure the backup was transferred successfully (sudo su -c 'cd /backups && ./backup.sh'); a quick remote check is also sketched below the list
  10. If successful, add to root crontab using desired backup interval (ex. to run every day at midnight, add 0 0 * * * cd /backups && ./backup.sh)
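Before that first test run, it may also be worth confirming rclone can reach the remote using the config in its new location. Listing the remote's top-level directories is a harmless way to check (swap in whatever remote name you picked in step 2):

sudo rclone --config=/backups/rclone.conf lsd my-remote-crypt: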

The Script

#!/bin/bash
# Automated Tarball Backups via Rclone by Mason, 13 Nov 19
# Tested on Ubuntu 18.04, GNU tar v1.29
# Note: rclone needs to be installed and remote endpoint configured

set -e # exit on unexpected errors (relaxed around tar below)

HOSTNAME=$(hostname) # replace with desired hostname, if not set
RCLONE_CONF="/backups/rclone.conf" # path to rclone config file
RCLONE_REMOTE="my-remote-crypt" # rclone remote name to store backup
RCLONE_REMOTE_PATH="/backups/$HOSTNAME/" # path to store tarball on remote
NAME="$HOSTNAME-backup-$(date +%d%b%y).tar.gz"
EXCLUDE_STRING=( )

# make sure include and exclude files exist
touch backup-include.txt backup-exclude.txt

# build "--exclude" options using list of directories to ignore
# (only needed if parent dir of the excluded dir is being included)
while IFS="" read -r p || [ -n "$p" ]
do
        EXCLUDE_STRING+=( --exclude "$p" )
done < backup-exclude.txt

# captures error code to prevent some non-error warnings from
# killing the script prematurely (i.e. "file changed as we read it")
set +e
tar -cvzf "$NAME" "${EXCLUDE_STRING[@]}" -T backup-include.txt
exitcode=$?
if [ "$exitcode" != "1" ] && [ "$exitcode" != "0" ]; then
        exit $exitcode
fi
set -e

# upload backup to rclone remote and remove file
rclone --config="$RCLONE_CONF" copy "$NAME" "$RCLONE_REMOTE:$RCLONE_REMOTE_PATH"
rm "$NAME"
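One optional extra, not part of the script above: if you don't want old tarballs piling up on the remote forever, rclone can prune anything older than a cutoff. A sketch, with the 30-day window purely as an example:

# delete remote backups older than 30 days (window is an example)
rclone --config="$RCLONE_CONF" delete --min-age 30d "$RCLONE_REMOTE:$RCLONE_REMOTE_PATH"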

You posted this right as I was about to start backing up one of my servers lol


Hehe perfect timing, then! Hope it helps, give me a shout if you have any questions.

It'd also be trivial to swap out rclone for some other method of transferring the backup (such as rsync, after setting up SSH keys to keep things automated), if rclone isn't your cup of tea :slight_smile:
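For instance, the rclone line near the end of the script could become something along these lines (the user, host, and destination path are placeholders, and this assumes key-based SSH auth is already in place):

# hypothetical rsync swap-in; adjust user/host/path for your setup
rsync -avz "$NAME" backupuser@backup-host:/backups/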

Nice writeup @Mason :slight_smile: I used to use a script similar to this, then I discovered Duplicati :stuck_out_tongue: It has built-in support for cloud storage and it supports deduplication. It's made my life easier :sweat_smile:


It's like Bourne threw up a little bit in my mouth.


Looks nice. What does rclone bring to the table? (I've mostly used rsync…)

Disclaimer: Yep, I was too lazy/busy to Google it right now :innocent:


rclone is self-described as "the rsync for cloud storage". Best way to view it is as rsync, but with a wrapper enabling it to talk/sync with a ton of cloud storage options/protocols (e.g. 1fichier, amazon drive/s3, b2, box, dropbox, ftp, gdrive, mega, own/nextcloud, wasabi, webdav, and a bunch more).
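So where you might run something like rsync -a /data backup-host:/data, the rclone equivalent looks roughly like this (the "gdrive" remote name is just an example from my own config):

rclone sync /home/mason/hostballs-files gdrive:hostballs-files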

As I stated earlier, though, if you keep backups local/just move them to other machines, then the rclone line can be swapped out in favor of rsync within the script.

Heh. Only "Bourne again" shellers would understand.

I actually prefer zsh, but I use bash these days just because it's always there. :man_shrugging:

The saying "Bourne Again" was born after a couple of Jason Bourne sequels.

I think that @Mason was alluding to the fact that "bash" was originally an abbreviation for "Bourne again shell". In this case, the Bourne was Stephen, not Jason.

Yeah, I was trying to make a 'Born Again' Christian joke (but using bash's "Bourne"), but it didn't come out too well :stuck_out_tongue:

Indeed, it was a play on the Christian "born again" that gave bash its name as the "Bourne again shell". :slightly_smiling_face:


Seriously… you didn't get my joke

But I did get it! J. Bourne, J. Bourne sequel, J. Bourne sequel sequel, …

I was just pointing out that "Bourne again" was born much earlier.

(I thought that you had missed @Mason's reference, but perhaps you hadn't.)


The script is helpful for automated backups, but I prefer using something like Borg since it supports encryption and deduplication.


+1 for zsh. Nice script btw @Mason


Every day, I fall in love with your scripts. Again and again.


Would this work on Proxmox?

Yes, however, it may need some tweaking depending on what you want to do.

I'm backing up a couple of my Proxmox hosts with the following list of directories in the backup-include.txt file. These are where most of the configuration files and Proxmox setup live.

/home/me
/etc/pve
/var/lib/pve-cluster
/var/lib/pve-firewall
/var/lib/pve-manager
/etc/dhcp
/etc/nginx/sites-available

If you also wish to back up the VMs running on your Proxmox host, then I'd suggest adding some lines to the end of the script to dump the VMs you care about (e.g. vzdump {vm_id} --mode snapshot) and then using rclone to sync/upload those dumps to your remote destination (e.g. rclone --config=$RCLONE_CONF copy /var/lib/vz/dump/* $RCLONE_REMOTE:$RCLONE_REMOTE_PATH/vms/).
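A rough sketch of what that could look like appended to the end of backup.sh (the VM IDs are examples, and /var/lib/vz/dump is the default vzdump output location; yours may differ):

# dump selected VMs (IDs here are examples)
for vmid in 100 101; do
        vzdump "$vmid" --mode snapshot
done

# push the dumps to the remote alongside the tarball backups
rclone --config="$RCLONE_CONF" copy /var/lib/vz/dump "$RCLONE_REMOTE:$RCLONE_REMOTE_PATH/vms/"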
