Automated Compression and Remote Transfer Setup?

Hi, I am somewhat stuck on how to do this for a custom game server that I am trying to set up on a VPS.

Basically, since this custom game server uses SQLite3, I obviously cannot just “compress and go” with the game’s directory. So I am thinking it may be easier to do the following automatically…

  1. Shut down the systemd service (the custom game server process) to halt everything, making the SQLite3 databases safe to back up.
  2. Compress the game’s directory
  3. Bring the game server back online with the systemd service
  4. Transfer the compressed archive
  5. Rinse and repeat

Rather than paying someone a small fortune specifically for “hot” backups with the backup API.
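(For reference, SQLite’s own command-line shell exposes that backup API for free through the `.backup` dot-command, which takes a consistent snapshot of a live database. A minimal sketch — the paths and `world.db` name are made up, not from any real game:)

```shell
#!/bin/bash
DB=/tmp/world.db        # stand-in for the live game database
BAK=/tmp/world.db.bak   # snapshot destination

sqlite3 "$DB" "CREATE TABLE IF NOT EXISTS players(name TEXT);"  # demo table
sqlite3 "$DB" ".backup '$BAK'"            # consistent copy via the Online Backup API
sqlite3 "$BAK" "PRAGMA integrity_check;"  # prints "ok" for a healthy snapshot
```

With this approach the server would not need to be stopped at all, though stopping it as described above is also perfectly safe.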

So I found a guide for automatic backups https://www.admfactory.com/how-to-automatically-backup-files-and-directories-in-linux/

#!/bin/bash
TIME=$(date +%b-%d-%y)                     # Current date, e.g. Jan-01-24.
FILENAME=backup-admfactory-$TIME.tar.gz    # Archive filename including the date.
SRCDIR=/var/www/html                       # Directory to back up.
DESDIR=/backup                             # Directory the archive is written to.
tar -cpzf "$DESDIR/$FILENAME" "$SRCDIR"

Though I am obviously still a long way from completing everything that’s needed, because I am mainly stuck on how to…

  1. Stop the process, and start it back up only once the game process has actually stopped and the compression has completed, respectively, to ensure no damage can occur to the databases.
  2. Transfer the now-compressed archive to remote storage on another network, then delete the archive if the transfer was successful.

Is it possible to easily set up such a script to do the above safely? Or what is recommended for automating backups and transfers safely?

Thanks very much in advance.

This should alleviate your first concern.

systemctl stop gameserver.service # stop the systemd service
while true; do
  running_check=$(systemctl is-active gameserver.service)
  if [ "$running_check" == "inactive" ]; then
    break;
  else
    sleep 1 # sleep for a bit before checking again
  fi
done

# do all your tarball/backup stuff

systemctl start gameserver.service # start the systemd service back up

I hate you and your kind (systemd bastards).


You could set up a borgbackup service, add some capabilities if some files can’t easily be read, create a gameserver.target, create a gameserver.timer that stops gameserver.service every day, then place a partial override (systemctl edit) on your borgbackup@gameserver service in order to add a Wants=gameserver.target and an ExecStartPost=- in the [Service] section, and possibly some more Requires= and Before= in the [Unit] section. IT COMES

Or, you could use a cron job and a bash script

(Seriously, though: borgbackup is awesome)
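If you do go the borg route, the day-to-day commands are short. Illustrative only — the repo location, compression choice, and retention policy below are assumptions, not a tested setup:

```shell
# One-time: create the repository on the backup host (reachable over SSH).
borg init --encryption=repokey ssh://user@backups.example.com/./gameserver-repo

# Each run (e.g. from a timer, after stopping the service):
borg create --compression zstd \
    ssh://user@backups.example.com/./gameserver-repo::'{hostname}-{now}' \
    /srv/game

# Keep a week of dailies and a month of weeklies, drop the rest.
borg prune --keep-daily=7 --keep-weekly=4 \
    ssh://user@backups.example.com/./gameserver-repo
```

Deduplication means each run after the first only uploads what changed, which is a big win over shipping a full tarball every day.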


That’s what I had in mind for setting the interval the script could be run at. However, other than what @Mason mentioned here about ensuring the game server is stopped, I am still not sure how to address the remaining concerns I have.

Since what’s left is ensuring the compression is actually done before telling the script to start the game server back up, then transferring the archive after that.

Familiar with rsync? Set up SSH keys and you’re good to go. Something like this should work (untested) –

#!/bin/bash
TIME=$(date +%b-%d-%y)                     # Current date, e.g. Jan-01-24.
FILENAME=backup-admfactory-$TIME.tar.gz    # Archive filename including the date.
SRCDIR=/var/www/html                       # Directory to back up.
DESDIR=/backup                             # Where the archive is written locally.
SERVICENAME=gameserver.service             # Name of the game server service
REMOTEUSER=user                            # User account on remote host
REMOTEHOST=backups.mydomain.me             # Remote host to send backups
REMOTEDIR=/backup/gameserver/              # Remote directory where backups reside

systemctl stop "$SERVICENAME" # stop the systemd service

while true; do
  running_check=$(systemctl is-active "$SERVICENAME")
  if [ "$running_check" == "inactive" ]; then
    break
  else
    sleep 1 # sleep for a bit before checking again
  fi
done

tar -cpzf "$DESDIR/$FILENAME" "$SRCDIR"

systemctl start "$SERVICENAME" # start the systemd service back up

rsync --remove-source-files "$DESDIR/$FILENAME" "$REMOTEUSER@$REMOTEHOST:$REMOTEDIR"
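For the “rinse and repeat” part, the script above could be saved somewhere like `/usr/local/bin/game-backup.sh` (the path and log file here are just examples) and scheduled with a crontab entry — this one would run it daily at 04:30 and log its output:

```shell
# crontab -e (as root, since the script calls systemctl)
30 4 * * * /usr/local/bin/game-backup.sh >> /var/log/game-backup.log 2>&1
```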

Well, I actually meant to suggest dealing with this phase using a polished tool like borgbackup, which can compress, dedup, and sync on its own. The learning curve is maybe steep, but there are quite a few wrappers.
If you want to create a simple tarball each time and rest assured it has been transferred only once you’re sure it has been gzipped and no error occurred in the process, you could simply take Mason’s script, possibly adding set -xeuo pipefail to abort if something fails in the middle.
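To see what `set -e` buys you, here is a tiny self-contained demonstration, where `false` stands in for a failing tar or rsync step — everything after the first failure is skipped:

```shell
#!/bin/bash
# The child shell runs with set -e, so it stops at the first failing command
# and "transferred" is never printed.
result=$(bash -c 'set -e
echo "compressed"
false              # stand-in for a tar/rsync step that failed
echo "transferred"') || true   # || true: the failed run must not kill THIS script
echo "$result"     # prints only "compressed"
```

With a cron-driven backup script, that means a failed tar can never be followed by a transfer of a broken archive or a deletion of the local copy.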
