Live SQLite3 + Files Backup Script Issues

First, here is the script I am running into issues with for my custom game server.

#!/bin/sh
# back up the sqlite databases
for db in map players; do
sqlite3 $db.sqlite ".timeout 1000" ".backup $db-backup_`date '+%m-%d-%Y'`.sqlite"
done
# move the backed-up sqlites
mv *-backup_*.sqlite /home/user/backups
# back up the file-based contents
tar czf /home/user/backups/world_`date '+%m-%d-%Y'`.tar.gz --exclude='*.sqlite*' *

The script is located where it should be, and both manual runs and the cron job produce a few issues…

  1. If the game server is active (e.g. someone exploring the map), the backups get totally messed up; for some reason the .backup command doesn't seem to cope with this well.
  2. The backed-up databases are only a few KB, when the real ones are currently as much as 3xx MB for one and still xx KB for the other.
  3. The tar.gz also backs itself up (so it ends up with ~/backup/own.tar.gz inside it instead of the files in the shell's current directory).
  4. I also get "0 byte" .sqlite files in a parent directory.

The only "error" I was able to get out of running it manually, while auto-walking in the server, was:

“tar: map.sqlite-journal: file changed as we read it”

Even though, as shown above, I "excluded" it and the other DBs from the tar step using wildcards; the SQLite3 .backup commands are supposed to handle the database backups instead, as commented. Honestly, I am not sure WHY this happens whenever the game server is under enough active load to throw things off.
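In case it helps anyone spot what I am missing, here is a rough way I understand the exclusion could be sanity-checked (just a sketch, using the same pattern as my script): build the archive to stdout and only list what tar would pick up, without writing anything to disk.

# sanity check only: list what tar would include; nothing is written to disk
tar cf - --exclude='*.sqlite*' * | tar tf -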

Can anyone help me troubleshoot my backup script so it stops being this messed up?

Why did you add a timeout?

Also, to my knowledge, a backup is aborted if another application requests a write lock on the database, and a retry is then performed automatically (which takes time). I'm on my phone at the moment, so it's a bit hard to read through the docs, but it should all be in there. 99% of all SQLite issues are related to locks.
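Off the top of my head (untested, and db.sqlite plus the output name are just placeholders), you could capture the shell client's stderr so any lock-related complaint from the backup step becomes visible:

# placeholder names; redirect stderr so any lock/backup error shows up in the file
sqlite3 db.sqlite ".backup 'db-copy.sqlite'" 2> /tmp/sqlite-backup.err
# if the file is non-empty, the backup step reported something
if [ -s /tmp/sqlite-backup.err ]; then
    cat /tmp/sqlite-backup.err
fi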

Thank you for taking the time to reply here. :slight_smile:

I was attempting to follow a poster a few posts down in Daily Worlds Backups Script - Minetest Forums.

I see, so basically what may be happening here is…

  1. The backup script is started and tries to run.
  2. It fails because of the lock request you mentioned, and while it is "retrying", tar takes over; that obviously fails too, and so the script ends?
    If that is the case, then how can we make sure we get a still-good backup of the associated non-database files once .backup is eventually able to swoop in and take its backup?

Thanks in advance for any further help.

What you're describing shouldn't be possible, since the sqlite process has already ended. Retries should happen within the same process, if I'm not mistaken. But sqlite has always been a mess… so yeah, I wouldn't say it's impossible.

I'm leaning more toward the timeout causing the backup to abort, which in turn leads to other strange behavior. Another possible explanation is that you're trying to back up the same database with the same script more than once at the same time. I'm on my phone, so my reply is a bit short, but try running the plain backup command in parallel with your current backup task (use a different filename and preferably a different time):

sqlite3 my_database.sq3 ".backup 'backup_file.sq3'"

Thanks for trying to help me understand. So what you're saying is to try it with the timeout portion omitted…

DB1,
then DB2,
and finally tar the non-DB files as usual in the script (roughly like the sketch just below)?
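Something along these lines (a sketch only, with the same names as my original script, just with the .timeout dropped and every step running strictly one after another):

#!/bin/sh
# back up each database in turn, with no .timeout this time
for db in map players; do
sqlite3 $db.sqlite ".backup $db-backup_`date '+%m-%d-%Y'`.sqlite"
done
# only after both backups have finished, move them and tar the rest
mv *-backup_*.sqlite /home/user/backups
tar czf /home/user/backups/world_`date '+%m-%d-%Y'`.tar.gz --exclude='*.sqlite*' *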

Thanks in advance for helping me understand.

Yeah, that's exactly what I'm saying :slight_smile:. You might want to schedule that a few hours after your current backup so they don't interfere.
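For example, two crontab entries along these lines (the times and paths here are made up; the offset between the two jobs is the point):

# hypothetical schedule: regular backup script at 02:00, separate plain test backup at 05:00
0 2 * * * /home/user/backup.sh
0 5 * * * sqlite3 /home/user/map.sqlite ".backup '/home/user/map-test.sqlite'"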

That may be the issue I've been experiencing, but I'm not 100% sure, since I didn't test it while the game server was ACTIVELY in use. A simple wait $BACK_PID appeared to make it work, at least while the game server is idling.
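For reference, this is roughly how I mean it (a sketch only, with simplified names; BACK_PID is just the variable name I picked):

# run the backup in the background, remember its PID, and block until it has finished
sqlite3 map.sqlite ".backup map-backup.sqlite" &
BACK_PID=$!
wait $BACK_PID
# only after this point does anything get moved or tarred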

The only possible issue from here would be…

and if that's the case, then the script could actually break. But I really need to go to bed, so I don't have time to auto-forward in game to find new lands and force the game to be "under active load" SQLite-wise.

I really do not get why it's failing as a cron job, even though I gave it way more time than needed per cycle (an hour, when the script only needs literal seconds to run, and there were no active players that would "delay" it either).

Instead, the script just keeps doing the weirdness it has been doing from the start. WHAT IS GOING ON HERE??

What happens if you manually perform a backup?

Everything works as intended, with no output to the command line after waiting for the script to finish (about 5 seconds or fewer, I noticed). The DBs end up at their expected sizes, as does the tar.gz, and everything that should be backed up gets backed up. Plus, there aren't any empty DB files in the /home/user directory either.

Try redirecting the cron output to a file and inspecting the log.

Add >> /tmp/backup.log 2>&1 to the end of your cron command and check it after it runs. Turn on any verbose flags in the script as well if that helps pinpoint the issue.
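For example, a crontab entry along these lines (the schedule and script path are placeholders; the redirection at the end is the part that matters):

# hypothetical entry: run the backup script hourly and append all its output to a log
0 * * * * /home/user/backup.sh >> /tmp/backup.log 2>&1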

I tried, and as I expected it didn't output much to work from, since I'm not sure you can even pass an SQLite3 command a verbose flag.

So here is what the script output, and I am assuming this is just from the tar command…

backups/
backups/world_04-11-25-2019.tar.gz
backups/world_11-11-25-2019.tar.gz

Again, I don't know WHY it's doing that, given that the tar command is as follows in the script and is the LAST thing to run in the entire script, as shown in the OP…

tar czvf /home/user/backups/world_`date '+%I-%m-%d-%Y'`.tar.gz --exclude='*.sqlite*' *

Yep. Here's what I think is happening: it's trying to put a copy of the backup it's currently creating (backups/world_11-11-25-2019.tar.gz) into the backup itself, and is killing itself as a result. Try saving the backup anywhere else on your filesystem besides that directory (e.g. /home/user/world_`date '+%I-%m-%d-%Y'`.tar.gz would be fine), then see what happens. It's also best to use the absolute path instead of "*" to make sure it's pointing at the right directory.

I tried:

tar czvf /home/user/world_`date '+%I-%m-%d-%Y'`.tar.gz --exclude='*.sqlite*' /home/user/.minetest/worlds/world 

And nothing happened (zero backups) and no log output, even though I still have the cron redirect in place to dump the log, as you advised from the beginning.

Learn some self-sufficiency. You have the attitude of a $3/year client.

Of what? As I said, there is literally no log output to go by.

Plus, as I said earlier, I did attempt to find a script (plus alternative solutions) online for this purpose, but obviously fell short.

When I need help, I simply need help. If you don't like how I am requesting help here, then you are free to leave the thread. I am not forcing anyone to help me; I am simply requesting it.

If people here have a "problem" with that, I will be more than happy to show myself out as a Hostballer's member.

I read:

You have the attitude of a 3-year-old client.

:smile: