This site is hosted on a DigitalOcean droplet that runs several Docker containers. One of them is Postgres, and this single container stores all kinds of data I use in my personal projects. If you, for instance, react to or comment on this post, it's stored in my Postgres database container.

This data is precious to me, and I therefore want to create backups of it in case something were to happen.

Here is the complete script file:

#!/bin/bash

# Container and database settings
CONTAINER_ID="ca79147af925"
DB_USER="axel"
BACKUP_DIR="/home/axel/db_dumps/"
BACKUP_FILE="dump_$(date +%Y-%m-%d_%H_%M_%S).sql.gz"

# Remove compressed backups older than 30 days to save disk space
echo "$(date -u) Deleting backups older than 30 days inside dir $BACKUP_DIR"
find "$BACKUP_DIR" -type f -name "*.gz" -mtime +30 -delete
echo "$(date -u) Old backups successfully deleted"

# Dump all databases from the container and compress the output
echo "$(date -u) Starting backup of DB"
docker exec -t "$CONTAINER_ID" pg_dumpall -c -U "$DB_USER" | gzip > "$BACKUP_DIR$BACKUP_FILE"
echo "$(date -u) DB backup finished"
  • ca79147af925 is the container ID of the Postgres container. Replace it with your own container ID; you can find it by running "docker ps" while the container is up (see the snippet after this list).
  • axel is the username of the database user.
  • All my files live under /home/axel, and the backups go in /home/axel/db_dumps. Replace these paths with your desired paths.
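If you don't know the container ID off the top of your head, docker ps can filter and format its output for you. This is just a convenience lookup that assumes your container was started from an image named postgres; adjust the filter to match your setup:

# List the ID and name of running containers started from the "postgres" image
docker ps --filter "ancestor=postgres" --format "{{.ID}}  {{.Names}}"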

The script starts by removing backups in /home/axel/db_dumps that are older than 30 days, using the find command with the -delete option. We do this to save disk space, and we only delete files ending in .gz.
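If you want to see what the cleanup step would remove before trusting it, you can run the same find command with -print instead of -delete. This is a sketch that assumes the same backup directory as above and deletes nothing:

# Preview which backups would be deleted (nothing is removed)
find /home/axel/db_dumps -type f -name "*.gz" -mtime +30 -print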

It then starts the database backup using Docker. The docker exec command runs pg_dumpall, which dumps the contents of all databases to standard output. The output is compressed with gzip and saved under a timestamped filename in /home/axel/db_dumps.
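A backup is only useful if you can restore it. As a rough sketch, assuming the same container and user as above (the filename here is just an example of the timestamped format the script produces), you can decompress a dump and pipe it back into psql inside the container:

# Restore a compressed pg_dumpall backup into the running container
gunzip -c /home/axel/db_dumps/dump_2024-01-01_04_00_00.sql.gz | docker exec -i ca79147af925 psql -U axel -d postgres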

Finally, it prints a timestamped message indicating the completion of the database backup process.
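Before scheduling anything, it's worth running the script once by hand and checking that a fresh .gz file shows up in the backup directory. This assumes the script is saved as /home/axel/backup.sh, the path used in the cron job below:

# Make the script executable and run it once to verify it works
chmod +x /home/axel/backup.sh
/home/axel/backup.sh
# A new dump_*.sql.gz file should now exist in the backup directory
ls -lh /home/axel/db_dumps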

Scheduling the backup script to run every day

I use a cron job for this; simply run:

sudo crontab -e

And add this line to your crontab:

00 04 * * * sh /home/axel/backup.sh >> /home/axel/db-backup-log.txt 2>&1

āš ļø Change the paths to where you placed the script, and redirect the output to your desired log file.

This will run the backup script every morning at 4 AM. Check this tool if you wish to run it at another time or multiple times a day. The output of the script is redirected to a text file that can be reviewed for logging, and any errors are redirected to the same file (2>&1).
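As an illustration of the schedule format (the fields are minute, hour, day of month, month, and day of week), here are a couple of alternative crontab lines using the same hypothetical paths as above:

# Run at 04:00 and 16:00 every day
00 04,16 * * * sh /home/axel/backup.sh >> /home/axel/db-backup-log.txt 2>&1
# Run every 6 hours, on the hour
0 */6 * * * sh /home/axel/backup.sh >> /home/axel/db-backup-log.txt 2>&1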

Getting the backups off the server

Simply backing up your data is not enough; you also need to store the backups somewhere safe, away from the server itself. I won't be going through that in this post. Let me know in the comment section if you want ideas on how to do this.