Automating Backups - Part 1
After having some issues manually messing around with the backend of my install, I started digging into the problem of "it's nice that I can manually dump my entire text content into a .json file, but that can't be scheduled from the GUI or dumped via the command line" - especially since there are still gotchas lurking about, like "running ghost config without any parameters creates a new config file for the particular environment".
Also, the one "Ghost for beginners" guide I could find on backups assumed an SQLite database and had you stop Ghost, copy that .db file, and restart. Not only does that not work for the recommended MySQL-style setup, but there are command-line utilities for dumping SQLite as well.
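For the record, if you are on SQLite you can skip the stop/copy/restart dance entirely - the sqlite3 command-line tool will dump a live database to a plain SQL file. A minimal sketch; the paths below assume a default install with the database at content/data/ghost.db, so adjust them for your setup:
# Dump the SQLite database to a plain-text .sql file without stopping Ghost
sqlite3 /var/www/ghost/content/data/ghost.db ".dump" > /home/ghostadminuser/ghost_sqlite_backup.sql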
So - I wanted to back up the database automatically, and then replicate that over to another Linux-based instance elsewhere.
The first step was to create point-in-time backups of the database. Enter mysqldump.
Since Ghost recommends you create a dedicated user to run Ghost that is not used for admin work, "ghostadminuser" below is a placeholder for whatever account (sudo-enabled or not) will be used to access the server over SSH for replication - not the ghost or root accounts. You don't have to place the database backups in a user's home folder, but, again, every folder you replicate later on has to be accessible to the account you are using for SSH access.
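If you're setting that folder up fresh, something along these lines works; the path and the ghostadminuser name are just the placeholders used throughout this post:
# Create the backup directory and hand it to the account that will be used for SSH/replication
sudo mkdir -p /home/ghostadminuser/ghost_db_backups
sudo chown -R ghostadminuser:ghostadminuser /home/ghostadminuser/ghost_db_backups
sudo chmod 750 /home/ghostadminuser/ghost_db_backups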
The script below:
- Defines a directory to put your backup database dump into
- Defines your retention period in days
- Defines the name/path for the logfile with basic stop-start info we will generate
- Generates a path and folder name structure from the current date
- Creates said backup directory
- Sets a path/file name for the dump file (note - in this case it will be a file called ghost_prod.sqldump inside a date-named directory)
- Runs the mysqldump command
  - the user and password can be dug out of the .json config file at the root of your Ghost directory (see the snippet after this list)
  - you may have to do a bit of basic work in the mysql/etc. command line to verify the database name, though that too should be in the config
  - note, this should not be the root database user account, just one with access to the ghost db
  - ALSO - as I alluded to above, there are similar commands for other database engines, so you don't have to shut down the blog to dump the DB
- Echoes out a confirmation that the job was done
- Deletes dump files and directories older than the retention period.
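As an aside on those credentials: if you have jq installed, you can pull the connection block straight out of the production config (assuming the standard config.production.json at the root of your Ghost install - adjust the path as needed):
# Show the database host, user, password and database name Ghost is using
jq '.database.connection' /var/www/ghost/config.production.json
If you'd rather confirm the database name from the MySQL side, SHOW DATABASES; from the mysql client will list it. Credentials in hand, here's the script: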
#!/bin/bash
# Change this to the path of the folder you want to back up into; it must be
# accessible to the user the script runs as (root, or your usual shell user, for example)
BACKUP_DIR=/home/ghostadminuser/ghost_db_backups/
# Number of days of backups to retain
BACKUP_RETENTION_PERIOD=16
# Path to your logfile that will be updated as the job is run
LOG_FILE=/home/ghostadminuser/ghost_db_backups/backup-ghost.log
# Leave this alone unless you want to change the date formatting; it builds the
# path and final folder name, so changing it can break pruning.
DATE=$(date '+%Y/%m/%Y-%m-%d-%H-%S')
# Make backup directory
mkdir -p "$BACKUP_DIR$DATE"
DB_FILE_PATHNAME="$BACKUP_DIR$DATE/ghost_prod.sqldump"
# Copy Ghost Database
# update this for the appropriate DB user and password and database names
mysqldump --user ghost-example-user --password="areyoukiddingme?" ghost_prod > "$DB_FILE_PATHNAME"
echo "Ghost DB has been dumped - $DATE" >> $LOG_FILE
echo "pathname is - $DB_FILE_PATHNAME" >> $LOG_FILE
# Prune backup directory
find "$BACKUP_DIR" -type f -mtime $BACKUP_RETENTION_PERIOD -iname '*.sqldump' -delete
find "$BACKUP_DIR" -type d -mtime +$BACKUP_RETENTION_PERIOD -delete
echo "Backup directory pruned - $DATE" >> $LOG_FILE
echo " "
With that in hand, as well as the paths to your ghost/content/images and ghost/content/themes/yourtheme folders, you're set for the next part: replicating it all to another computer.