I’ve been writing on my self-hosted Ghost blog for some time now. In case you’re wondering, this site is hosted on a DigitalOcean Droplet.
For the most part, I felt like I was writing something inconsequential that mattered only to me. Today, the site has grown to the point where losing all my content would feel like a hat-flying slap to the face.
If you’re looking for a backup solution for your self-hosted Ghost blog, you’ve come to the right place.
TL;DR: How to automate backups of your self-hosted Ghost blog to cloud storage like Google Drive
Context
Getting started with Ghost is easy. You would typically pick between:
- Ghost (Pro) managed service
- Self-hosted on a VPS or serverless platform like Railway
I’d recommend that anyone (especially non-developers) opt for the managed version.
Yes, it’s relatively more expensive (so is every managed service). However, it’d most likely save you a bunch of headaches (and time) that come with self-hosting any site:
- Backups
- Maintenance
- Downtime recovery
- Security, etc.
In short, you’d sleep better at night.
On top of that, 100% of the revenue goes to funding the development of the open source project itself — a win-win.
“Uh, why are you self-hosting Ghost then?”
- Price — nothing beats the affordability of hosting on your own dedicated server
- Knowledge gain — I’ve learned a lot from hosting and managing my own VPS
Other perks of self-hosting include customizability, control, privacy, etc. — which are great, albeit not my primary reasons.
Most importantly, all the self-hosting hassles above felt like fun to me.
Until they didn’t, I guess.
The pain of backing up Ghost
Setting up Ghost on DigitalOcean is as easy as a click of a button. Yet, there isn’t any proper built-in solution to back up your Ghost site.
From Ghost’s documentation, you can manually back up your Ghost site through Ghost Admin. Alternatively, you could use the `ghost backup` command.
Even so, there was no mention of database backups at the time of writing.
Backing up with Bash
Why Bash
Simplicity. On top of that, Bash is great for command-line interaction.
What are we backing up
Two things:
- Ghost `content/` folder — which includes your site/blog content in JSON, member CSV export, themes, images, and some configuration files
- MySQL database
Overview
In this article, we’re going to write a simple Bash script that handles all of the following steps for us.
Assuming that we already have Rclone set up, here’s an overview of what our Bash script should cover:
- Optional: run requirement checks to ensure that the CLIs we need (e.g. `mysqldump`, `rclone`) are installed
- Back up the `content/` folder
- Back up our MySQL database
- Copy the backup files over to our cloud storage (e.g. Google Drive) using Rclone
- Optional: clean up the generated backup files
Utility functions
Let’s create `util.sh`, which contains a set of helper functions for our backup script.
Personally, I really like having timestamps printed on my logs, so:
#!/bin/bash
log() {
echo "$(date -u): $1"
}
With this, we can now use `log` instead of `echo` to print text with a timestamp:
$ log 'Hola Jerry!'
Sun Jul 22 03:01:52 UTC 2022: Hola Jerry!
Next, we’ll create a utility function that helps to check if a command is installed:
# util.sh
# ...
check_command_installation() {
if ! command -v "$1" &>/dev/null; then
log "$1 is not installed"
exit 0
fi
}
We can use this function in Step 1 to ensure that we have `ghost`, `mysqldump`, etc. installed before we start our backup process. If a CLI is not installed, we simply log and exit.
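As a quick standalone sanity check, here is the helper in action — with the `util.sh` functions inlined so the snippet runs on its own, and a deliberately bogus command name to exercise the failure path:

```shell
#!/bin/bash
# Inlined copies of the util.sh helpers so this snippet is self-contained
log() {
    echo "$(date -u): $1"
}

check_command_installation() {
    if ! command -v "$1" &>/dev/null; then
        log "$1 is not installed"
        exit 0
    fi
}

check_command_installation "tar"                # installed: prints nothing
(check_command_installation "no-such-cli-xyz")  # run in a subshell so the exit
                                                # doesn't kill this demo script
```

The second call logs a timestamped `no-such-cli-xyz is not installed` line; wrapping it in a subshell keeps the demo going past the `exit`.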
The backup script
In this section, we’ll create a `backup.sh` file as our main backup Bash script.
To keep our code organized, we break the steps in the overview into individual functions.
Before we begin, we’ll need to declare some variables and source our `util.sh` so that we can use the utility functions that we defined earlier:
#!/bin/bash
set -e
source util.sh
GHOST_DIR="/var/www/ghost/"
REMOTE_BACKUP_LOCATION="ghost_backups/"
TIMESTAMP=$(date +%Y_%m_%d_%H%M)
GHOST_CONTENT_BACKUP_FILENAME="ghost_content_$TIMESTAMP.tar.gz"
GHOST_MYSQL_BACKUP_FILENAME="ghost_mysql_$TIMESTAMP.sql.gz"
Step 1: Run checks
- Check if the default `/var/www/ghost` directory exists; the `ghost` CLI can only be invoked within a folder where Ghost was installed
- Check if the required CLIs to run our backup are installed
# backup.sh
# ...
pre_backup_checks() {
if [ ! -d "$GHOST_DIR" ]; then
log "Ghost directory does not exist"
exit 0
fi
log "Running pre-backup checks"
cd "$GHOST_DIR"
cli=("tar" "gzip" "mysql" "mysqldump" "ghost" "rclone")
for c in "${cli[@]}"; do
check_command_installation "$c"
done
}
Step 2: Backup the content directory
- Compress the `content/` directory into a `.gz` file
# backup.sh
# ...
backup_ghost_content() {
log "Dumping Ghost content..."
cd "$GHOST_DIR"
tar -czf "$GHOST_CONTENT_BACKUP_FILENAME" content/
}
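A backup is only as good as its restore, so it’s worth rehearsing the tar half once. Here’s a self-contained round trip using a throwaway scratch directory in place of a real Ghost install (all paths here are stand-ins):

```shell
#!/bin/bash
set -e
# Stand-in for a real Ghost install: a scratch content/ folder
workdir=$(mktemp -d)
mkdir -p "$workdir/content/images"
echo "hello" > "$workdir/content/images/demo.txt"

# Back up, the same way backup_ghost_content does
tar -C "$workdir" -czf "$workdir/ghost_content_demo.tar.gz" content/

# Restore into a fresh directory and verify the file survived
mkdir "$workdir/restore"
tar -xzf "$workdir/ghost_content_demo.tar.gz" -C "$workdir/restore"
cat "$workdir/restore/content/images/demo.txt"   # prints "hello"
```

Restoring onto a live server is the same `tar -xzf` invocation, pointed at your Ghost directory instead of a scratch folder.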
Step 3: Backup MySQL database
- Fetch all the necessary database credentials (username, password, DB name) from the Ghost CLI
- Run a check to ensure that we are able to connect to our MySQL database using the credentials above
- Create a MySQL dump and compress it into a `.gz` file
# backup.sh
# ...
check_mysql_connection() {
log "Checking MySQL connection..."
if ! mysql -u"$mysql_user" -p"$mysql_password" -e ";" &>/dev/null; then
log "Could not connect to MySQL"
exit 0
fi
log "MySQL connection OK"
}
backup_mysql() {
log "Backing up MySQL database"
cd "$GHOST_DIR"
mysql_user=$(ghost config get database.connection.user | tail -n1)
mysql_password=$(ghost config get database.connection.password | tail -n1)
mysql_database=$(ghost config get database.connection.database | tail -n1)
check_mysql_connection
log "Dumping MySQL database..."
mysqldump -u"$mysql_user" -p"$mysql_password" "$mysql_database" --no-tablespaces | gzip >"$GHOST_MYSQL_BACKUP_FILENAME"
}
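One nice property of gzipped dumps: they carry their own checksum, so you can verify a dump’s integrity without touching MySQL at all. A small demonstration (using a tiny fake dump, since the real filename depends on your timestamp):

```shell
#!/bin/bash
set -e
# A tiny stand-in for the real ghost_mysql_*.sql.gz dump
printf 'CREATE TABLE demo (id INT);\n' | gzip > demo.sql.gz

# gzip -t checks the archive's CRC without extracting anything
gzip -t demo.sql.gz && echo "gzip OK"

# Peek at the first line to confirm it looks like SQL
gunzip -c demo.sql.gz | head -n 1
```

Running `gzip -t` against a fresh backup is a cheap check to add before trusting it.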
Step 4: Copy the compressed backup files to cloud storage
# backup.sh
# ...
rclone_to_cloud_storage() {
log "Rclone backup..."
cd "$GHOST_DIR"
rclone_remote_name="remote"
rclone copy "$GHOST_DIR/$GHOST_CONTENT_BACKUP_FILENAME" "$rclone_remote_name:$REMOTE_BACKUP_LOCATION"
rclone copy "$GHOST_DIR/$GHOST_MYSQL_BACKUP_FILENAME" "$rclone_remote_name:$REMOTE_BACKUP_LOCATION"
}
Step 5: Clean up the backup files
# backup.sh
# ...
clean_up() {
log "Cleaning up old backups..."
cd "$GHOST_DIR"
rm -f "$GHOST_CONTENT_BACKUP_FILENAME"
rm -f "$GHOST_MYSQL_BACKUP_FILENAME"
}
Finally, we invoke all of the functions defined in Steps 1 to 5.
# At the end of the backup.sh
# ...
log "Welcome to Wraith"
pre_backup_checks
backup_ghost_content
backup_mysql
rclone_to_cloud_storage
clean_up
log "Completed backup to $REMOTE_BACKUP_LOCATION"
And… we’re done!
The final code
You may find the code at github.com/ngshiheng/wraith.
To use this project directly:
- SSH into your VPS where you host your Ghost site
- Set up Rclone (important)
- Clone this repository
- Run `./backup.sh` from the wraith directory
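Rclone setup is interactive, and the remote name you pick has to match the `rclone_remote_name` used in `backup.sh` ("remote" by default). A typical session looks roughly like:

```shell
# One-time interactive setup; name the remote "remote" to match backup.sh
rclone config

# Verify the remote is registered and reachable
rclone listremotes   # should print "remote:"
rclone lsd remote:   # lists top-level directories on the remote
```

If `rclone lsd` succeeds, the backup script’s copy step should too.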
Automating Backup with Cron
I despise doing manual maintenance and administrative tasks. Let’s ease our pain by scheduling a regular backup of our Ghost site with crontab:
- Run `crontab -e`
- For example, you can run a backup at 5 a.m. every Monday with:
# m h dom mon dow command
0 5 * * 1 cd /path/to/backup_script/ && ./backup.sh
Do take your server’s timezone into consideration when you set your cron schedule.
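Cron evaluates the schedule in the server’s local time, which is often (but not always) UTC on a fresh VPS. A quick way to check before picking your hour:

```shell
# Cron fires on the server's local clock; confirm what that is first
date +'%Z %z'   # local timezone abbreviation and offset, e.g. "UTC +0000"
date -u         # current UTC time, for comparison
```

If the two disagree, shift the hour field in your crontab entry accordingly.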
Summary
Whether you’re running a simple personal website or a proper business, having a proper backup is critical.
If a large, well-architected distributed system can go down for days, so can your $5/month DigitalOcean Droplet.