Backing Up Simple Websites with Git

I’m hosting three different WordPress blogs for different people (this one included) and recently decided to switch away from my home server to a VPS. My home server has a RAID-Z array and important data is backed up remotely on a regular basis giving me the peace of mind that my data is safe, but I can’t be so sure about the VPS. Also, I might switch to a different VPS provider in the future and wanted to make deploying the blogs as easy as possible. I came up with the following solution.


My requirements:

  • Daily backups suitable for small, low-traffic websites
  • Complete backups: all files, a MySQL dump, and logs
  • Flexibility

Why Git?

Using git has many inherent advantages over simply copying files to a remote server.

  • Daily backups store only what changed, so they take up little space
  • Using a VCS lets you view history, merge fixes, branch, etc. when you need to (e.g. develop plugins/themes on one computer and easily merge them into your live site)
  • Push backups to a remote server using http, https, ssh, or git protocols
  • Easily exclude files from backups using .gitignore
  • Setting up a new server is just a git clone away
  • Know your backup succeeded just by browsing your git repository
  • More!
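As a concrete illustration of the history and diff points above, here is a throwaway demo; the path, identity, and log contents are all made up. Each daily backup becomes a browsable commit, and its diff shows exactly what the day added:

```shell
# Scratch repo just for the demo; names and paths are hypothetical
rm -rf /tmp/git-backup-demo
mkdir -p /tmp/git-backup-demo && cd /tmp/git-backup-demo
git init -q .
git config user.email "demo@example.com"   # demo identity only
git config user.name "Demo"
echo "day one" > access.log                 # pretend this is day one's log
git add access.log && git commit -qm "Updating logs"
echo "day two" >> access.log                # the next day's traffic, appended
git commit -qam "Updating logs"
# Browse the backups: one commit per day, and the diff shows the day's traffic
git log --oneline
git diff HEAD~1 HEAD -- access.log
```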

So, why should we store the database backup and log files in git as well?

  • Complete history of your website – restore the files AND database to any state you want
  • The diffs in each commit give you the traffic for the day and which rows were added to your database
  • No need for separate backup methods!

Of course, git isn’t the perfect solution for all websites. High-traffic websites may find their git repositories quickly grow in size when storing database backups and log files.
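If growth worries you, git itself can report how much space the object store uses, and repacking reclaims space from loose objects. The commands below run in a throwaway repo just to show their shape:

```shell
# Demo in a scratch repo; run the same two commands inside your real
# backup repository to monitor and compact it.
cd "$(mktemp -d)"
git init -q .
git count-objects -vH          # object counts plus pack sizes, human-readable
git gc --quiet --prune=now     # repack and drop unreferenced loose objects
```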


Site Structure

To make things easier, I structured each site in the following fashion:

   |- logs/
   |  |-
   |  `-
   |- root/ # WordPress installation
   `- sql/
      `- # MySQL dump

Also, the MySQL username and database name for each site are the same as the site's URL, so “” in this case.

Prepare the Remote Repository on the Backup Server

Create an empty repository on the backup server.

$ cd /path/to/git/repositories
$ mkdir
$ cd
$ git init --bare --shared
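Assuming git is available, you can sanity-check the result: --bare means there is no working tree, and --shared relaxes permissions so group members can push. The temp directory below stands in for the real repository path:

```shell
# Scratch directory standing in for /path/to/git/repositories/<site>
cd "$(mktemp -d)"
git init --bare --shared .
git rev-parse --is-bare-repository   # prints "true" for a bare repository
```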

Add Data to the Repository

We can clone the empty repository we just made and add the initial data to it. Do this on the server your site is currently hosted on.

$ git clone
$ cd
$ mkdir logs sql
$ touch logs/ logs/
$ mysqldump -pPASSWORD --skip-extended-insert --skip-comments > sql/
# Copy the WordPress install
$ cp -r /path/to/site/root root

I also made a simple Makefile to automate backing up. Edit it to suit your needs (especially the MYSQL_ variables) and place it in the root of your repository. Remember that make requires recipe lines to be indented with a tab, not spaces.

SITE_NAME = $(notdir $(CURDIR))

# Edit these to match your MySQL setup (username and database name
# default to the site name, per the convention above)
MYSQL_USER = $(SITE_NAME)
MYSQL_PASS = PASSWORD
MYSQL_DB   = $(SITE_NAME)

.PHONY: backup
backup: sql_backup git_commit git_push
	@echo Backup of $(SITE_NAME) finished successfully!

.PHONY: sql_backup
sql_backup:
	@echo Backing up the database
	@mysqldump -u$(MYSQL_USER) -p$(MYSQL_PASS) --skip-extended-insert --skip-comments $(MYSQL_DB) > sql/$(SITE_NAME).sql

.PHONY: git_commit
git_commit: git_commit_sql git_commit_logs git_commit_root

.PHONY: git_commit_logs
git_commit_logs:
	@echo Committing log files
	@-git commit logs -m "Updating logs"

.PHONY: git_commit_root
git_commit_root:
	@echo Committing root
	@-git add root
	@-git commit root -m "Updating root"

.PHONY: git_commit_sql
git_commit_sql:
	@echo Committing sql
	@-git commit sql -m "Updating sql dump"

.PHONY: git_push
git_push:
	@echo Pushing commit
	@-git push

Now we can commit changes and push to our backup server. Remember that you can add a .gitignore to keep files or directories from being backed up.
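For example, a hypothetical .gitignore for this layout; the WordPress paths below are purely illustrative, so adjust them to whatever you don't want backed up:

```shell
# Demo in a scratch dir; in practice, write this file at the repository root
cd "$(mktemp -d)"
cat > .gitignore <<'EOF'
root/wp-content/cache/
root/wp-content/upgrade/
*.tmp
EOF
```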

$ git add .
$ git commit -m "initial commit"
$ git push origin master


Deploying the Site

Now when you want to deploy your site, set up the MySQL user and database, then check out the site from git.

$ cd /usr/local/www
$ git clone --shared
$ cd
$ mysql -pPASSWORD < sql/
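If the MySQL user and database don't exist on the new server yet, a script along these lines sets them up; yoursite and PASSWORD are placeholders (remember the username and database name match the site's name):

```shell
# Placeholder names; substitute your real site name and a strong password
cd "$(mktemp -d)"
cat > setup.sql <<'EOF'
CREATE DATABASE yoursite;
CREATE USER 'yoursite'@'localhost' IDENTIFIED BY 'PASSWORD';
GRANT ALL PRIVILEGES ON yoursite.* TO 'yoursite'@'localhost';
FLUSH PRIVILEGES;
EOF
# Then feed it to MySQL on the new server:
#   mysql -u root -p < setup.sql
```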

As long as your web server points to /usr/local/www/, you should now be able to access your site!

If you want to test backups, generate some log activity or do something that updates the database, then run make backup from /usr/local/www/

Automating Backups

You can automate backups by using cron.

Access your crontab

$ crontab -e

…and add the following line:

@daily cd /usr/local/www/ && make backup > /dev/null 2>&1
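One caveat: discarding all output means cron stays silent even when the backup fails. Since cron emails whatever output a job produces, a variant like the following (the MAILTO address and site directory are placeholders) silences only stdout, so error messages still reach you:

```
MAILTO=you@example.com
@daily cd /usr/local/www/yoursite && make backup > /dev/null
```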

If all goes well, you should see daily commits popping up on your backup server! If you ever lose data, head back to the deploying section and restore from your last backup.