Automated MySQL backup on dedicated server or VPS

by Mateusz Tymek — on Linux

Heads up! This post was written back in 2013 and is very likely to contain outdated information.

I just moved all my small projects to a new dedicated server. I have to admit that until now I wasn't paying much attention to regular backups. I simply ran mysqldump and copied everything to my laptop every few months. I never had any problems with that, as my data was not very critical. But this time I decided to build something better: I wanted database backups to be generated automatically, at regular intervals.

I knew more or less what to do, I just had to put all the pieces together. This tutorial shows the necessary steps to build a similar solution on your server.

Start by creating a new MySQL user with read-only access to everything:

user@server:~$ mysql
mysql> GRANT SELECT, LOCK TABLES ON *.* TO 'mysqlbackup'@'localhost' IDENTIFIED BY 'SOME_SECRET_PASSWORD';
mysql> FLUSH PRIVILEGES;
mysql> exit

Then create a new system user and switch to that account:

user@server:~$ sudo useradd -m mysqlbackup
user@server:~$ sudo su - mysqlbackup

Create a ~/.my.cnf file and put the following lines inside:

[client]
user=mysqlbackup
password=SOME_SECRET_PASSWORD

Make sure it is not readable by other users:

mysqlbackup@server:~$ chmod 600 ~/.my.cnf
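
At this point it is worth checking that the MySQL client actually picks up the credentials from ~/.my.cnf. A quick sanity check, assuming you are still logged in as the mysqlbackup user, is to run a harmless query without passing any credentials on the command line:

mysqlbackup@server:~$ mysql -e 'SELECT 1'

If this prints a tiny result table instead of an access-denied error, the config file is being read correctly.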

Now it is time to write a bash script that will generate backups and store them (compressed) in a directory.
I'd like to keep every database in a separate file, for convenience. It turns out that it is easy to grab a list of all databases with this command:

mysqlbackup@server:~$ mysql --skip-column-names -e 'select Db from mysql.db'

Once you have it, you can write a bash script that runs mysqldump for each database, gzips the output, and stores the files in directories like backups/DATE_AND_TIME. Here's what I wrote:

#!/bin/bash

# Directory this script lives in; backups are stored next to it.
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

# List of databases visible to the backup user.
databases=($(mysql --skip-column-names -e 'select Db from mysql.db'))

# One directory per run, named after the current date and time.
now=$(date +"%Y-%m-%d %T")
backup_path="$dir/backups/$now"

mkdir -p "$backup_path"

# Dump each database into its own gzipped file.
for database in "${databases[@]}"
do
    mysqldump --disable-keys "$database" | gzip > "$backup_path/$database.sql.gz"
done
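
If all your tables are InnoDB, you may prefer to dump with mysqldump's --single-transaction option, which takes a consistent snapshot without holding table locks. This is just an optional variation of the dump line above:

    mysqldump --single-transaction --disable-keys "$database" | gzip > "$backup_path/$database.sql.gz"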

Put the above code in a backup.sh file, make it executable, and run it:

mysqlbackup@server:~$ chmod +x backup.sh
mysqlbackup@server:~$ ./backup.sh

If you did everything correctly, you should be able to list the backup files with the find command:

mysqlbackup@server:~$ find backups
backups
backups/2013-01-28 23:47:42
backups/2013-01-28 23:47:42/database1.sql.gz
backups/2013-01-28 23:47:42/database2.sql.gz
backups/2013-01-28 23:47:42/another_database.sql.gz
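
Restoring one of these dumps is just the reverse operation. A minimal sketch, assuming database1 already exists and you run the command as a MySQL account with write privileges (the read-only mysqlbackup user won't be able to do this):

user@server:~$ gunzip < "backups/2013-01-28 23:47:42/database1.sql.gz" | mysql database1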

Once you are sure everything works, you can add this script to crontab. Let's say you want to generate backups every day at 12:03 am. Type crontab -e (as the mysqlbackup user) and put the following line in the editor:

03      0       *       *       *       /home/mysqlbackup/backup.sh
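
If you want a record of what cron did, or to catch errors from mysqldump, you can redirect the script's output to a log file. This is just one possible variation; the log path is an arbitrary choice:

03      0       *       *       *       /home/mysqlbackup/backup.sh >> /home/mysqlbackup/backup.log 2>&1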

That's it. These backups may save your life if a random DROP TABLE hits the database, but they won't help if the server's HDD dies. Even though unwanted queries tend to happen more often than hardware failures, I still suggest copying the gzipped files to a different location on a regular basis.
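
One simple way to do that, sketched here under the assumption that the mysqlbackup user has SSH key access to some other machine (backup-host is a placeholder name), is to rsync the backups directory there from another cron job or by hand:

mysqlbackup@server:~$ rsync -az /home/mysqlbackup/backups/ backup-host:mysql-backups/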

