
In this guide, I’ll take you through the steps to back up MySQL databases to Amazon S3 on Ubuntu and CentOS based MySQL servers. Amazon S3 is an object storage service offered by Amazon Web Services. It provides storage through web services interfaces and exposes APIs for you to manage the objects stored on it.

Since S3 is a highly available, distributed object storage service, it is an ideal place to store critical data such as database dumps, which don’t consume much space. Before you can interact with S3 over its API, you need to install the awscli tool, which provides the aws command.

Installing awscli on CentOS, Ubuntu, and macOS

The only requirement is Python 2 version 2.6.5+ or Python 3 version 3.3+. To install awscli, run the command for your operating system:

$ sudo yum install python-pip # For CentOS
$ sudo apt-get install python-pip # For Debian and Ubuntu
$ brew install python3 &&  curl -O https://bootstrap.pypa.io/get-pip.py \
 && python3 get-pip.py --user # For MacOS

Then install awscli using pip:

$ sudo pip install awscli
$ pip3 install awscli --upgrade --user # For python3

Verify that the AWS CLI installed correctly by running aws --version.

$ aws --version
aws-cli/1.14.45 Python/3.6.4 Darwin/17.4.0 botocore/1.8.49

The AWS CLI is updated regularly to add support for new services and commands. To update to the latest version of the AWS CLI, run the installation command again.

$ pip install awscli --upgrade --user

If you need to uninstall the AWS CLI, use pip uninstall.

$ pip uninstall awscli

Configure awscli

Configure the AWS CLI tool with your user’s credentials by running:

# aws configure

AWS Access Key ID [None]: 
AWS Secret Access Key [None]:
Default region name [None]:
Default output format [None]:

Provide at least the Access Key ID and Secret Access Key. This will create a ~/.aws folder containing your credentials.

$ ls ~/.aws
config credentials
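
The credentials file holds the keys you entered during aws configure; if you open it, you should see something like the following (the values below are placeholders):

$ cat ~/.aws/credentials
[default]
aws_access_key_id = <YOUR_ACCESS_KEY_ID>
aws_secret_access_key = <YOUR_SECRET_ACCESS_KEY>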

Dump MySQL databases

Now it’s time to back up your MySQL databases. First, export some variables used by the dump commands:

export DB_USER="root"
export DB_PASSWORD=""
export DB_HOST="localhost"
export date_format=`date +%a`
export db_dir="/tmp/databases/$date_format"
export dest_backup_file="/tmp/mysql-databases-$date_format.tgz"
export s3_bucket="s3://bucket-name"

Create db_dir if it doesn’t exist already:

if [ ! -d $db_dir ]; then
   mkdir -p $db_dir
fi

The next thing to do is dump each database to a file.

# Get all MySQL databases

databases=`mysql -u $DB_USER -h $DB_HOST -p$DB_PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

# Dump all databases

for db in $databases; do
 if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != "mysql" ]] && [[ "$db" != _* ]] ; then
 echo "Dumping database: $db"
 mysqldump -u $DB_USER -h $DB_HOST -p$DB_PASSWORD --databases $db > $db_dir/$db-$date_format.sql
 fi
done

All your databases, excluding the system databases information_schema, performance_schema, and mysql (and any database whose name starts with an underscore), will be exported to the directory specified by db_dir.
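
To confirm the dumps were created, list the dump directory; there should be one .sql file per database (this uses the db_dir variable exported earlier):

$ ls $db_dir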

Backup MySQL databases to S3

You have all the databases exported and should be ready to back them up to S3. It is good practice to compress them before uploading.

# tar -zcvf $dest_backup_file -C $db_dir .
# aws s3 cp $dest_backup_file ${s3_bucket}
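
If you would rather keep the individual .sql files in the bucket instead of a single archive, the AWS CLI also provides aws s3 sync, which copies a whole directory. A minimal sketch using the variables above (the $date_format prefix in the bucket is just a suggested layout):

# aws s3 sync $db_dir ${s3_bucket}/$date_format/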

This will upload the compressed file to AWS S3. Once it is done, check the contents using:

# aws s3 ls ${s3_bucket}
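
Optionally, once the upload has succeeded, you can remove the local dump directory and archive to free up space in /tmp:

# rm -rf $db_dir $dest_backup_file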

Backup MySQL databases to S3 – The scripted way

A script that can be used to automate this process is available at the link below:

https://github.com/jmutai/mysql-backup-s3
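
If you’d rather assemble the pieces from this guide yourself, below is a minimal sketch combining the steps above into a single script. The script path /usr/local/bin/mysql-backup-s3.sh and the bucket name are assumptions; adjust the credentials, paths, and bucket to match your setup.

#!/bin/bash
# Minimal sketch combining the steps in this guide; adjust values for your setup.
set -e

DB_USER="root"
DB_PASSWORD=""
DB_HOST="localhost"
date_format=`date +%a`
db_dir="/tmp/databases/$date_format"
dest_backup_file="/tmp/mysql-databases-$date_format.tgz"
s3_bucket="s3://bucket-name"  # assumed bucket name - replace with yours

mkdir -p $db_dir

# Get all MySQL databases
# Note: with an empty DB_PASSWORD, -p will prompt interactively and break
# unattended runs; drop -p$DB_PASSWORD entirely for passwordless access.
databases=`mysql -u $DB_USER -h $DB_HOST -p$DB_PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

# Dump all databases, skipping system schemas
for db in $databases; do
 if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != "mysql" ]] && [[ "$db" != _* ]] ; then
 echo "Dumping database: $db"
 mysqldump -u $DB_USER -h $DB_HOST -p$DB_PASSWORD --databases $db > $db_dir/$db-$date_format.sql
 fi
done

# Compress the dumps and upload the archive to S3
tar -zcvf $dest_backup_file -C $db_dir .
aws s3 cp $dest_backup_file ${s3_bucket}

You can then schedule it with cron, for example to run daily at 2 AM (the script path here is hypothetical):

$ crontab -e
0 2 * * * /usr/local/bin/mysql-backup-s3.sh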
