Backup MySQL Database to an AWS S3 Bucket

It is very important to make regular backups of your data to protect it from loss. In this tutorial we will use a bash script that automates your MySQL database backups on a daily, weekly, or custom schedule. The Linux s3cmd utility can easily be used to upload these MySQL backups to Amazon S3.

Create S3 Bucket :

Log in to your AWS account, go to “Services > S3”, and click “Create Bucket”. You will be prompted for a bucket name; always follow a consistent naming convention for your buckets, which makes it easier to manage multiple buckets in a standard way.
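If you prefer the command line, the same bucket can be created with the AWS CLI (this assumes the AWS CLI is installed and configured; the bucket name “dptsourcebackup” is only an example and must match the name used in the policy later on):

$ aws s3api create-bucket --bucket dptsourcebackup --region us-east-1
# For regions other than us-east-1, also pass: --create-bucket-configuration LocationConstraint=<region>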

Create IAM User :

Once the bucket is created, we need a user with full access to that bucket. To create the user, go to “Services > IAM > Users”.

After clicking on “Users”, add a new user and give it a name. In this tutorial the user is named:

backupuser 

Create a new policy for the S3 bucket and assign it to the new user so that the user can read and write objects in the bucket.
Click the “Create Your Own Policy” option.
Paste the following S3 bucket policy into the “Policy Document” section (replace “dptsourcebackup” with your bucket name):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets"],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::dptsourcebackup"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::dptsourcebackup/*"
    }
  ]
}

Attach the policy to the new user.

Download the Access Key and Secret Key of the IAM user and keep them safe; they are needed to configure s3cmd.
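The same user, policy, and access key can also be created from the AWS CLI; the sketch below assumes the policy JSON above was saved as backup-policy.json, the policy name s3-backup-policy is an arbitrary choice, and <ACCOUNT_ID> stands for your AWS account ID:

$ aws iam create-user --user-name backupuser
$ aws iam create-policy --policy-name s3-backup-policy --policy-document file://backup-policy.json
$ aws iam attach-user-policy --user-name backupuser --policy-arn arn:aws:iam::<ACCOUNT_ID>:policy/s3-backup-policy
$ aws iam create-access-key --user-name backupuser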

Once the S3 bucket configuration is done, follow the steps below to configure the MySQL database backup script on the server.

Step 1 : Install s3cmd Utility 

s3cmd is a command line utility for creating S3 buckets and for uploading, retrieving, and managing data in Amazon S3 storage.

For CentOS/RHEL 6 Server 

$ sudo wget -O /etc/yum.repos.d/s3tools.repo http://s3tools.org/repo/RHEL_6/s3tools.repo

$ sudo yum install s3cmd

For CentOS/RHEL 7 Server 

$ wget http://ufpr.dl.sourceforge.net/project/s3tools/s3cmd/1.6.1/s3cmd-1.6.1.tar.gz
$ tar xzf s3cmd-1.6.1.tar.gz

$ cd s3cmd-1.6.1
$ sudo python setup.py install

For Ubuntu/Debian 

$ sudo apt-get install s3cmd
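Whichever installation method you used, verify that s3cmd is available before continuing:

$ s3cmd --version
# Prints the installed version; the exact number depends on the installation method above.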

Step 2 : Configure s3cmd Environment 

In order to configure s3cmd we need the Access Key and Secret Key of the IAM user created earlier.

$ sudo s3cmd --configure

Output :

Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3
Access Key: xxxxxxxxxxxxxxxxxxxxxx
Secret Key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: xxxxxxxxxx
Path to GPG program [/usr/bin/gpg]:

When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]: Yes


Test access with supplied credentials? [Y/n] Y
Please wait, attempting to list all buckets...
Success. Your access key and secret key worked fine 🙂

Now verifying that encryption works...
Success. Encryption and decryption worked fine 🙂

Save settings? [y/N] y
Configuration saved to '/root/.s3cfg'
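As a quick sanity check, the configured credentials can be used to list your buckets and the backup bucket created earlier (again using the example bucket name):

$ s3cmd ls
$ s3cmd ls s3://dptsourcebackup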

Step 3 : Configure Backup Script

Copy the backup script below to “/opt/scripts/mysqlbackup.sh” (the path used by the cron job in Step 4), and update the MySQL credentials and bucket name before running it.

#!/bin/bash

# Be pretty
echo -e " "
echo -e " Amazon Web Service S3 Mysql Backup Script "
echo -e " "

# Basic variables
mysqluser="root"
mysqlpass="XXXXXXXX"
bucket="s3://BUCKET_NAME"

# Timestamp (sortable AND readable)
stamp=$(date +"%s - %A %d %B %Y @ %H%M")

# List all the databases and skip the default system databases
databases=$(mysql -u "$mysqluser" -p"$mysqlpass" -e "SHOW DATABASES;" | tr -d "| " | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)")

# Feedback
echo -e "Dumping to \e[1;32m$bucket/$stamp/\e[00m"

# Loop the databases
for db in $databases; do

  # Define our filenames
  filename="$db-$stamp.sql.gz"
  tmpfile="/tmp/$filename"
  object="$bucket/$stamp/$filename"

  # Feedback
  echo -e "\e[1;34m$db\e[00m"

  # Dump and zip
  echo -e "  creating \e[0;35m$tmpfile\e[00m"
  mysqldump --single-transaction -u$mysqluser -p$mysqlpass --databases "$db" | gzip -c > "$tmpfile"

  # Upload
  echo -e "  uploading..."
  s3cmd put "$tmpfile" "$object"

  # Delete
  rm -f "$tmpfile"

done;

# Jobs a goodun
echo -e "\e[1;32mJob completed\e[00m"
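Note that passing the password with -p exposes it in the process list. As an optional hardening step (not part of the script above), the credentials can instead be placed in /root/.my.cnf with mode 600, and the -u/-p options dropped from the mysql and mysqldump calls:

# /root/.my.cnf  (create with: chmod 600 /root/.my.cnf)
[client]
user=root
password=XXXXXXXX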

Change Permission and Test Backup Script 

$ sudo chmod +x mysqlbackup.sh

$ sudo bash -x mysqlbackup.sh


Step 4 : Configure a Cronjob 

We will schedule our backup script to execute at a particular time.

The cron entry below runs the script daily at 2 AM (server local time; IST in our case). Because it includes a user field (root), it belongs in /etc/crontab or a file under /etc/cron.d/ rather than a user crontab:

00 02 * * * root /bin/bash /opt/scripts/mysqlbackup.sh  > /tmp/mysqlBackups.log 2>&1
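One convenient way to install this entry is as a drop-in file under /etc/cron.d/ (the file name mysql-backup is arbitrary); a weekly variant that runs only on Sundays is shown as a comment:

$ cat <<'EOF' | sudo tee /etc/cron.d/mysql-backup
00 02 * * * root /bin/bash /opt/scripts/mysqlbackup.sh > /tmp/mysqlBackups.log 2>&1
# Weekly instead of daily (every Sunday at 2 AM):
# 00 02 * * 0 root /bin/bash /opt/scripts/mysqlbackup.sh > /tmp/mysqlBackups.log 2>&1
EOF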

Some other useful s3cmd commands are listed below.

Creating a New Bucket :

s3cmd mb s3://dptsource

Uploading a File to a Bucket :

s3cmd put tech.txt s3://dptsource/

Uploading a Directory to a Bucket :

s3cmd put -r backup s3://dptsource/


List Contents of an S3 Bucket :

s3cmd ls s3://dptsource/

List Bucket Contents Recursively :

s3cmd ls -r s3://dptsource/

Download a File from a Bucket :

s3cmd get s3://dptsource/tech.txt

Remove a File from an S3 Bucket :

s3cmd del s3://dptsource/tech.txt

Remove an S3 Bucket :

s3cmd rb s3://dptsource/


The bucket policy and backup script can be downloaded from:

https://github.com/dptsource/Backup-to-Amazon-S3.git
