How do you back up your Amazon RDS (Relational Database Service) MySQL server? Did you buy a costly backup manager for the daily backup task? Here we will discuss how to back up your Amazon RDS MySQL databases to an Amazon S3 bucket using a bash script scheduled daily with cron.
Requirements:
Amazon RDS (Relational Database Service) MySQL instance.
Amazon S3 bucket for storing the backups.
Amazon EC2 (Elastic Compute Cloud) Linux (CentOS) instance.
s3cmd command-line tools.
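Before the script can upload anything, s3cmd needs your AWS credentials. Running `s3cmd --configure` interactively writes them to `~/.s3cfg`; a minimal sketch of the resulting file looks like this (the placeholder keys are of course not real values):

```ini
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
```

You can confirm the credentials work with `s3cmd ls`, which should list your buckets.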
Before we get to the code, let's look at what the daily backup script does:
1) Creates a file containing the list of existing databases.
2) Takes a dump of each database in the list.
3) Creates a compressed archive (tar.gz) of all the database dumps.
4) Uploads the archive to the S3 bucket.
5) Deletes the backup that is 8 days old from the S3 bucket. (Why keep backups older than one week? Remember, we have to pay for the storage.)
6) Sends the admins an email notification with the status/errors.
The script is as follows. Save it to a file (.sh), set the proper executable permissions, and install a crontab entry.
---------------------------------------------------
#!/bin/bash
### Creating the file list_of_databases.txt which contains the list of databases.
mysql -u root -pPASSWORD -h RDS_Public_DNS_URL -e "show databases" | awk '{print $1}' > /scripts/logs/list_of_databases.txt
sed -i '1d' /scripts/logs/list_of_databases.txt
mkdir /scripts/RDS_backups-`date +%F`
### Taking backup ###
for i in `cat /scripts/logs/list_of_databases.txt`
do
mysqldump -u root -h RDS_Public_DNS_URL -pPASSWORD --lock-tables=false $i > /scripts/RDS_backups-`date +%F`/$i.sql 2>> /scripts/logs/rds_backup_log.`date +%F`.txt
if [ $? -eq 0 ]
then
echo -ne "Backup of $i of RDS server has been completed\n" >> /scripts/logs/rds_backup_log.`date +%F`.txt
else
echo -ne "ALERT: Backup of $i of RDS server has FAILED\n" >> /scripts/logs/rds_backup_log.`date +%F`.txt
fi
done
### command to archive and upload to s3 bucket ###
tar cvzf /scripts/RDS_backups-`date +%F`.tar.gz /scripts/RDS_backups-`date +%F`/ &> /scripts/logs/tar.log
s3cmd put /scripts/RDS_backups-`date +%F`.tar.gz s3://s3_bucket_name/`date +%F`/RDS_backups-`date +%F`.tar.gz &> /scripts/logs/s3cmd.log
if [ $? -eq 0 ]
then
echo -ne "RDS_backups-`date +%F`.tar.gz has been uploaded to s3_bucket_name\n" >> /scripts/logs/rds_backup_log.`date +%F`.txt
else
echo -ne "Uploading RDS_backups-`date +%F`.tar.gz to s3_bucket_name has FAILED\n" >> /scripts/logs/rds_backup_log.`date +%F`.txt
fi
### Deleting the backup that is older than one week ###
### GNU date does the calendar arithmetic directly (including month
### lengths and leap years), so no month-length table is needed.
old_date=`date -d "8 days ago" +%F`
s3cmd del --recursive s3://s3_bucket_name/$old_date 2>> /scripts/logs/s3cmd.log 1> /dev/null
if [ $? -eq 0 ]
then
echo -ne "Backup of the day $old_date has been deleted\n" >> /scripts/logs/rds_backup_log.`date +%F`.txt
else
echo -ne "ALERT: Backup of the day $old_date could not be deleted (it may not exist)\n" >> /scripts/logs/rds_backup_log.`date +%F`.txt
fi
### Notifying the admin ###
mail -s "Daily backup report `date +%F` of RDS server" admin@domain.com < /scripts/logs/rds_backup_log.`date +%F`.txt
### Clear temporary logs and data ###
> /scripts/logs/list_of_databases.txt
rm -rf /scripts/RDS_backups-`date +%F`
rm -rf /scripts/RDS_backups-`date +%F`.tar.gz
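The first two commands in the script turn the raw `show databases` output into a plain list by printing the first field of each line and dropping the header. The same filtering can be tried on canned output, without touching a real RDS instance (the database names below are made up):

```shell
#!/bin/bash
# Stand-in for the output of: mysql ... -e "show databases"
# (a "Database" header line followed by one name per line).
printf 'Database\ninformation_schema\nmydb\nshop\n' > /tmp/raw_db_output.txt

# awk '{print $1}' keeps the first field of each line;
# sed '1d' deletes the first line (the "Database" header),
# exactly as the backup script does.
awk '{print $1}' /tmp/raw_db_output.txt | sed '1d' > /tmp/list_of_databases.txt

cat /tmp/list_of_databases.txt
```

Running this prints the three database names with the header stripped, which is exactly the file the dump loop iterates over.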
Give the script executable permissions:
#chmod u+x filename.sh
Install the crontab entry :
#crontab -e
30 15 * * * bash /scripts/RDS_backup_script.sh
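Since the archive is built and uploaded unattended, it is worth verifying that the tarballs the job produces are actually readable. A minimal, self-contained sketch of such a check (the sample dump directory and file names here are made up):

```shell
#!/bin/bash
# Build a stand-in for the dump directory the script archives.
workdir=$(mktemp -d)
mkdir "$workdir/RDS_backups-demo"
echo 'CREATE TABLE t (id INT);' > "$workdir/RDS_backups-demo/mydb.sql"

# Archive it the same way the script does (tar czf).
tar czf "$workdir/RDS_backups-demo.tar.gz" -C "$workdir" RDS_backups-demo

# 'tar tzf' lists the archive contents without extracting them;
# a non-zero exit status means the archive is truncated or corrupt.
if tar tzf "$workdir/RDS_backups-demo.tar.gz" > /dev/null
then
    echo "archive OK"
else
    echo "archive CORRUPT"
fi

rm -rf "$workdir"
```

The same `tar tzf` test could be dropped into the backup script between the tar and s3cmd steps, so a corrupt archive is caught before it is uploaded.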
That's it.
Recommended Reading
1. Host Your Web Site in the Cloud: Amazon Web Services Made Easy: Amazon EC2 Made Easy
2. Programming Amazon Web Services: S3, EC2, SQS, FPS, and SimpleDB
3. Middleware and Cloud Computing: Oracle on Amazon Web Services (AWS), Rackspace Cloud and RightScale (Volume 1)