Monday, May 21, 2012

Configuring daily backup of an Amazon CentOS instance


Configuring daily backup of an Amazon EC2 CentOS Linux instance to Amazon S3 storage

How do you back up your Amazon EC2 (Elastic Compute Cloud) Linux instance? Did you buy costly backup software to handle the daily backup task? Here we will discuss how to back up an Amazon EC2 Linux instance to an Amazon S3 bucket using a bash script scheduled with cron to run daily.



Requirements:
An Amazon EC2 (Elastic Compute Cloud) Linux instance
An Amazon S3 bucket for storing the backups
The s3cmd command-line tool (a quick install/configure sketch follows this list)
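The backup script below assumes s3cmd is already installed and configured with your AWS access key and secret key. A minimal sketch of that one-time setup on CentOS, assuming the s3cmd package is available from the EPEL repository (the bucket name bucket_name matches the one used in the script):

#yum install s3cmd
#s3cmd --configure
#s3cmd ls s3://bucket_name/

The --configure step prompts for your keys and writes them to ~/.s3cfg; the final ls is just a quick check that the target bucket is reachable.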

Before we go to the code, let's look at what the daily backup script does.
The script takes a backup of the following directories and databases:
1) /usr/share/tomcat5/webapps/company
Databases
1) company
All the log files will be saved in the /scripts/logs directory.
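The script also assumes that /scripts and /scripts/logs already exist and that the script itself is saved as /scripts/ec2_backup_script.sh (the same path used in the crontab entry further down). A one-time setup:

#mkdir -p /scripts/logs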

------------------------------------------------

#!/bin/bash
### Creating the daily backup directory
mkdir -p /scripts/server_name-`date +%F`
cd /scripts/server_name-`date +%F`
### Taking the backup of directories
tar -cpvzf company.tar.gz /usr/share/tomcat5/webapps/company 2>> /scripts/logs/tar.log 1> /dev/null
if [ $? -eq 0 ]
then
echo -ne "company.tar.gz is created\n" >> /scripts/logs/production_daily`date +%F`.log
else
echo -ne "ALERT: company.tar.gz creation has FAILED\n" >> /scripts/logs/production_daily`date +%F`.log
fi
### Uploading the directory backups to s3 bucket "bucket_name"
s3cmd put company.tar.gz s3://bucket_name/`date +%F`/company.tar.gz 2>> /scripts/logs/s3cmd.log 1> /dev/null
if [ $? -eq 0 ]
then
echo -ne "company.tar.gz is uploaded to amazon s3\n" >> /scripts/logs/production_daily`date +%F`.log
else
echo -ne "ALERT: company.tar.gz uploading to amazon s3 has FAILED\n" >> /scripts/logs/production_daily`date +%F`.log
fi
### Taking backup of databases
mysqldump company > company.`date +%F`.sql 2>> /scripts/logs/mysql.log
if [ $? -eq 0 ]
then
echo -ne "company.`date +%F`.sql is created\n" >> /scripts/logs/production_daily`date +%F`.log
else
echo -ne "ALERT: company.`date +%F`.sql creation has FAILED\n" >> /scripts/logs/production_daily`date +%F`.log
fi
### Uploading the database backups to s3 bucket bucket_name
s3cmd put company.`date +%F`.sql s3://bucket_name/`date +%F`/company.`date +%F`.sql  2>> /scripts/logs/s3cmd.log 1> /dev/null
if [ $? -eq 0 ]
then
echo -ne "company.`date +%F`.sql is uploaded to amazon s3\n" >> /scripts/logs/production_daily`date +%F`.log
else
echo -ne "ALERT: company.`date +%F`.sql uploading to amazon s3 has FAILED\n" >> /scripts/logs/production_daily`date +%F`.log
fi
### Mailing the status
mail -s "Daily backup report on `date +%F` of Production server" admin@domain.com < /scripts/logs/production_daily`date +%F`.log
### Clearing the temporary backup data
cd ~
rm -rf /scripts/server_name-`date +%F`
------------------------------------------------
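Note that the mysqldump line runs without an explicit username or password, so it assumes MySQL credentials are available non-interactively, for example from root's ~/.my.cnf. A minimal sketch (backup_user and the password are placeholders for your own values):

[client]
user=backup_user
password=your_password

Restrict the file's permissions afterwards:
#chmod 600 /root/.my.cnf
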
Give the script executable permissions:
#chmod u+x /scripts/ec2_backup_script.sh

Install the crontab entry:
#crontab -e
30 15 * * * sh /scripts/ec2_backup_script.sh
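
To confirm the job was registered, list the current crontab entries. If you also want to capture errors from cron itself (for example, s3cmd missing from cron's PATH), redirecting the script's output to a log file is a reasonable variant; the cron.log path here is only an example:

#crontab -l
30 15 * * * /bin/bash /scripts/ec2_backup_script.sh >> /scripts/logs/cron.log 2>&1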


That's it.

Recommended Reading

1. Host Your Web Site In The Cloud: Amazon Web Services Made Easy: Amazon EC2 Made Easy
2. Programming Amazon Web Services: S3, EC2, SQS, FPS, and SimpleDB
3. Middleware and Cloud Computing: Oracle on Amazon Web Services (AWS), Rackspace Cloud and RightScale (Volume 1)
