Regular Backups to Amazon S3

Although many of us prefer to believe that “it’ll never happen to me,” that kind of thinking is increasingly risky. We offer a few backup options, and this is one more you can use to protect your data.

Below is a fairly simple bash script that backs up sites and their databases. It runs hourly and stores the hourly backups locally (to cut down on bandwidth and such). Each morning at 3 a.m. it uploads a daily backup to S3 and deletes local backups more than a day old. Each week it prunes old backups on S3, so you keep dailies for two weeks, weeklies for six weeks, and monthlies for roughly six months back. Probably a bit excessive.
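To make those retention windows concrete, here is the same age arithmetic the script performs later on, with the thresholds expressed in seconds:

```shell
# Retention thresholds, in seconds (same arithmetic as in the script)
DAY=$((60*60*24))       # one day
WEEK=$((DAY*7))         # one week
DAILY=$((WEEK*2))       # daily backups older than this are pruned
WEEKLY=$((WEEK*6))      # weekly backups older than this are pruned
MONTHLY=$((WEEK*24))    # monthly backups older than this are pruned
echo "$DAILY $WEEKLY $MONTHLY"
```

A backup's age is just "now minus the timestamp encoded in its filename"; whichever threshold the age exceeds decides whether the file is deleted.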

Anyhow, here’s how it works. All sites are kept in /var/www, and backups are safely stored in /var/www/backup — so these instructions will reflect that.

Make a backup directory and download the S3 ruby scripts:

# mkdir /var/www/backup
 
# cd /var/www/backup
 
# wget http://s3.amazonaws.com/ServEdge_pub/s3sync/s3sync.tar.gz
 
# tar -xzf s3sync.tar.gz
 
# rm -rf s3sync.tar.gz
 
# cd s3sync

You need some SSL certs for this to work:

# mkdir certs
 
# cd certs
 
# wget http://mirbsd.mirsolutions.de/cvs.cgi/~checkout~/src/etc/ssl.certs.shar
 
# sh ssl.certs.shar
 
# cd ..
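Before scheduling anything, it is worth giving s3sync your credentials and checking that it can reach your account. The script below supports either a config file at /etc/s3conf/s3config.yml or plain environment variables; the environment-variable route (with placeholder keys) looks like this:

```shell
# Placeholder credentials -- substitute your own AWS keys
export AWS_ACCESS_KEY_ID=ACCESS_KEY_HERE
export AWS_SECRET_ACCESS_KEY=SECRET_KEY_HERE
# Point at the certs directory created in the previous step
export SSL_CERT_DIR=/var/www/backup/s3sync/certs
```

You can then sanity-check connectivity with `ruby s3cmd.rb listbuckets` from the s3sync directory; it should print the buckets on your account.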

Make the backup script (copy the code at the end of this post):

# vi s3back.sh
 
# chmod +x s3back.sh
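Since the finished script will contain a database password in plain text, it is worth going one step further than a plain `+x` and restricting it to the owner. Demonstrated here on a scratch copy of the file:

```shell
# Create a scratch copy to demonstrate on, then make it
# readable/writable/executable by the owner only
touch s3back.sh
chmod 700 s3back.sh
stat -c %a s3back.sh    # prints 700
```

Run the same `chmod 700` against /var/www/backup/s3back.sh on the server.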

Set up an hourly cron job:

# crontab -e
 
0 * * * * /var/www/backup/s3back.sh >/dev/null 2>&1
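If you would rather keep a record of each run than discard the output, point the cron entry at a log file instead (the log path here is just an example):

```shell
# crontab entry: run hourly, appending all output to a log
0 * * * * /var/www/backup/s3back.sh >> /var/log/s3back.log 2>&1
```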

And here’s the script. Feel free to modify it as much as you desire.

#!/bin/bash
#
# NATE'S S3 BACKUP SCRIPT
#
# This script is designed to backup a selection of websites and
# their relevant databases. It creates hourly local backups, and
# at 3 a.m. writes a daily backup to an Amazon S3 account. At
# that time, it deletes older backups on the S3 server, retaining
# daily backups for two weeks, weekly backups for six weeks, and
# monthly backups for roughly six months.
#
# VARIABLES
#
# S3 VARIABLES
#
# If you have a file at /etc/s3conf/s3config.yml that contains
# these three variables, then leave these lines commented out.
# Otherwise, uncomment them, and add your relevant information.
#
#export AWS_ACCESS_KEY_ID=ACCESS_KEY_HERE
#export AWS_SECRET_ACCESS_KEY=SECRET_KEY_HERE
#export SSL_CERT_DIR=/path/to/certs
#
# SERVER VARIABLES
#
# Below are some environmental variables. WWWDIR is the path to
# all of your websites. S3DIR is the path to your s3sync
# ruby scripts. BACKUPDIR is the directory where you'd like to store
# your temporary backup files as you're uploading to S3. Files in
# BACKUPDIR will be deleted during the backup process. BACKUPNAME is
# the 'prefix' of the filename that will be uploaded to S3. If you
# leave the default, the filename will be something like:
# backup_sitename_Aug_12_2009_12.45.tar.gz.
S3DIR="/var/www/backup/s3sync/"
BACKUPDIR="/var/www/backup/"
WWWDIR="/var/www/"
BACKUPPRE="backup_"
# WEBSITE VARIABLES
#
# Each website directory needs the following variables. For each
# site, increment the number. The SITES variable is not the
# domain of the site, but the name of the folder in which it
# resides. BUCKETS is the name of the Amazon S3 bucket where you
# want to store it. DBNAMES can take one database name, more than
# one database name, or can also be empty. E.g., if I have two
# directories within my BACKUPDIR, my variables might look like this:
#
# SITES[0]="example1.com"
# DBNAMES0=(example1)
# BUCKETS[0]="example1"
#
# SITES[1]="example2.com"
# DBNAMES1=(example2 dev_example2)
# BUCKETS[1]="example2"
SITES[0]=
DBNAMES0=
BUCKETS[0]=
# DATABASE VARIABLES
#
# Currently the script only supports one database user. This means
# you either need to put your MySQL root user info below, or -- if
# you're not comfortable with that -- the info for a 'global' user
# who has all privileges on all of the databases being backed
# up. If you need to create a new user, do this:
#
# >>> mysql -u root -p
# >>> [type your root password]
# mysql> GRANT ALL ON *.* TO 'dbusername'@'localhost' IDENTIFIED BY 'dbuserpass';
# mysql> FLUSH PRIVILEGES;
# mysql> exit
DBUSER=dbusername
DBPWD=dbuserpass
# Don't change these variables.
NOW=$(date +%b_%d_%Y_%H.%M)
NOWUNI=$(date +%s)
NOWHOUR=$(date +%H)
NOWDAY=$(date +%w)
NOWWEEK=$(date +%U)
BACKUPEXT=".tar.gz"
# Begin script
let "a = 0"
for (( i = 0 ; i < ${#SITES[@]} ; i++ )); do
    SITE=${SITES[$a]}
    BACKUPNAME=$BACKUPPRE$SITE"_"
    cd $WWWDIR$SITE
    tar -czf "httpdocs_"$NOW$BACKUPEXT *
    mv "httpdocs_"$NOW$BACKUPEXT $BACKUPDIR"httpdocs_"$NOW$BACKUPEXT
    cd $BACKUPDIR
    DBNAMES="DBNAMES"$a[*]
    DBNAMES=${!DBNAMES}
    let "k = 0"
    for dbname in $DBNAMES; do
         DBNAME[$k]=$dbname; let "k += 1"
    done
    for (( k = 0 ; k < ${#DBNAME[*]} ; k++ )); do
        touch db_${DBNAME[$k]}_$NOW.sql.gz
        mysqldump -u $DBUSER -p$DBPWD ${DBNAME[$k]} | gzip -9 > db_${DBNAME[$k]}_$NOW.sql.gz
        BACKEDDBS[$k]=db_${DBNAME[$k]}_$NOW.sql.gz
    done
    BACKEDDB=${BACKEDDBS[@]}
    tar -czf $BACKUPNAME$NOW$BACKUPEXT "httpdocs_"$NOW$BACKUPEXT $BACKEDDB
    rm -rf "httpdocs_"$NOW$BACKUPEXT $BACKEDDB
    if [[ $NOWHOUR == "03" ]]; then
        mkdir -p tmp
        cp $BACKUPNAME$NOW$BACKUPEXT tmp/$BACKUPNAME$NOW$BACKUPEXT
        DAY=$((60*60*24))
        WEEK=$(($DAY*7))
        DAILY=$(($WEEK*2))
        WEEKLY=$(($WEEK*6))
        MONTHLY=$(($WEEK*24))
        cd $S3DIR
        BUCKET=${BUCKETS[$i]}
        ruby s3sync.rb -r --ssl ${BACKUPDIR}tmp/ ${BUCKET}:
        cd $BACKUPDIR/tmp
        rm -rf *
        cd $BACKUPDIR
        # Delete local backups more than a day old
        for bfile in $(ls); do
            if [[ $(echo ${bfile} | grep ".*.tar.gz$") ]]; then
                if [[ $(echo ${bfile} | grep "^${BACKUPNAME}") ]]; then
                    DATE=$(echo ${bfile} | sed "s/${BACKUPNAME}//;s/${BACKUPEXT}//;s/_/ /g;s/[.]/:/")
                    BACKUNI=$(date --date="$DATE" +%s)
                    BACKDIFF=$(($NOWUNI-$BACKUNI))
                    if [[ $BACKDIFF -gt $DAY ]]; then
                        rm -f ${bfile}
                    fi
                fi
            fi
        done
        # Prune old backups on S3
        cd $S3DIR
        FILES=$(ruby s3cmd.rb list $BUCKET)
        for file in $FILES; do
            REGEX=$(echo ${file} | grep "^-*$")
            if [[ ! $REGEX ]]; then
                DATE=$(echo ${file} | sed "s/${BACKUPNAME}//;s/$BACKUPEXT//;s/_/ /g;s/[.]/:/")
                DATEUNI=$(date --date="$DATE" +%s)
                DATEDAY=$(date --date="$DATE" +%w)
                DATEINMO=$((10#$(date --date="$DATE" +%d)))
                DIFF=$(($NOWUNI-$DATEUNI))
                if [[ $DIFF -gt $MONTHLY ]]; then
                    ruby s3cmd.rb delete $BUCKET:${file}
                elif [[ $DIFF -gt $WEEKLY && $DATEINMO -gt 8 ]]; then
                    ruby s3cmd.rb delete $BUCKET:${file}
                elif [[ $DIFF -gt $DAILY && $DATEDAY -ne 0 ]]; then
                    ruby s3cmd.rb delete $BUCKET:${file}
                fi
            fi
        done
    fi
    let "a += 1"
done
 
exit 0
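Restoring works in reverse: pull the archive back down with s3cmd.rb and unpack it. The bucket and file names below are placeholders, so substitute your own:

```shell
cd /var/www/backup/s3sync
# See what archives are stored in the bucket
ruby s3cmd.rb list example1
# Fetch one daily archive to a temporary location
ruby s3cmd.rb get example1:backup_example1.com_Aug_12_2009_03.00.tar.gz /tmp/restore.tar.gz
# The archive contains the site tarball plus the gzipped SQL dumps
tar -xzf /tmp/restore.tar.gz -C /tmp
```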
 
