Writing a bash shell script for website backup

Problem :

I am about to write my first shell script for backing up my server.

These are the steps I have identified so far.

  1. Copy files to be backed up to /srv/backup/
  2. run mysqldump and copy the file to /srv/backup/databases
  3. run duplicity to backup /srv/backup/* to another folder on my machine
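The three steps above could be sketched as a single bash script. The site path, the archive folder, and the use of `--all-databases` with credentials kept in `~/.my.cnf` are all assumptions to be adjusted:

```shell
#!/usr/bin/env bash
# Sketch of the three daily steps. The site path, the archive folder and
# the MySQL credential setup are assumptions, not taken from the question.
set -euo pipefail

BACKUP_DIR=/srv/backup
DB_DIR=$BACKUP_DIR/databases
STAMP=$(date +%F)        # e.g. 2010-05-30

main() {
    mkdir -p "$DB_DIR"

    # 1. copy the files to be backed up into the staging area
    rsync -a --delete /var/www/mysite/ "$BACKUP_DIR/mysite/"

    # 2. dump the databases into /srv/backup/databases
    mysqldump --all-databases > "$DB_DIR/all-$STAMP.sql"

    # 3. back up the staging area to a local folder (temporary measure)
    duplicity "$BACKUP_DIR" file:///srv/backup-archive
}

# main   # enable once the paths above match your server, then run daily from cron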

I am writing a bash shell script that will be run every day, and will carry out the three tasks mentioned above.

Note: point 3 (backing up to a local folder) is only a temporary measure – to allow me to understand what I’m doing, since all the tools I am using are new to me. Once I can back up and restore correctly, I will use duplicity to compress and encrypt the files and upload them offsite.

If my understanding of duplicity is correct (according to the documentation here), the first time I run the script, a FULL backup will be done. Every subsequent backup will then be incremental. I will then force a FULL backup on, say, a weekend.
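That understanding matches how duplicity behaves: it does a full backup when the target is empty and incremental backups afterwards, and `duplicity full` forces a full one. A sketch with placeholder paths:

```shell
# Placeholder paths; adjust SRC and DEST to the real layout.
SRC=/srv/backup
DEST=file:///srv/backup-archive

daily_run() {
    # first run against an empty target => full backup;
    # every later run => incremental
    duplicity "$SRC" "$DEST"
}

weekly_full() {
    # force a full backup, e.g. from a weekend cron entry
    duplicity full "$SRC" "$DEST"
}
```

The two functions map onto two cron entries: `daily_run` every night, `weekly_full` once a week.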

First things first though – I have a few questions:

  1. I would like to use backup rotation for the ‘scheme’ described above – I would like some recommendations on what type of rotation to use.

  2. Once I have implemented the backup rotation, how can I restore from a particular day back in time (assuming the backup exists, of course)?
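For what it's worth, duplicity itself covers both questions: `remove-older-than` gives a simple age-based rotation, and `restore --time` pulls back a past state. A hedged sketch, assuming a local file:// archive:

```shell
# Rotation and point-in-time restore with duplicity.
# The archive URL and the one-month window are assumptions.
ARCHIVE=file:///srv/backup-archive

prune_old_backups() {
    # rotation: drop backup chains older than one month
    duplicity remove-older-than 1M --force "$ARCHIVE"
}

restore_as_of() {
    # $1: duplicity time spec, e.g. 3D (three days ago) or 2010-05-25
    # $2: directory to restore into (must not already exist)
    duplicity restore --time "$1" "$ARCHIVE" "$2"
}

# prune_old_backups
# restore_as_of 3D /tmp/restore-test
```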

I am running Ubuntu 10.04.

Solution :

A simple solution is to use ‘tar‘ for your daily backups. I suggest making full backups every day, because backing up a website is normally not a big job (a few minutes for a full backup). In any case, for your database (the .sql file) you have no choice but to take a full dump each time.

tar cvf /srv/backup/backup.tar /website/directory
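Building on that one-liner, a date-stamped daily full backup with a simple age-based rotation might look like this; the 14-day window, the database dump, and the destination layout are assumptions:

```shell
#!/usr/bin/env bash
# Daily full backup with tar plus a simple age-based rotation.
# Destination layout, retention window and the database dump are assumptions.
set -euo pipefail

DEST=/srv/backup
STAMP=$(date +%F)

daily_full_backup() {
    mkdir -p "$DEST/databases"

    # full, compressed, date-stamped archive of the site
    tar czf "$DEST/site-$STAMP.tar.gz" /website/directory

    # the .sql dump is always a full dump anyway
    mysqldump --all-databases > "$DEST/databases/all-$STAMP.sql"

    # rotation: keep the last 14 days of archives and dumps
    find "$DEST" -maxdepth 1 -name 'site-*.tar.gz' -mtime +14 -delete
    find "$DEST/databases" -name 'all-*.sql' -mtime +14 -delete
}

# daily_full_backup   # run from cron, e.g. 30 2 * * *
```

Restoring a given day is then just extracting that day's tarball and loading that day's .sql file.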
