
Duply: simplified and/or automated encrypted FTP back-ups

Posted: Fri May 10, 2013 5:51 am
by Admin
Hello!

OK, some of you didn't like the duplicity tutorial because it is not friendly/automated enough.
Therefore, I am coming back to the all-important subject of backing up, this time with a Duply tutorial. There, now you can't say it was too complicated :)
So, suppose you have one of our FTP plans (free or paid, it is irrelevant, it will work just as well). Let's say our FTP server is bk02.prometeus.net, your username is miaumiau and your password is AbCdEf. Of course, you will not have such a simple password, and it is not recommended to use one anywhere, but this is for the example only.
The following instructions are for Debian 7.0 (Wheezy).
First, as always, make sure you have everything up to date:

Code: Select all

apt-get update

Code: Select all

apt-get dist-upgrade
Now, get duply:

Code: Select all

apt-get install duply
and ncftp, which is needed for FTP back-ups:

Code: Select all

apt-get install ncftp
Create a profile:

Code: Select all

duply miaumiau create
where miaumiau is the profile name (you can choose anything here).
You will be warned to make a copy of your profile and informed where it is stored:
Congratulations. You just created the profile 'miaumiau'.
The initial config file has been created as
'/root/.duply/miaumiau/conf'.
You should now adjust this config file to your needs.

IMPORTANT:
Copy the _whole_ profile folder after the first backup to a safe place.
It contains everything needed to restore your backups. You will need
it if you have to restore the backup from another system (e.g. after a
system crash). Keep access to these files restricted as they contain
_all_ informations (gpg data, ftp data) to access and modify your backups.

Repeat this step after _all_ configuration changes. Some configuration
options are crucial for restoration.
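A simple way to do that is to archive the profile folder and copy it off this machine; just a sketch, where the archive name and the destination you@safe-host are only examples, not part of duply:

Code: Select all

tar czf duply-miaumiau-profile.tar.gz -C /root/.duply miaumiau
scp duply-miaumiau-profile.tar.gz you@safe-host:~/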
You have been warned. Now let's make a backup of the directory "test" located in /root.
For this we need to modify the sample config file at the location reported above: /root/.duply/miaumiau/conf
If you are comfortable with vi, use it to edit the file:

Code: Select all

vi /root/.duply/miaumiau/conf
Or install nano (in some cases it is already installed):

Code: Select all

apt-get install nano
and use it to modify the profile miaumiau:

Code: Select all

nano /root/.duply/miaumiau/conf
Now, if you are advanced enough to use your own GPG key, fine, you will probably know what to do. Everyone else only needs to encrypt their backups using a passphrase; the longer, the better. We will use AbCdEf here as before, but passphrases like INever-ForgetMy-Passwords are better.
You need to look at the file and locate this:

Code: Select all

GPG_KEY='_KEY_ID_'
And add a '#' in front of it to comment it out. If you do use your own GPG key, leave it as it is, but you already knew that :)
The next line should look like this:

Code: Select all

GPG_PW='AbCdEf'
Do not put '#' in front of it! This tells the program which passphrase to use.
Now we need to set the FTP protocol, the username/password and the destination directory for the backup, all done in this line:

Code: Select all

TARGET='ftp://miaumiau:AbCdEf@bk02.prometeus.net/testduply'
where miaumiau is your FTP username, bk02.prometeus.net is your FTP host (an IP is just as fine) and /testduply is the directory we will back up to.
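If your real password contains characters that are special in URLs (such as '@' or ':'), you can keep the credentials out of the URL instead; the stock conf file also offers separate TARGET_USER and TARGET_PASS variables for this:

Code: Select all

TARGET='ftp://bk02.prometeus.net/testduply'
TARGET_USER='miaumiau'
TARGET_PASS='AbCdEf'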
All that remains for a manual backup is to set the source, which will be /root/test:

Code: Select all

# base directory to backup
SOURCE='/root/test'
This is the most basic config; leave everything else unchanged. If you want more granularity, look at all the options and what they do: you can make incremental backups, delete old ones periodically, or, as mentioned above, encrypt with your own public GPG key (but this needs further tweaking for incremental backups to work, as the previous backup must be decrypted first).
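For example, the conf file ships with commented-out retention options; a minimal sketch (the values here are only an illustration) that keeps two months of backups and at most two full chains:

Code: Select all

# delete backups older than this when purging
MAX_AGE=2M
# keep at most this many full backups (with their incrementals)
MAX_FULL_BACKUPS=2

Note that old backups are not removed automatically; you would run 'duply miaumiau purge --force' from time to time to actually delete them.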
For most needs, the above should work. You can launch it with:

Code: Select all

duply miaumiau backup
where miaumiau is the profile we created and edited above. You can also make the backup incremental, storing only the changes since the last one:

Code: Select all

duply miaumiau backup incr
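To check that the backups actually arrive on the server, and to get your files back, duply also has status and restore commands; a quick sketch (the path /root/test-restored is only an example):

Code: Select all

duply miaumiau status
duply miaumiau restore /root/test-restored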
Now, let's set up automated incremental backups (incremental means that only the difference between the last backup and the one we create now is saved; on restore, the data is recomposed from those files). This saves space and traffic.
For this we will use the wonderful scheduling tool called cron. We will modify the crontab to include the task; I will make it run daily at midnight:

Code: Select all

crontab -e
Insert this line at the end:

Code: Select all

0 0 * * * duply miaumiau backup incr
The first 0 stands for the minute, the second for the hour, the third field (*) for the day of the month (can be 1-31), the fourth for the month (1-12) and the last for the day of the week (0-7, where both 0 and 7 mean Sunday), in case you wish to run it only on Mondays, for example. So, this says: run the task every day of every month at 00:00 (midnight). When midnight falls depends on your time zone.
Now, please do not all set this to midnight Italy time, or even to the top of the hour; instead, try something like:

Code: Select all

5 5 * * * duply miaumiau backup incr
This will run at 5 minutes past 5 every day. If everyone backs up their VPS at the same time, there will be some ugly I/O activity slowing everyone down.
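If you also want to keep restore times short, a common pattern is a weekly full backup plus daily incrementals; a sketch (the times are arbitrary; duply's full command forces a full backup):

Code: Select all

# full backup every Monday at 05:05
5 5 * * 1 duply miaumiau full
# incrementals on the other days
5 5 * * 0,2-6 duply miaumiau backup incr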
With minor changes (yum instead of apt-get) this can be done on CentOS-based distributions too.
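On CentOS that would look roughly like this (duply and ncftp come from the EPEL repository, so you may need to enable it first):

Code: Select all

yum install epel-release
yum install duply ncftp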
Please remember that backing up your files regularly is paramount to avoid losing data, be it through hardware failure, hacking or simple neglect (overwriting or deleting important files by mistake). At times we can provide some backups, but that will be unpleasant: they will most likely not be from the time you need, restoring will not be easy, and they may be missing altogether (large storage servers we do not back up at all). You should NEVER depend on your host's automated backups, be it Prometeus or anyone else. You know, there are two kinds of people in the world: those that haven't lost any important data yet, and those that take regular backups. Let's make a third kind: those that take regular backups without prior loss of data :)