My backup plan was to use LiveDrive's FTP access with Duplicity to back up my documents, configs, websites and other important files.

So recently I found my ISP was offering free LiveDrive accounts, so I thought I would give it a go.  As I run a Debian server at home and on my VPS, I needed the Briefcase option, which allows FTP – brilliant, I thought.

I knew this would be too easy.  The FTP connection to LiveDrive is flaky at best.  For example, I uploaded a 340MB file via FTP – this was fine.  But try to download it and you are met with a ‘426 Download failed’ error.  Great.  So I can upload files, but not retrieve them.  To eliminate any local issues I tried this from a couple of different locations, connections and computers – but no, the error persists.  I have a number of 250MB files (I set my Duplicity volume size to this), and they mostly work OK.  Sometimes the connection breaks when uploading or downloading, and often just retrying some hours later works fine, but I don’t like the inconsistency.
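For reference, the volume size is just a duplicity command-line option, and the Briefcase simply appears as an FTP target – the host, path and credentials below are placeholders rather than my real setup:

    # Back up to LiveDrive over FTP with 250MB volumes (--volsize is in MB).
    # duplicity reads FTP_PASSWORD from the environment for its ftp:// backend.
    export FTP_PASSWORD='secret'
    duplicity --volsize 250 /home/me/documents \
        ftp://username@ftp.example.com/backups/documents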

I tried using WebDAVS – this produces the same result: smaller files are OK, but it trips up with a ‘500 Internal Server Error’ when I try to download the same (or another similarly sized) file.
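If you want to reproduce this through duplicity rather than a standalone client, its WebDAV backend only needs a different URL scheme – again, the host and path here are placeholders:

    # Same backup over WebDAVS; duplicity also uses FTP_PASSWORD for
    # the webdav/webdavs backends.
    export FTP_PASSWORD='secret'
    duplicity --volsize 250 /home/me/documents \
        webdavs://username@webdav.example.com/backups/documents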

There are lots of similar tales of woe on the web, but no resolutions.  I have emailed their support, so hopefully they can come up with a fix.

Following my latest indulgence in online/offsite backups (Amazon S3 and s3sync), I thought I should update the situation.  It is painfully slow when running a backup.  It should just be uploading changed files, but I guess crawling that many files (~30,000) across numerous directories and comparing each one with Amazon just doesn’t work nicely.  It would be fine for smaller numbers of files, but unfortunately not in my case.
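For context, a typical run looks roughly like this (bucket name and paths are invented) – each run has to walk the entire local tree and compare it against S3, which is where the hours go:

    # s3sync.rb picks up AWS credentials from the environment.
    export AWS_ACCESS_KEY_ID='your-access-key'
    export AWS_SECRET_ACCESS_KEY='your-secret-key'
    # -r recurses into subdirectories; --ssl encrypts the transfer.
    s3sync.rb -r --ssl /home/me/documents/ mybucket:backups/documents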

I then tried s3cmd (Python-based) with the sync option – quicker, but not great – it could still take 4–6 hours to complete a backup run, even if only a handful of files had changed.
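The sync itself is at least a one-liner (bucket and path are examples):

    # Credentials live in ~/.s3cfg after running 's3cmd --configure'.
    # sync compares local files against the bucket and only uploads changes;
    # --delete-removed also deletes objects whose local files have gone.
    s3cmd sync --delete-removed /home/me/documents/ s3://mybucket/backups/documents/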

I needed to think about whether I was doing it the right way – backing up files natively.  I then came across Tarsnap, which uses S3 for storage – whilst I like the principle, the added costs work out at nearly three times the cost of raw Amazon storage, as backups are proxied through their servers to maintain the archives.  Nice idea, but too costly for me.
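To be fair to Tarsnap, the tooling itself is a pleasantly simple tar-like interface – it was the pricing that put me off.  The machine name, key path and directories below are examples:

    # One-off: generate and register a key for this machine.
    tarsnap-keygen --keyfile /root/tarsnap.key --user me@example.com --machine myserver
    # Create an archive; only data not already stored gets uploaded.
    tarsnap -c -f documents-$(date +%Y%m%d) /home/me/documents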

I then found duplicity – whilst still in beta, it looks promising.  I am trialling it with a small data set for the moment, but initial impressions are good – it holds file metadata in an index which is uploaded as a separate file, and handles incremental backups very well.  I have encrypted using GPG, so there is some compression happening here as well.  I will update when I have run it for a month.
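As a rough sketch of how the runs look – the paths are placeholders, and the S3 target is an assumption on my part, since duplicity supports plenty of backends:

    # PASSPHRASE is used for GPG encryption (duplicity's default);
    # GPG compresses the data before encrypting it.
    export PASSPHRASE='secret'
    export AWS_ACCESS_KEY_ID='your-access-key'
    export AWS_SECRET_ACCESS_KEY='your-secret-key'
    # First run is a full backup; later plain runs are incremental and only
    # upload changed file data plus an updated signature/index volume.
    duplicity full /home/me/documents s3+http://mybucket/documents
    duplicity /home/me/documents s3+http://mybucket/documents
    # Sanity check: list what the remote archive currently contains.
    duplicity list-current-files s3+http://mybucket/documents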