This post describes how to back up your data to AWS using duply, a front-end for duplicity that encrypts your data before uploading it to S3.


Log in to AWS S3 and create a bucket for the backups (use a name of your choice).

Install the Ubuntu packages:

sudo add-apt-repository ppa:duplicity-team/ppa
sudo apt update
sudo apt install duplicity duply python3-boto

# Note: Replace python3-boto with python-boto if your python is Python 2.

Create profile “main”:

duply main create
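Assuming duply's default layout, this creates a profile directory containing (at least) the two files edited below:

```shell
$HOME/.duply/main/conf     # main configuration
$HOME/.duply/main/exclude  # include/exclude file list
```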

Open $HOME/.duply/main/conf and edit the options as described below.

Generate a random password with openssl rand -base64 20 and set it as the value of GPG_PW.

Generate a GPG key with gpg --gen-key, using duply as the name and the previously generated password as the key's passphrase.

Print your keys with gpg --list-keys and find the public key ID: the last 8 characters of the key fingerprint (older gpg versions print it at the end of the “pub” line; newer versions print the fingerprint on its own line, or use gpg --list-keys --keyid-format short). Set that as the value of the GPG_KEY option.

Set GPG_OPTS to --pinentry-mode loopback.
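Putting the GPG options together, the relevant section of $HOME/.duply/main/conf might then look like this (key ID and password are placeholders, not real values):

```shell
GPG_KEY='A1B2C3D4'                    # placeholder: your 8-char public key ID
GPG_PW='nOtArEaLpAsSwOrDxxxxxxxxxx='  # placeholder: output of openssl rand -base64 20
GPG_OPTS='--pinentry-mode loopback'
```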

Set TARGET options:

TARGET='s3://s3-<region endpoint name>.amazonaws.com/<bucket name>/<folder name>'
TARGET_USER='<your AWS access key ID>'
TARGET_PASS='<your AWS secret key>'

Replace “region endpoint name” with the endpoint name of your bucket’s region, e.g. eu-central-1. You can find these names on the AWS website. For “folder name”, specify a folder of your choice.
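As a concrete sketch (all values are placeholders; the access keys are AWS's documented example credentials), a bucket named my-backups in eu-central-1 with folder laptop could look like:

```shell
TARGET='s3://s3-eu-central-1.amazonaws.com/my-backups/laptop'
TARGET_USER='AKIAIOSFODNN7EXAMPLE'                      # placeholder access key ID
TARGET_PASS='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'  # placeholder secret key
```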

Set SOURCE to the base directory of your backup, for example /home/john.

Set MAX_AGE to 6m.

Save and close the file.

Open $HOME/.duply/main/exclude and add any directories or files you want to exclude (a leading “- ” marks a path as excluded):

- /home/*/.cache
- /home/*/.ccache

Run duply main backup to initiate the backup process.

Back up your $HOME/.duply directory to a safe location. For example, create a tar archive with tar cvf duply.tar $HOME/.duply and add it as an attachment to a password entry in KeePassX.

The $HOME/.duply directory contains everything you need to restore your files.


Backup and restore

Use duply main backup to run the backup.

Use duply main restore /restored_files to restore the latest backup to /restored_files.

If any system directories are part of your backup, restore them to a temporary location first and move them to their final location from a live system, to avoid interfering with the running system.

Use duply main fetch docs/myfile /home/me/myfile to restore a specific file or directory.

To restore from a specific date, use duply main status to find the exact date, then restore with duply main restore /restored_files '2013-11-08T07:38:30'.

Automatic backup

Add this to your crontab (crontab -e):

@daily /usr/bin/duply main backup
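If you also want a log of the nightly runs, one option is to redirect the output; the log path here is just an example:

```shell
@daily /usr/bin/duply main backup >> $HOME/duply.log 2>&1
```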


Files owned by root in your home directory

If you’re backing up your home directory and it contains files owned by root, duply won’t be able to read them when run as your user. You can reassign ownership of all files in your home directory to your user with sudo chown -R $USER ~.

Or run duply with sudo. In that case, duply might look for the specified profile in /root/.duply. If that happens, specify the profile as a path, e.g. sudo duply ~/.duply/main backup.

gpg: decryption failed: No secret key

For me, this happened because I changed the GPG key after files had already been backed up. Duply retrieves a file from your backup and decrypts it to verify that decryption still works. Emptying the S3 bucket solved the problem, since duply then performed a full backup. Alternatively, you could switch to a different bucket when you change the GPG key.

There might be other reasons for this message, e.g. problems with GPG.