Uploading Massive Backups to Amazon Glacier via boto

This is an example of how to use the boto library in Python to perform large, multipart, concurrent uploads to Amazon Glacier.

Notes

  1. The current version of the library (2.38.0) breaks multipart uploads under Python 2.7.
  2. The version we’re using for multipart uploads (2.29.1) is broken for Python 3, as are all other adjacent versions.
  3. Because of (1) and (2), we’re using version 2.29.1 under Python 2.7 and suggest that you do the same.
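To make the pin from note (3) explicit and guard against an accidental upgrade, you can install the known-good version directly (assumes a pip-based install under Python 2.7):

```shell
# Pin the boto version that works for multipart Glacier uploads on Python 2.7
pip install 'boto==2.29.1'
```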

Example

#!/usr/bin/env python2.7

import os.path

import boto.glacier.layer2

def upload(access_key, secret_key, vault_name, filepath, description):
    # Layer2 is boto's high-level Glacier interface.
    l = boto.glacier.layer2.Layer2(
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key)

    v = l.get_vault(vault_name)

    # Performs the concurrent, multipart upload and returns the archive ID,
    # which is the handle you'll need to retrieve the archive later.
    archive_id = v.concurrent_create_archive_from_file(
        filepath,
        description)

    print(archive_id)

if __name__ == '__main__':
    access_key = 'XXX'
    secret_key = 'YYY'
    vault_name = 'images'
    filepath = '/mnt/array/backups/big_archive.xz'
    description = os.path.basename(filepath)

    upload(access_key, secret_key, vault_name, filepath, description)
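Glacier vault inventories can take hours to generate, so the printed archive ID is effectively your only prompt handle on the upload. One way to keep track of it (not part of the original script; `record_archive_id` and the JSON index format are assumptions for illustration) is a small local index, using only the standard library:

```python
import json
import os

def record_archive_id(index_path, vault_name, description, archive_id):
    """Append a (vault, description, archive_id) record to a local JSON index.

    Keeping a local index of uploaded archive IDs makes later retrieval
    practical, since Glacier offers no quick lookup by name.
    """
    records = []
    if os.path.exists(index_path):
        with open(index_path) as f:
            records = json.load(f)
    records.append({'vault': vault_name,
                    'description': description,
                    'archive_id': archive_id})
    with open(index_path, 'w') as f:
        json.dump(records, f, indent=2)
    return records
```

For example, calling `record_archive_id('glacier_index.json', vault_name, description, archive_id)` right after the upload keeps an append-only log you can consult when you need to restore.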