Clean Up Large S3 Buckets

I found a neat Python tool called s3wipe, which brings significant speed improvements when deleting extremely large S3 buckets. It achieves this by using multiple threads and batch deletes. It recently helped me out when deleting buckets containing several million objects and versions.

Example Usage

Empty a bucket of all objects, and delete the bucket when done:

```bash
BUCKET_NAME=project-files-public

docker run -it --rm slmingol/s3wipe \
  --id ${AWS_ACCESS_KEY_ID} \
  --key ${AWS_SECRET_ACCESS_KEY} \
  --path "s3://${BUCKET_NAME}" \
  --delbucket
```

Remove all objects and versions with a certain prefix, but retain the bucket....
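The excerpt cuts off before the prefix example, but a plausible invocation can be sketched from the flags already shown above, assuming s3wipe accepts a key prefix as part of its --path argument. The `tmp/` prefix is hypothetical, and omitting --delbucket leaves the bucket itself in place:

```bash
# Hypothetical sketch: delete all objects and versions under the "tmp/"
# prefix, reusing only the flags shown in the example above.
BUCKET_NAME=project-files-public

docker run -it --rm slmingol/s3wipe \
  --id ${AWS_ACCESS_KEY_ID} \
  --key ${AWS_SECRET_ACCESS_KEY} \
  --path "s3://${BUCKET_NAME}/tmp/"
```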

September 20, 2019

How to generate temporary download links to S3 objects

S3 has a feature that allows you to generate signed URLs which are valid only for a predefined period of time. This makes it much safer to distribute URLs via email, Slack, etc.

Process

1. Find the object in the S3 console and note the bucket name and object path.
2. Ensure your AWS credentials are loaded into your environment.
3. Use the AWS CLI to create a pre-signed URL:

```bash
# TTL is the number of seconds until the URL expires....
```
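The excerpt cuts off before the command itself; a minimal sketch using the AWS CLI's standard `aws s3 presign` subcommand would look like the following (the bucket name and object path are hypothetical):

```bash
# TTL is the number of seconds until the URL expires.
TTL=3600

# Hypothetical bucket and key, for illustration only.
aws s3 presign "s3://project-files-public/reports/summary.pdf" \
  --expires-in ${TTL}
```

The command prints a signed URL that stops working once the TTL elapses.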

March 5, 2019

How to build self-deploying applications with Terraform and BitBucket Pipelines

Background

A few weeks ago I decided to replace my ageing and bloated Drupal 7 blog. I settled on the following criteria that the solution had to meet:

- The project git repo must be private.
- Hosting infrastructure had to be under my control and completely codified.
- The solution should require very little supporting infrastructure, such as databases.
- Deployment of changes to the site or infrastructure must be automated.

These requirements immediately ruled out a few options, including GitHub Pages and SaaS blogging platforms like WordPress....
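To illustrate the automation requirement above, here is a minimal sketch of the deploy steps such a pipeline might run. This is an assumption about the setup, not the post's actual pipeline; the bucket name and the `./public` output directory are hypothetical:

```bash
#!/usr/bin/env bash
# Hypothetical CI deploy script: apply infrastructure changes, then
# publish the generated site. All names and paths are illustrative.
set -euo pipefail

# Provision or update the codified hosting infrastructure.
terraform init -input=false
terraform apply -input=false -auto-approve

# Sync the built static site to the hosting bucket, removing stale files.
aws s3 sync ./public "s3://my-blog-bucket" --delete
```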

November 17, 2016