Hosting websites from private S3 buckets

At work, we were alerted to an outage of an S3-backed frontend. The frontend was returning 403 responses. This left us scratching our heads, as no deployment had occurred recently. After doing some digging, we found that AWS account administrators had applied a new policy to make all S3 buckets private (this is an account-wide setting, overriding bucket-level settings). 🆒 🆒 🆒 So how can we configure CloudFront to access private S3 buckets?...
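One common answer (not necessarily the exact fix described in the post) is an Origin Access Control-style bucket policy: the bucket stays private, and only the CloudFront service principal, scoped to one distribution, may read objects. A minimal sketch, with the bucket name and distribution ARN as placeholders:

```python
# Hedged sketch: grant the CloudFront service principal read access to a
# private bucket, scoped to a single distribution. Bucket and distribution
# values below are hypothetical.
import json
import boto3

BUCKET = "my-frontend-bucket"  # hypothetical bucket name
DISTRIBUTION_ARN = "arn:aws:cloudfront::123456789012:distribution/EXAMPLE"  # hypothetical ARN

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCloudFrontServicePrincipalReadOnly",
            "Effect": "Allow",
            "Principal": {"Service": "cloudfront.amazonaws.com"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            # Only requests made on behalf of this distribution are allowed.
            "Condition": {"StringEquals": {"AWS:SourceArn": DISTRIBUTION_ARN}},
        }
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```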

April 19, 2024 · 4 min

An ECS -> RDS Security Group Script

Below is a simple script that lets a user alter an RDS database's security groups to allow access from an ECS service. Useful when we have an observability tool running in ECS that wants to add RDS data connections. from typing import List, Dict import boto3 from botocore....
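The excerpt is cut off, but the core call such a script builds up to looks roughly like the sketch below (a hedged illustration, not the post's full script; the DB identifier and security group ID are placeholders): look up the RDS instance's security group and port, then authorize ingress from the ECS service's security group.

```python
# Minimal sketch: allow the ECS service's security group to reach the RDS
# instance's security group on the database port. Identifiers are hypothetical.
import boto3

ec2 = boto3.client("ec2")
rds = boto3.client("rds")

DB_IDENTIFIER = "my-database"             # hypothetical RDS instance identifier
ECS_SERVICE_SG = "sg-0123456789abcdef0"   # hypothetical ECS service security group

db = rds.describe_db_instances(DBInstanceIdentifier=DB_IDENTIFIER)["DBInstances"][0]
db_sg = db["VpcSecurityGroups"][0]["VpcSecurityGroupId"]
db_port = db["Endpoint"]["Port"]

# Reference the ECS security group rather than a CIDR range.
ec2.authorize_security_group_ingress(
    GroupId=db_sg,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": db_port,
        "ToPort": db_port,
        "UserIdGroupPairs": [{
            "GroupId": ECS_SERVICE_SG,
            "Description": "Access from ECS service",
        }],
    }],
)
```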

February 8, 2024 · 3 min

Normalizing heterogeneous decimal Ion data in Athena

Recently, we exported data from a DynamoDB table to S3 in AWS Ion format. However, because the DynamoDB table had varied formats for some numeric properties, the export serialized these numeric columns in a few different formats: as a decimal (1234.), as an Ion decimal type (1234d0), and as a string ("1234"). We want to be able to treat these values as a bigint within our Athena queries....
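Purely as an illustration of the kind of normalization involved (the post's actual approach may differ), and assuming the column reaches Athena as a string named amount in a table named exported_table, one could strip the Ion suffixes and cast:

```python
# Hedged sketch: coerce the mixed representations ("1234.", "1234d0", "1234")
# to bigint. Table, column, database, and output bucket names are hypothetical.
import boto3

QUERY = """
SELECT
  TRY_CAST(regexp_replace(amount, '(d0|\\.)$', '') AS bigint) AS amount_bigint
FROM exported_table
"""

athena = boto3.client("athena")
execution = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "my_database"},                   # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},   # hypothetical bucket
)
print(execution["QueryExecutionId"])
```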

August 21, 2023 · 2 min

Security-conscious cloud deployments from GitHub Actions via OpenID Connect

Goals This ticket is focused on how we can securely deploy to a major cloud provider environment (e.g. AWS, Azure, GCP) from within our GitHub Actions workflows. Why is this challenging? A naive solution to this problem is to generate some cloud provider credentials (e.g. AWS Access Keys) and store them as a GitHub Secret. Our GitHub Actions workflows can then use these credentials. However, this technique raises a number of concerns:...
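On the AWS side, the OIDC approach boils down to registering GitHub's token issuer as an identity provider and creating a role whose trust policy only lets workflows from a specific repository assume it. A hedged sketch of that setup (the post may use different tooling; the org/repo, role name, and thumbprint are placeholders):

```python
# Hedged sketch of the AWS-side OIDC setup for GitHub Actions.
import json
import boto3

iam = boto3.client("iam")

provider = iam.create_open_id_connect_provider(
    Url="https://token.actions.githubusercontent.com",
    ClientIDList=["sts.amazonaws.com"],
    ThumbprintList=["6938fd4d98bab03faadb97b34396831e3780aea1"],  # example thumbprint
)

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Federated": provider["OpenIDConnectProviderArn"]},
        "Action": "sts:AssumeRoleWithWebIdentity",
        "Condition": {
            "StringEquals": {"token.actions.githubusercontent.com:aud": "sts.amazonaws.com"},
            # Only workflows in this repository may assume the role.
            "StringLike": {"token.actions.githubusercontent.com:sub": "repo:my-org/my-repo:*"},
        },
    }],
}

iam.create_role(
    RoleName="github-actions-deploy",  # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
```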

December 20, 2021 · 6 min

An ECR Deployment Script

Below is a simple script to deploy a Docker image to ECR… set -e log () { local bold=$(tput bold) local normal=$(tput sgr0) echo "${bold}${1}${normal}" 1>&2; } if [ -z "${AWS_ACCOUNT}" ]; then log "Missing a valid AWS_ACCOUNT env variable"; exit 1; else log "Using AWS_ACCOUNT '${AWS_ACCOUNT}'"; fi AWS_REGION=${AWS_REGION:-us-east-1} REPO_NAME=${REPO_NAME:-my/repo} log "🔑 Authenticating....
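For comparison, the "🔑 Authenticating" step the script performs (presumably via the AWS CLI and Docker) can also be done with boto3. A hedged sketch, not part of the post's script:

```python
# Hedged sketch: fetch an ECR auth token and decode it into credentials that
# could be fed to `docker login`.
import base64
import boto3

ecr = boto3.client("ecr", region_name="us-east-1")
auth = ecr.get_authorization_token()["authorizationData"][0]

# The token is base64-encoded "username:password".
username, password = base64.b64decode(auth["authorizationToken"]).decode().split(":", 1)
registry = auth["proxyEndpoint"]  # e.g. https://<account>.dkr.ecr.us-east-1.amazonaws.com

print(f"docker login --username {username} --password-stdin {registry}")
```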

November 9, 2020 · 1 min

Using CloudFront as a Reverse Proxy

Alternate title: How to be master of your domain. The basic idea of this post is to demonstrate how CloudFront can be used as a serverless reverse proxy, allowing you to host all of your application’s content and services from a single domain. This minimizes a project’s TLD footprint while providing project organization and performance along the way. Why Within large organizations, bureaucracy can make it a challenge to obtain a subdomain for a project....
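The core trick is routing different path patterns to different origins behind one distribution. A minimal sketch in CDK v2 Python (the post predates this API and may configure things differently; the bucket and backend hostname are hypothetical):

```python
# Hedged sketch: serve a static site from S3 by default, and proxy /api/* to a
# separate backend, all under a single domain.
from aws_cdk import App, Stack
from aws_cdk import aws_cloudfront as cloudfront
from aws_cdk import aws_cloudfront_origins as origins
from aws_cdk import aws_s3 as s3

app = App()
stack = Stack(app, "ReverseProxyStack")

site_bucket = s3.Bucket(stack, "SiteBucket")  # hypothetical static-site bucket

cloudfront.Distribution(
    stack, "Distribution",
    default_behavior=cloudfront.BehaviorOptions(origin=origins.S3Origin(site_bucket)),
    additional_behaviors={
        # API traffic is proxied to a separate service behind the same domain.
        "/api/*": cloudfront.BehaviorOptions(
            origin=origins.HttpOrigin("api.internal.example.com"),  # hypothetical backend
            allowed_methods=cloudfront.AllowedMethods.ALLOW_ALL,
            cache_policy=cloudfront.CachePolicy.CACHING_DISABLED,
        ),
    },
)

app.synth()
```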

October 2, 2020 · 10 min

How to generate a database URI from an AWS Secret

A quick note about how to generate a database URI (or any other derived string) from an AWS SecretsManager SecretTargetAttachment (such as what’s provided via an RDS DatabaseInstance’s secret property). db = rds.DatabaseInstance( # ... ) db_val = lambda field: db.secret.secret_value_from_json(field).to_string() task_definition.add_container( environment=dict( # ... PGRST_DB_URI=f"postgres://{db_val('username')}:{db_val('password')}@{db_val('host')}:{db_val('port')}/", ), # ... )

June 15, 2020 · 1 min

Tips for working with a large number of files in S3

I would argue that S3 is basically AWS’ best service. It’s super cheap, it’s basically infinitely scalable, and it never goes down (except for when it does). Part of its beauty is its simplicity. You give it a file and a key to identify that file, and you can have faith that it will store it without issue. You give it a key, and you can have faith that it will return the file represented by that key, assuming there is one....
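As a hedged illustration of that key/value model at scale (not necessarily one of the post's specific tips): list_objects_v2 returns at most 1,000 keys per call, so walking a large bucket means paginating.

```python
# Hedged sketch: iterate over every key under a prefix using a paginator.
# Bucket and prefix names are hypothetical.
import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

total = 0
for page in paginator.paginate(Bucket="my-big-bucket", Prefix="data/"):
    for obj in page.get("Contents", []):
        total += 1  # each object is just a key pointing at some bytes

print(f"Saw {total} keys")
```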

May 30, 2020 · 7 min

Boilerplate for S3 Batch Operation Lambda

S3 Batch Operations provide a simple way to process a number of files stored in an S3 bucket with a Lambda function. However, the Lambda function must return particular response codes. Below is an example of a Lambda function written in Python that works with AWS S3 Batch Operations.
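A hedged sketch of the shape such a handler takes (the post's actual boilerplate may differ): echo back the invocation schema version and ID, and answer each task with a resultCode of "Succeeded", "TemporaryFailure", or "PermanentFailure".

```python
# Hedged sketch of an S3 Batch Operations Lambda handler and its response contract.
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    results = []
    for task in event["tasks"]:
        bucket = task["s3BucketArn"].split(":::")[-1]
        key = task["s3Key"]
        try:
            s3.head_object(Bucket=bucket, Key=key)  # placeholder for real per-object work
            result_code, result_string = "Succeeded", ""
        except Exception as exc:  # report failures back to Batch Operations
            result_code, result_string = "PermanentFailure", str(exc)
        results.append({
            "taskId": task["taskId"],
            "resultCode": result_code,
            "resultString": result_string,
        })
    return {
        "invocationSchemaVersion": event["invocationSchemaVersion"],
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": results,
    }
```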

December 20, 2019 · 1 min

Using CloudFormation's Fn::Sub with Bash parameter substitution

Let’s say that you need to inject a large bash script into a CloudFormation AWS::EC2::Instance Resource’s UserData property. CloudFormation makes this easy with the Fn::Base64 intrinsic function: AWSTemplateFormatVersion: '2010-09-09' Resources: VPNServerInstance: Type: AWS::EC2::Instance Properties: ImageId: ami-efd0428f InstanceType: m3.medium UserData: Fn::Base64: | #!/bin/sh echo "Hello world" In your bash script, you may even want to reference a parameter created elsewhere in the CloudFormation template....
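The tension the post is heading toward is that Fn::Sub and bash both use ${} syntax. A hedged sketch with hypothetical names, kept in the excerpt's own CloudFormation YAML: ${VPNServerEIP} is substituted by CloudFormation at deploy time, while ${!HOME} is the documented escape that renders as a literal ${HOME} for bash to expand at runtime.

```yaml
# Hedged sketch (hypothetical resource and variable names).
UserData:
  Fn::Base64:
    Fn::Sub: |
      #!/bin/sh
      echo "Elastic IP is ${VPNServerEIP}"    # substituted by CloudFormation
      echo "Home directory is ${!HOME}"       # rendered as ${HOME}, expanded by bash
```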

April 30, 2018 · 3 min