.env to GitHub Environment Variables

A script for uploading dotenv files to GitHub environments:

```bash
#!/bin/bash
PAGER="" # Avoid pager when using zsh

# Check if the correct number of arguments are passed
if [ "$#" -ne 2 ]; then
  echo "Usage: $0 <org/repo> <environment> < ...
```

....

<span title='2024-03-21 00:00:00 +0000 UTC'>March 21, 2024</span>&nbsp;·&nbsp;2 min

An ECS -> RDS Security Group Script

Below is a simple script that lets a user alter an RDS database's security groups to allow access from an ECS service. Useful when we have an observability tool running in ECS that wants to add RDS data connections.

```python
from typing import List, Dict

import boto3
from botocore...
```

....

<span title='2024-02-08 00:00:00 +0000 UTC'>February 8, 2024</span>&nbsp;·&nbsp;3 min

Normalizing heterogeneous decimal Ion data in Athena

Recently, we exported data from a DynamoDB table to S3 in AWS Ion format. Because the DynamoDB table had varied formats for some numeric properties, the export serialized these numeric data columns in a few different formats: as a decimal (1234.), as an Ion decimal type (1234d0), and as a string ("1234"). However, we want to be able to treat these values as a bigint within our Athena queries....

<span title='2023-08-21 00:00:00 +0000 UTC'>August 21, 2023</span>&nbsp;·&nbsp;2 min

Auto-assume an IAM role before running a command

A convenience function to assume an IAM role via STS before running a command. Add the following to your ~/.zshrc (or equivalent) file:

```shell
function with-role {
  readonly role_arn=${1:?"The role_arn must be specified."}

  env -S $(
    aws sts assume-role \
      --role-arn ${role_arn} \
      --role-session-name ${USER} \
    | \
    jq -r '.Credentials | "
      AWS_ACCESS_KEY_ID=\(.AccessKeyId)
      AWS_SECRET_ACCESS_KEY=\(.SecretAccessKey)
      AWS_SESSION_TOKEN=\(.SessionToken)
    "'
  ) ${@:2}
}
```

This assumes that you have both the AWS CLI and jq installed....

<span title='2022-09-08 00:00:00 +0000 UTC'>September 8, 2022</span>&nbsp;·&nbsp;1 min

SSH tunnels in Python

At times, a developer may need to access infrastructure not available on the public internet. A common example of this is accessing a database located in a private subnet, as described in the VPC Scenario docs: Instances in the private subnet are back-end servers that don’t need to accept incoming traffic from the internet and therefore do not have public IP addresses; however, they can send requests to the internet using the NAT gateway....

<span title='2021-09-17 00:00:00 +0000 UTC'>September 17, 2021</span>&nbsp;·&nbsp;3 min

Getting area of WGS-84 geometries in SqKm

Getting area of geometries in WGS-84/EPSG:4326 in square kilometers:

```sql
SELECT
  ST_Area(geometry, false) / 10^6 sq_km
FROM
  my_table
```

<span title='2021-07-17 00:00:00 +0000 UTC'>July 17, 2021</span>&nbsp;·&nbsp;1 min

Concurrent Python Example Script

Below is a very simple example of a script that I write and re-write more often than I would like to admit. It reads input data from a CSV and processes each row concurrently. A progress bar provides updates. Honestly, it’s pretty much just the concurrent.futures ThreadPoolExecutor example plus a progress bar.
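The pattern can be sketched as follows; a plain stderr counter stands in here for the progress bar, and the handler function and CSV layout are placeholders:

```python
import sys
from concurrent.futures import ThreadPoolExecutor, as_completed

def process_rows(rows, handler, max_workers=8):
    """Run handler over each row concurrently, reporting progress to stderr."""
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(handler, row) for row in rows]
        for done, future in enumerate(as_completed(futures), start=1):
            results.append(future.result())
            sys.stderr.write(f"\r{done}/{len(futures)} rows processed")
    sys.stderr.write("\n")
    return results

# Typical use: process_rows(csv.DictReader(open("input.csv")), handle_row)
```

Note that results arrive in completion order, not input order; if ordering matters, `ThreadPoolExecutor.map` preserves it.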

<span title='2021-02-19 13:42:21 -0700 -0700'>February 19, 2021</span>&nbsp;·&nbsp;1 min

An ECR Deployment Script

Below is a simple script to deploy a Docker image to ECR…

```bash
set -e

log () {
  local bold=$(tput bold)
  local normal=$(tput sgr0)
  echo "${bold}${1}${normal}" 1>&2;
}

if [ -z "${AWS_ACCOUNT}" ]; then
  log "Missing a valid AWS_ACCOUNT env variable";
  exit 1;
else
  log "Using AWS_ACCOUNT '${AWS_ACCOUNT}'";
fi

AWS_REGION=${AWS_REGION:-us-east-1}
REPO_NAME=${REPO_NAME:-my/repo}

log "🔑 Authenticating ...
```

....

<span title='2020-11-09 00:00:00 +0000 UTC'>November 9, 2020</span>&nbsp;·&nbsp;1 min

How to generate a database URI from an AWS Secret

A quick note about how to generate a database URI (or any other derived string) from an AWS SecretsManager SecretTargetAttachment (such as what's provided via an RDS DatabaseInstance's secret property).

```python
db = rds.DatabaseInstance(
    # ...
)

db_val = lambda field: db.secret.secret_value_from_json(field).to_string()

task_definition.add_container(
    environment=dict(
        # ...
        PGRST_DB_URI=f"postgres://{db_val('username')}:{db_val('password')}@{db_val('host')}:{db_val('port')}/",
    ),
    # ...
)
```

<span title='2020-06-15 00:00:00 +0000 UTC'>June 15, 2020</span>&nbsp;·&nbsp;1 min

Boilerplate for S3 Batch Operation Lambda

S3 Batch Operations provides a simple way to process a number of files stored in an S3 bucket with a Lambda function. However, the Lambda function must return particular response codes. Below is an example of a Lambda function written in Python that works with AWS S3 Batch Operations.
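A minimal sketch of such a handler, following the request/response schema documented for S3 Batch Operations Lambda invocations; the per-object work itself is stubbed out and would be filled in (e.g. with boto3 calls) in practice:

```python
def handler(event, context=None):
    """Skeleton S3 Batch Operations Lambda handler.

    Returns one result per task, each with a resultCode of
    "Succeeded", "TemporaryFailure", or "PermanentFailure".
    """
    results = []
    for task in event["tasks"]:
        try:
            # ... do the real per-object work here (e.g. via boto3) ...
            result_code, result_string = "Succeeded", task["s3Key"]
        except Exception as exc:
            result_code, result_string = "PermanentFailure", str(exc)
        results.append({
            "taskId": task["taskId"],
            "resultCode": result_code,
            "resultString": result_string,
        })
    return {
        "invocationSchemaVersion": event["invocationSchemaVersion"],
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": results,
    }
```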

<span title='2019-12-20 00:00:00 +0000 UTC'>December 20, 2019</span>&nbsp;·&nbsp;1 min

Parsing S3 Inventory CSV output in Python

S3 Inventory is a great way to access a large number of keys in an S3 Bucket. Its output is easily parsed by AWS Athena, enabling queries across the key names (e.g. find all keys ending with .png). However, sometimes you just need to list all of the keys mentioned in the S3 Inventory output (e.g. populating an SQS queue with every key name mentioned in an inventory output). The following code is an example of doing such a task in Python:
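As a sketch of the parsing step, assuming the default CSV inventory layout (bucket in column 0, URL-encoded object key in column 1); fetching the gzipped data files from S3 and the SQS enqueueing are left out:

```python
import csv
from urllib.parse import unquote_plus

def iter_inventory_keys(lines):
    """Yield decoded object keys from S3 Inventory CSV lines.

    Assumes the default layout: column 0 is the bucket,
    column 1 is the URL-encoded object key.
    """
    for row in csv.reader(lines):
        if row:
            yield unquote_plus(row[1])

# In practice the inventory data files are gzipped CSVs in S3, so `lines`
# would typically come from gzip.open(...) over a downloaded data file.
```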

<span title='2019-12-16 00:00:00 +0000 UTC'>December 16, 2019</span>&nbsp;·&nbsp;1 min

A PIL-friendly class for S3 objects

Here’s a quick example of creating a file-like object in Python that represents an object on S3 and plays nicely with PIL. This ended up being overkill for my needs but I figured somebody might get some use out of it.
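A sketch of the idea: read the whole object into an in-memory buffer so PIL sees an ordinary seekable file. The injected client would be a boto3 S3 client in practice, and the bucket and key names are placeholders:

```python
import io

class S3ImageFile(io.BytesIO):
    """A file-like object backed by an S3 object, usable with PIL.Image.open()."""

    def __init__(self, s3, bucket, key):
        # Read the full object body into memory so reads and seeks are cheap.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        super().__init__(body)
        self.name = f"s3://{bucket}/{key}"

# e.g. Image.open(S3ImageFile(boto3.client("s3"), "my-bucket", "cat.png"))
```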

<span title='2019-12-11 00:00:00 +0000 UTC'>December 11, 2019</span>&nbsp;·&nbsp;1 min

Using CloudFormation's Fn::Sub with Bash parameter substitution

Let’s say that you need to inject a large bash script into a CloudFormation AWS::EC2::Instance Resource’s UserData property. CloudFormation makes this easy with the Fn::Base64 intrinsic function:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  VPNServerInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-efd0428f
      InstanceType: m3.medium
      UserData:
        Fn::Base64: |
          #!/bin/sh
          echo "Hello world"
```

In your bash script, you may even want to reference a parameter created elsewhere in the CloudFormation template....

<span title='2018-04-30 00:00:00 +0000 UTC'>April 30, 2018</span>&nbsp;·&nbsp;3 min

Serve an Esri Web AppBuilder web app from HTTP

When an Esri Web AppBuilder web app is configured with a portalUrl value served from HTTPS, the web app automatically redirects users to HTTPS when visited via HTTP. While this is best practice in production, it can be a burden in development when you want to quickly run a local version of the web app. Below is a quick script written with Python standard libraries to serve a web app over HTTP....
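For reference, the core of such a script can be as small as the following stdlib-only sketch; the default port and directory here are assumptions, not the post's exact values:

```python
import functools
import http.server
import socketserver

def make_server(directory=".", port=8000):
    """Build a plain-HTTP static file server rooted at `directory`."""
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=directory
    )
    return socketserver.TCPServer(("", port), handler)

# e.g.:
#   with make_server("my-web-app/") as httpd:
#       httpd.serve_forever()
```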

<span title='2018-03-28 00:00:00 +0000 UTC'>March 28, 2018</span>&nbsp;·&nbsp;1 min

Learning AngularJS

Here is a quick dump of some of the better resources that I came across while learning AngularJS. StackOverflow: How to ’think in AngularJS’ - Great for getting the appropriate mindset. egghead.io’s AngularJS series by John Lindquist - Excellently cut up into discrete segments to cover fundamentals. Introduction to AngularJS - First in a series of developing an Angular app. Then watch End to End with Angular JS, Security, Frontend Workflows, and Testing.

<span title='2014-03-22 00:00:00 +0000 UTC'>March 22, 2014</span>&nbsp;·&nbsp;1 min

SublimeText3 Setup

As I was transitioning from SublimeText2 to SublimeText3, it became apparent that I should keep a copy of my favorite text editor’s plugins and settings.

<span title='2014-02-03 00:00:00 +0000 UTC'>February 3, 2014</span>&nbsp;·&nbsp;1 min

Natural Language Toolkit Notes

I’ve been experimenting with Python’s Natural Language Toolkit, following along with Steven Bird, Ewan Klein, and Edward Loper’s book “Natural Language Processing with Python — Analyzing Text with the Natural Language Toolkit” (pdf version). So far, the book’s been great. As I’m going through the book, I’ve been writing down notes relating to the book’s examples. I’ve made a GitHub repo to store these notes and any experiments that I may be doing using the NLTK here....

<span title='2013-08-25 00:00:00 +0000 UTC'>August 25, 2013</span>&nbsp;·&nbsp;1 min

pushd and popd forever

Becoming tired of typing paths repeatedly in the terminal, I realized that I should be using pushd and popd to navigate directory structures. For those uninitiated, pushd changes your current directory in a similar fashion to cd but additionally adds the former directory to a stack. You can later return to the former directory by executing popd, popping it from the directory history. Unfortunately, the commands pushd and popd both require at least twice as many characters to type as cd and additionally come with the overhead of having to learn to use a new command instead of something that is nearly instinctual....

<span title='2013-03-02 00:00:00 +0000 UTC'>March 2, 2013</span>&nbsp;·&nbsp;2 min

SSH Port Forwarding

The other week I found myself up at 2am in Canada setting up a VPN between my home computer (running Ubuntu) in Seattle and my laptop <partyhard.jpg>. I had enabled SSH access on my home computer and had set up port forwarding on my router to allow for access from the outside world ahead of time, but had forgotten that I would need to have a port forwarded for the VPN server as well....

<span title='2013-02-25 00:00:00 +0000 UTC'>February 25, 2013</span>&nbsp;·&nbsp;1 min