Blog CD Pipeline with AWS CodePipeline 22 Nov 2017

Jumped out of order from my earlier checklist and set up some automagic build and deploy. I'd wanted an excuse to try out CodePipeline, so this was it!

So, how does this blog work? It is deployed to an S3 bucket (skife.org) with CloudFront in front of it. CloudFront is set up to use the free (SNI-only) ACM certs to provide TLS. Previously, I pushed manually via s3cmd, which worked well enough after some incantation fiddling.

I won't write a full CodeBuild and CodePipeline tutorial; Amazon has that well covered. But a couple of bits were funny to work out, so I'll talk about those.

First, CodePipeline is what triggers everything. This is important because CodeBuild has no mechanism (that I could find) to care about only particular branches. CodeBuild really just does builds (kind of). Conceptually, drive everything through CodePipeline, and the other services are just steps that react to the pipeline.
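For example, the branch filter lives on the pipeline's source action, not on CodeBuild. A GitHub source action in the pipeline definition looks roughly like this (the owner and repo names here are placeholders, not my actual setup):

```json
{
  "name": "Source",
  "actionTypeId": {
    "category": "Source",
    "owner": "ThirdParty",
    "provider": "GitHub",
    "version": "1"
  },
  "configuration": {
    "Owner": "example-owner",
    "Repo": "example-blog-repo",
    "Branch": "master",
    "OAuthToken": "****"
  },
  "outputArtifacts": [{ "name": "SourceOutput" }]
}
```

Only commits on the named branch start the pipeline; the CodeBuild stages downstream never have to think about branches at all.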

Given this is a static site, the build step just builds a tarball:

version: 0.2

phases:
  install:
    commands:
      - wget https://github.com/gohugoio/hugo/releases/download/v0.31/hugo_0.31_Linux-64bit.deb
      - dpkg -i ./hugo_0.31_Linux-64bit.deb
  build:
    commands:
      - hugo
      - tar -C public -cvzf skife.org.tgz .
artifacts:
  files:
    - skife.org.tgz

Nothing fancy here, but having the artifact for CodePipeline to pass around is important.
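One detail worth noticing: the `-C public` flag packs the *contents* of `public/` at the archive root, rather than nesting everything under a `public/` prefix, which is what lets the deploy side sync straight to the bucket root. A quick local check (with made-up file names) shows the layout:

```shell
# Simulate Hugo's output directory with a couple of placeholder pages.
mkdir -p public/posts
echo '<html></html>' > public/index.html
echo '<html></html>' > public/posts/hello.html

# Same command as the build step: -C public puts contents at the archive root.
tar -C public -cvzf skife.org.tgz .

# Entries list as ./index.html and ./posts/hello.html, with no public/ prefix.
tar -tzf skife.org.tgz
```

Without `-C public`, the deploy step would end up syncing everything under an unwanted `public/` prefix in the bucket.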

My first pass just had the deploy at the end of the build, but I want to be able to insert some basic tests before deploying new versions: things like link verification, HTML5 validation, and maybe running stuff like Lighthouse against a test instance before letting it out (the test part :-) ). Because of this, I wanted to separate build from deploy. It turns out "deploy by copying into an S3 bucket" is not a thing CodeDeploy has any concept of.
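As a sketch of what that test stage could look like, a third CodeBuild step between build and deploy might unpack the artifact and run a link/markup checker over it. The html-proofer tool here is my assumption, not something already wired into this pipeline:

```yaml
version: 0.2
phases:
  install:
    commands:
      # html-proofer is a Ruby gem that checks links and HTML validity.
      - gem install html-proofer
  build:
    commands:
      - mkdir site
      - tar -xzf skife.org.tgz -C site
      # Skip external links to keep the stage fast and deterministic.
      - htmlproofer ./site --disable-external
```

If this stage fails, the pipeline stops and the deploy stage never runs, which is the whole point of splitting them.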

So my "deploy" is just another CodeBuild build:

version: 0.2
phases:
  build:
    commands:
      - tar -xf skife.org.tgz
      - rm skife.org.tgz
      - aws s3 sync . s3://skife.org --acl public-read --cache-control public,max-age=600

You can feed one build's output artifact to another build; it doesn't mind. When I configured the second build, I had to set it up with an output artifact or CodePipeline wouldn't let me add it. After I saved the CodePipeline changes, I could go back and remove that output. The other decent path is probably to set up a Lambda function that takes apart the tarball and copies things over... but this build approach seems simpler.

I tried to put a CloudFront invalidation into the last step as well, but the version of the aws CLI on the build image is old and doesn't support it. I'll sort that out later. Once I do, I'll change max-age to max-age=31536000 or so and add something like:
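One likely fix for the stale CLI (untested on this build image, so treat it as a sketch) is to upgrade awscli in the deploy build's install phase before the sync runs, assuming pip is available on the image:

```yaml
version: 0.2
phases:
  install:
    commands:
      # Upgrade awscli so `aws cloudfront create-invalidation` exists;
      # assumes the Ubuntu build image ships with pip.
      - pip install --upgrade awscli
```

After that, the invalidation command below should work as a normal build command.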

- aws cloudfront create-invalidation --distribution-id E1DTTO3T6ZPN9M --paths / /index.html /404.html /archive.html /index.xml

to the build commands, and voila!