
Github Actions: Steps in Containers

Build pipelines and build agents in CI/CD products are not new. The latest generation of these has moved to Docker, which provides a clean, flexible build environment that streamlines installation steps and avoids tool-combination issues.

This created a new problem, however. Either you must:

  • Make your own Docker build agent and install what you need; or,
  • Move artifacts from build containers to deploy containers.

Getting Node and the AWS CLI onto the same machine can be surprisingly complicated in a controlled environment.

Github Actions has made a substantive improvement to this process. You can spin up a Docker container with your codebase automatically mounted. The process results can be saved to the mounted directory, and then the container goes away.


The initial example

We recently did a project that used Github Actions for continuous integration / deployment. The project followed a typical CloudFormation deployment process using the AWS CLI. Those steps are:

  • Checkout the code
  • Configure access to your AWS environment
    • This requires access keys and sets them to the proper ENV variables
  • Deploy the stack
    • In our situation there was no code to build. That would have to happen before this deploy, and you would need AWS SAM or some variant.

Here is a basic workflow. Put the following code in a file under .github/workflows in your repo, and the action will run each time code is pushed.
on: push

name: Build on every push to the infrastructure

jobs:
  deploy:
    name: Deployment
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v1

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: <Access Key: Use the secrets>
          aws-secret-access-key: <Secret Access Key: Use the secrets>
          aws-region: <Your Region: Use the secrets>

      - name: Deploy Base Stack
        run: |
          aws cloudformation deploy \
            --no-fail-on-empty-changeset \
            --template-file cloudformation.yaml \
            --stack-name <STACK_NAME> \
            --parameter-overrides file://parameters-$ENV.json

The ubuntu-latest environment gives you a VM that comes with the following installed:

  • Node
  • Go
  • Java / Maven
  • So very many more, it’s quite impressive
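A quick way to see what the runner gives you is a throwaway step that prints a few tool versions (a minimal sketch; the exact versions depend on the runner image):

```yaml
      - name: Show preinstalled tools
        run: |
          node --version
          go version
          java -version
          aws --version
```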

This preinstalled software list makes the build agent easy to use and a quick way to start, but it's not all that different from many competitors. The thing that sets Github Actions apart is the ability to use Docker within a single build step. Let's focus on an example.

Language Validation with PHPStan

Our use case was to lint a PHP repository. This is possible using Ubuntu and installing tools, but let's encapsulate it and keep PHPStan out of the vendor directory of our delivered artifact.

on: push

name: Static Code Analysis

jobs:
  analysis:
    name: Static Analysis with PHPStan
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v1
      - name: Analysis
        uses: docker://phpstan/phpstan
        with:
          args: analyze --error-format=table <code path>

Let’s break this down:

  • We pull a public docker container from Docker hub
  • We execute the container passing the args as the command.
  • Github Actions will attach our workspace to the working directory of the container and run the args command we passed
    • That means we can save the output and use it as an artifact
    • That means the installed dev tools won’t become part of the pipeline artifact.
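For example, the workspace output from a containerized step can be published with the upload-artifact action (a sketch; it assumes a previous step wrote a hypothetical report.txt into the workspace):

```yaml
      - name: Upload analysis report
        uses: actions/upload-artifact@v1
        with:
          name: analysis-report
          path: report.txt
```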

This is pretty similar to provided Actions. We could also codify this into a callable action and register it in our own repo or in a separate one. The difference is that this can call any container, and we don't have to create a repository or a separate part of the codebase for actions.

Running an arbitrary Docker container as a step

Running a Docker container as the agent for a single build step is related to actions but requires less overhead. It is the equivalent of an anonymous function in your CI/CD, abstracting the tool dependencies out of your pipeline execution.
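As a minimal sketch of that anonymous function, any public image will do; here Alpine's ls stands in for a real tool, and the args become the command run against the mounted workspace:

```yaml
      - name: Inspect the workspace from a throwaway container
        uses: docker://alpine
        with:
          args: ls -la
```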

Taking this idea further

Internal Tools and Secrets Abstraction

If you have internal containers with internal tools, you can host them on an authenticated Docker registry and add them to your pipeline. You can avoid putting code that might reveal secrets into your pipeline and instead call the container.
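One hedged sketch of that pattern, assuming a hypothetical internal registry, image name, and secret names (none of these come from the project above): log in with stored credentials, then run the internal tool against the checked-out code.

```yaml
      - name: Log in to the internal registry
        run: echo "${{ secrets.REGISTRY_PASSWORD }}" | docker login registry.example.com --username "${{ secrets.REGISTRY_USER }}" --password-stdin

      - name: Run the internal tool
        run: docker run --rm -v "$PWD:/workspace" -w /workspace registry.example.com/tools/internal-linter:latest
```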


If you need code to be maintained internally and updated by a separate team, this helps as well. You could also set permissions to obscure the container from the dev team altogether. They don't have to bother with lots of code in the pipeline, and you get to manage your own compliance process.

And finally a Recap

Build pipelines and build agents with Docker are not new. But running a specific step in a container that is then cleaned up, so you can perform discrete steps with discrete tools, is powerful. This is one tool in the CI/CD quiver, but I'm excited about the abstraction possibilities and the cleanliness it provides.
