What's the recommended way to pass a Docker image to the next job in a workflow?

In my case I have a workflow with four jobs:

  1. Job 1 builds a Docker image and then pushes it to Docker Hub.
  2. Job 2 runs unit tests.
  3. Job 3 runs security scans.
  4. Job 4 waits for the above 3 jobs to finish, then pulls the built Docker image from Docker Hub and runs some smoke tests on it.

Pulling the Docker image in job 4 wastes a lot of time. Is there a recommended way for passing Docker images between jobs in a workflow?

Ideally we’d like to combine jobs 1 and 4 for the best performance, but because jobs 2 and 3 are also dependencies of job 4, we can’t do that.
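For reference, the dependency shape described above looks roughly like this (a sketch only; job names, image names, and the smoke-test script are illustrative, not from the original post):

```yaml
jobs:
  build:            # job 1: build the image and push it to Docker Hub
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: |
          docker build -t myorg/myimage:ci .
          docker push myorg/myimage:ci
  unit_tests:       # job 2
    runs-on: ubuntu-latest
    steps:
      - run: echo "unit tests here"
  security_scans:   # job 3
    runs-on: ubuntu-latest
    steps:
      - run: echo "security scans here"
  smoke_tests:      # job 4: waits for all three, then pulls the image again
    needs: [build, unit_tests, security_scans]
    runs-on: ubuntu-latest
    steps:
      - run: |
          docker pull myorg/myimage:ci   # this pull is the slow step
          docker run --rm myorg/myimage:ci ./smoke-tests.sh
```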

15 Likes

Because different jobs run on different runners, we can’t pass a Docker image between jobs directly.

Is it possible to share the Dockerfile between job 1 and job 4 in your scenario, and then rebuild the image in job 4 instead of pulling it from Docker Hub?

You can use the upload-artifact and download-artifact actions to share data between jobs in a workflow.

https://help.github.com/en/actions/automating-your-workflow-with-github-actions/persisting-workflow-data-using-artifacts#passing-data-between-jobs-in-a-workflow

Pulling the image is faster than either re-building it or using the upload-artifact/download-artifact actions. I’ve tried both.

1 Like

GitHub provides the GitHub Package Registry, which can host your Docker images just like Docker Hub. It’s free for public repos, though there are some limits for private repos.

If you’d like to consider it, you can follow this document to push to and pull from the GitHub Package Registry.

https://help.github.com/en/github/managing-packages-with-github-packages/configuring-docker-for-use-with-github-packages#further-reading
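A step along the lines of the linked document might look roughly like this (a sketch only; OWNER, REPOSITORY, and IMAGE_NAME are placeholders, and the docker.pkg.github.com hostname is the GitHub Packages Docker registry described in that document):

```yaml
- name: Push image to GitHub Package Registry
  env:
    IMAGE: docker.pkg.github.com/OWNER/REPOSITORY/IMAGE_NAME:latest
  run: |
    # GITHUB_TOKEN is provided automatically to workflow runs
    echo "${{ secrets.GITHUB_TOKEN }}" | docker login docker.pkg.github.com -u "${{ github.actor }}" --password-stdin
    docker build -t "$IMAGE" .
    docker push "$IMAGE"
```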

@yanjingzhu that still has the same problem I mentioned in the original post.

Sorry to say, there is currently no other method to pass a Docker image to the next job in a workflow.

I think this is a great feature request. It is a common pattern to build a Docker image and then run tests in that image as a separate and distinct job. Other Git hosting services have this feature in their CI/CD pipelines.

26 Likes

Can you use steps instead of jobs? Steps can reuse artifacts from previous steps, and it sounds like the OP doesn’t have any need for running jobs in parallel; it’s a sequential workflow.

it sounds like the OP doesn’t have any need for running jobs in parallel, it’s a sequential workflow.

It’s not a sequential workflow since jobs 1, 2, and 3 run in parallel and job 4 waits for all of them to pass.

5 Likes

Yes, we should be able to build an image and then use it in subsequent steps; that’s vital to automating CI workflows.

7 Likes

+1

Need this

7 Likes

Kind of shocked that this doesn’t exist. I mean seriously? This is kind of a hallmark of any good CI/CD system. Why would you go and develop GitHub Actions in a box like this without borrowing from the many examples of modern CI/CD systems out there?

3 Likes

Yes, this feature is critical in a CI framework; some support for it would be very helpful!

+1 on criticality for container-based workflows (if the container is my artifact, then “just rebuild the container in a later job/step” means you’re not really testing the built artifact, just something that is probably identical).

@sean-krail As a workaround, you could build the image, tag it with a unique ID, push it to a registry, save the ID as an artifact, pass the artifact to subsequent jobs, and pull the image using that ID. It’s a bit clunky, but it should work.
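A rough sketch of that workaround (the registry, image name, and smoke-test script are illustrative placeholders, not from the thread):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build, tag with a unique ID, and push
        run: |
          IMAGE="myorg/myimage:${GITHUB_SHA}"   # unique tag per commit
          docker build -t "$IMAGE" .
          docker push "$IMAGE"
          echo "$IMAGE" > image-id.txt
      - uses: actions/upload-artifact@v2
        with:
          name: image-id
          path: image-id.txt

  smoke_tests:
    needs: [build]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: image-id
      - name: Pull and test the exact image built above
        run: |
          IMAGE=$(cat image-id.txt)
          docker pull "$IMAGE"
          docker run --rm "$IMAGE" ./smoke-tests.sh
```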

I came up with the following solution.

I run a registry instance in the build job of my GitHub Workflow and then share the volume of this registry as an artifact.

That way I can download the artifact in other jobs, restart the registry, and pull the image from it.

You can find a full example here: https://github.com/cdalvaro/docker-salt-master/blob/1b862039dc0acd709e972d3619c57b9ef614d6b5/.github/workflows/build-and-test.yml

The ideal solution would be to have a registry service shared for all jobs instead of having it restricted to a specific step.
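Per-job service containers already exist; the limitation is that each job gets its own fresh registry instance, which is why the volume has to be shuttled around as an artifact. A sketch of the per-job registry service (image name and port are illustrative):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    services:
      registry:             # job-local registry, torn down when the job ends
        image: registry:2
        ports:
          - 5000:5000
    steps:
      - uses: actions/checkout@v2
      - name: Push to the job-local registry
        run: |
          docker build -t localhost:5000/myimage:ci .
          docker push localhost:5000/myimage:ci
```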

You could just use docker save and docker load to save the image as a tar and pass it like any other artifact: https://docs.github.com/en/free-pro-team@latest/actions/guides/storing-workflow-data-as-artifacts

I tried that approach, but I wasn’t able to make it work because with buildx the outputs option can’t export a multi-platform image. The only way was to push it to a registry.

Currently facing the same issue, and right now Docker is unusable on public repos. There is no easy way to pass images between jobs other than artifacts, which means we can’t use the jobs.<job_id>.steps.uses syntax and have to run docker commands in the step’s shell instead. For security reasons, PRs opened from forks don’t get any secrets, meaning you can’t use a registry either.

I think being able to push images to GitHub Packages from workflows without requiring tokens or secrets might help a little bit.

@sean-krail Regarding the speed of uploading/downloading artifacts:

Uploading 62MB takes 25 seconds and downloading 62MB takes 3 seconds.

Uploading 348MB takes 2m15s and downloading 348MB takes 16 seconds.

I’ve tried using gzip, but the image size was only reduced to 345MB, so the gain is negligible here; it might be more significant for larger images.

Given that uploading (push) is done once and downloading (pull) is done multiple times, I’m quite happy with the speed.

Even if it takes 5 minutes to upload your image, it’ll take 30 seconds to download the image (more or less). I assume that a 1GB image takes more than 30 seconds to build (when there’s no cache), hence it’s better to use upload/download.

My conclusion: use upload/download artifacts. Here’s how I did it in

test.yml

on: [push]

jobs:
  docker_build:
    runs-on: ubuntu-latest
    name: Docker Build
    steps:
      - uses: actions/checkout@v2
      - name: Inject slug/short variables
        uses: rlespinasse/github-slug-action@v3.x    
      - name: Build Image
        env:
          DOCKER_ORG: unfor19
          DOCKER_REPO: install-aws-cli-action
          DOCKER_TAG: ${{ env.GITHUB_REF_SLUG }}
        run: |
          export DOCKER_FULLTAG="${DOCKER_ORG}/${DOCKER_REPO}:${DOCKER_TAG//\//-}"
          docker build -t "$DOCKER_FULLTAG" .
          mkdir -p path/to/artifacts
          docker save "$DOCKER_FULLTAG" > path/to/artifacts/docker-image.tar
          echo "$DOCKER_FULLTAG" > path/to/artifacts/docker-tag
      - uses: actions/upload-artifact@v2
        with:
          name: docker-artifact
          path: path/to/artifacts

  test_latest_version_v1:
    needs: docker_build
    runs-on: ubuntu-latest
    name: latest v1
    env:
      AWS_CLI_VERSION: 1   
    steps:    
      - name: Download Docker Image (Artifact)
        uses: actions/download-artifact@v2
        with:
          name: docker-artifact
          path: path/to/artifacts
      - name: Run test in Docker
        run: |
          cd path/to/artifacts
          docker load < docker-image.tar
          export DOCKER_FULLTAG=$(cat docker-tag)
          docker run --rm "$DOCKER_FULLTAG" $AWS_CLI_VERSION
      - uses: actions/checkout@v2 
      - name: Run test on Runner
        run: |
          sudo ./entrypoint.sh
1 Like