What's the recommended way to pass a Docker image to the next job in a workflow

In my case I have a workflow with four jobs:

  1. Job 1 builds a Docker image and then pushes it to Docker Hub.
  2. Job 2 runs unit tests.
  3. Job 3 runs security scans.
  4. Job 4 waits for the above 3 jobs to finish, then pulls the built Docker image from Docker Hub and runs some smoke tests on it.

Pulling the Docker image in job 4 wastes a lot of time. Is there a recommended way to pass Docker images between jobs in a workflow?

Ideally we’d like to combine jobs 1 and 4 for the best performance, but because jobs 2 and 3 are dependencies of job 4, we can’t do that.

14 Likes

Since different jobs run on different runners, we can’t pass a Docker image between jobs directly.

Is it possible to share the Dockerfile between job 1 and job 4 in your scenario, and then build the image again in job 4 instead of pulling it from Docker Hub?

You can use the upload-artifact and download-artifact actions to share data between jobs in a workflow.

https://help.github.com/en/actions/automating-your-workflow-with-github-actions/persisting-workflow-data-using-artifacts#passing-data-between-jobs-in-a-workflow
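A minimal sketch of passing a file between jobs with those actions (job names, artifact name, and action versions are illustrative):

```yaml
jobs:
  produce:
    runs-on: ubuntu-latest
    steps:
      - run: echo "hello" > data.txt
      # Upload the file so later jobs can fetch it
      - uses: actions/upload-artifact@v2
        with:
          name: my-data
          path: data.txt

  consume:
    needs: produce
    runs-on: ubuntu-latest
    steps:
      # Download the artifact produced by the previous job
      - uses: actions/download-artifact@v2
        with:
          name: my-data
      - run: cat data.txt
```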

Pulling the image is faster than re-building the image and using the upload-artifact/download-artifact actions. I’ve tried both.

1 Like

GitHub provides GitHub Package Registry, which can host your Docker images just like the Docker Hub registry. It is free for public repos, but there are some limits for private repos.

If you’d like to consider it, you could follow this document to push to and pull from GitHub Package Registry.

https://help.github.com/en/github/managing-packages-with-github-packages/configuring-docker-for-use-with-github-packages#further-reading
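Roughly, pushing an image there from a workflow looks like this (the image name is a placeholder, and the `docker.pkg.github.com` host and `GITHUB_TOKEN` login follow the linked document):

```yaml
steps:
  - uses: actions/checkout@v2
  - name: Log in to GitHub Package Registry
    run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login docker.pkg.github.com -u ${{ github.actor }} --password-stdin
  - name: Build and push the image
    run: |
      # Tag with the commit SHA so each build is uniquely addressable
      docker build -t docker.pkg.github.com/${{ github.repository }}/my-image:${{ github.sha }} .
      docker push docker.pkg.github.com/${{ github.repository }}/my-image:${{ github.sha }}
```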

@yanjingzhu that still has the same problem I mentioned in the original post.

Sorry to say that there is currently no other method to pass a Docker image to the next job in a workflow.

I think this is a great feature request. It is a common pattern to build a Docker image and then run tests in that image as a separate and distinct job step. Other Git hosting services have this feature in their CI/CD pipelines.

26 Likes

Can you use steps instead of jobs? Steps can reuse artifacts from previous steps, and it sounds like the OP doesn’t have any need for running jobs in parallel; it’s a sequential workflow.

it sounds like the OP doesn’t have any need for running jobs in parallel, it’s a sequential workflow.

It’s not a sequential workflow since jobs 1, 2, and 3 run in parallel and job 4 waits for all of them to pass.

5 Likes

Yes, we should be able to build an image and then use it in subsequent steps; that’s vital to automating a CI workflow.

7 Likes

+1

Need this

7 Likes

Kind of shocked that this doesn’t exist. I mean seriously? This is kind of a hallmark of any good CI/CD system. Why would you go and develop GitHub Actions in a box like this without borrowing from the many examples out there of modern CI/CD systems?

3 Likes

Yes this feature is critical in a CI framework, some support for it would be very helpful!

+1 on criticality for container-based workflows (if the container is my artifact, then “just rebuild the container in a later job/step” means you’re not really testing the built artifact, just something that is probably identical).

@sean-krail As a workaround, you could build the image, tag it with a unique ID, push it to a registry, save the ID as an artifact, pass the artifact to subsequent jobs, and pull the image using the ID passed along. It’s a bit clunky, but it should work.
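A sketch of that workaround (the registry, image name, and artifact name are placeholders; it assumes registry credentials are already configured):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build, tag with a unique ID, and push
        run: |
          IMAGE=myuser/myapp:${{ github.sha }}
          docker build -t "$IMAGE" .
          docker push "$IMAGE"
          # Record the exact tag so later jobs pull the same image
          echo "$IMAGE" > image-id.txt
      - uses: actions/upload-artifact@v2
        with:
          name: image-id
          path: image-id.txt

  smoke-test:
    needs: [build]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: image-id
      - name: Pull the exact image built earlier
        run: docker pull "$(cat image-id.txt)"
```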

I came up with the following solution.

I run a registry instance in the build job of my GitHub Workflow and then share the volume of this registry as an artifact.

That way I can download the artifact in other jobs and rerun the registry for pulling the image.

You can find a full example here: https://github.com/cdalvaro/docker-salt-master/blob/1b862039dc0acd709e972d3619c57b9ef614d6b5/.github/workflows/build-and-test.yml

The ideal solution would be to have a registry service shared for all jobs instead of having it restricted to a specific step.

You could just use `docker save` and `docker load` to export the image as a tar archive and pass it like any other artifact: https://docs.github.com/en/free-pro-team@latest/actions/guides/storing-workflow-data-as-artifacts
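A sketch of the save/load approach (image tag, artifact name, and job names are illustrative):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build and export the image to a tarball
        run: |
          docker build -t myapp:ci .
          docker save myapp:ci -o myapp.tar
      - uses: actions/upload-artifact@v2
        with:
          name: myapp-image
          path: myapp.tar

  test:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: myapp-image
      - name: Load the image and run smoke tests against it
        run: |
          docker load -i myapp.tar
          docker run --rm myapp:ci
```

Note that uploading and downloading a large tarball can itself be slow, which matches the earlier report that pulling from a registry was faster in practice.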

I tried that approach, but it didn’t work for me because with buildx the outputs option can’t export a multi-platform image. The only way was to push it to a registry.

Currently facing the same issue, and right now Docker is unusable on public repos. There is no easy way to pass images between jobs outside of artifacts, which means we can’t use the jobs.<job_id>.steps.uses syntax but have to run docker commands in the steps’ shell. For security reasons, PRs opened from forks don’t receive any secrets, meaning you can’t use a registry either.

I think being able to push images to GitHub Packages from workflows without requiring tokens or secrets might help a little.