Caching files between GitHub Action executions

That’s actually something Docker BuildKit allows, but very few Docker registries support it just yet.
And the version of Docker on the hosted runners is rather old, too.
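
For context, the BuildKit feature in question is external cache export/import. Assuming a Docker new enough to ship buildx, and a registry that accepts cache manifests, the invocation looks roughly like this (a sketch only; registry.example.com/app is a placeholder image name):

# Sketch: export/import the BuildKit layer cache via a registry tag.
# Requires buildx and a registry that supports cache manifests.
- run: |
    docker buildx build \
      --cache-from type=registry,ref=registry.example.com/app:cache \
      --cache-to type=registry,ref=registry.example.com/app:cache,mode=max \
      --push -t registry.example.com/app:latest .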


@duncan3dc, you have probably already figured this out since you posted, but I’m leaving this here for anyone else who happens on this discussion.

Both the upload and download of artifacts work with whole directories. For instance, adding a second file to the upload-artifact example as…

- uses: actions/checkout@v1

- run: mkdir -p path/to/artifact

- run: echo hello > path/to/artifact/world_1.txt
- run: echo hello > path/to/artifact/world_2.txt

- uses: actions/upload-artifact@master
  with:
    name: my-artifact
    path: path/to/artifact

…does upload the whole directory with both files as a single artifact.

I can also verify that you cannot upload /var/lib/docker, as you will receive an access-denied error when trying to do so.


Regardless, the upload-artifact and download-artifact actions don’t satisfy the original request, as artifacts do not persist from one workflow execution to the next.


Is the artifact unique per branch and PR, as it is with Travis caches? The Travis CI caching documentation says:

  • Travis CI fetches the cache for every build, including branches and pull requests.
  • If a branch does not have its own cache, Travis CI fetches the cache of the repository’s default branch.
  • There is one cache per branch and language version / compiler version / JDK version / Gemfile location / etc.
  • Only modifications made to the cached directories from normal pushes are stored.

Based on that terse conversation, it looks like download-artifact is not a solution to caching across invocations of workflows.

My use case is a project using Go, where I would like to preserve GOCACHE from one workflow invocation to another. This particular project takes about 11 minutes to run go test -race ./.... If it could use the build cache from a previous run, I would expect it to take less than a minute, as many of the test results would be cached, and most of the compilation results would be, too.

Not being able to transfer GOCACHE between runs is a significant hindrance to adopting GitHub Actions.
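
For what it’s worth, once the official cache action is available, I’d expect wiring up GOCACHE to look roughly like this (a sketch using the actions/cache inputs path, key, and restore-keys; ~/.cache/go-build is Go’s default build cache location on Linux runners):

- uses: actions/checkout@v1

# Sketch: restore/save the Go build cache across workflow runs,
# keyed on the module checksums so stale caches fall back gracefully.
- uses: actions/cache@v1
  with:
    path: ~/.cache/go-build
    key: ${{ runner.os }}-go-build-${{ hashFiles('**/go.sum') }}
    restore-keys: |
      ${{ runner.os }}-go-build-

- run: go test -race ./...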


We appreciate the feedback; it’s clear to us that this is necessary. We’re working on caching packages and artifacts between workflow executions, and we expect to have it by mid-November.


That’s great to hear! We weren’t comfortable moving over from Circle without this.

Looking forward to it.


Awesome! Thanks :slight_smile:


We are also waiting on this to move over completely. <3


Glad you guys are working on this as well. I use GitHub Actions to build some intermediary packages, and caching is the only way to do it, since artifacts won’t really cut it for large objects; it’d take too long to upload them.


Looking forward to it! My use case:

On a macOS instance, cache some global Node dependencies: expo-cli, react-native, and @sentry/cli. Possibly also some CocoaPods.


That’s cool, man. Just got accepted into the beta and really loving it!

When this is live I might get rid of CircleCI altogether :smile:!


For those who can’t wait for the official cache solution, implementing one is not that hard.

You need external storage. I used an S3 bucket for that.

Do an aws s3 sync <remote> <local> before the build, then an aws s3 sync <local> <remote> after the build.

The build must be smart enough to use the files you download from the remote location.

It’s working well for me.
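
In workflow terms, the whole thing is just two extra steps around the build (a rough sketch; my-cache-bucket, .cache/, and build.sh are placeholders, and AWS credentials are assumed to be stored as repository secrets):

- uses: actions/checkout@v1

# Sketch: restore a hand-rolled cache from S3 before the build...
- run: aws s3 sync s3://my-cache-bucket/${{ github.repository }} .cache/
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

# ...run a build that knows to read from and write to .cache/...
- run: ./build.sh

# ...and sync the updated cache back to S3 afterwards.
- run: aws s3 sync .cache/ s3://my-cache-bucket/${{ github.repository }}
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}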


Any updates on this? Will this be part of the Nov 13 release?
Eagerly waiting for this.
Artifact uploads are also failing for me with an error.

Now that there is a release date for the General Availability of GitHub Actions, can we expect caching support to follow soon after, rather than being delayed?

I know that, in purist terms, a truly isolated environment for running tests requires not sharing dependencies between builds or even between jobs, but for private projects it would be a shame to waste billable hosted-runner minutes on repeatedly downloading the same files over and over again.




Individual caches are limited to 200MB

So you can cache about a third of a typical node_modules?


Can the new cache action be used to cache Docker images, or will that be separate?


Not unless they’re under 200 MB :frowning:

The cache limit is for the compressed tar.gz, so for example a 766 MB node_modules caches fine.
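
For anyone wiring this up for node_modules, a lockfile-keyed cache looks roughly like this (a sketch; the key scheme is one common convention, and npm install is used deliberately, since npm ci deletes node_modules before installing and would defeat the cache):

- uses: actions/checkout@v1

# Sketch: cache node_modules keyed on the lockfile hash.
- uses: actions/cache@v1
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-

# npm install (not npm ci) so a restored node_modules is reused.
- run: npm install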