Caching files between GitHub Action executions

Based on the terse conversation at https://github.com/actions/download-artifact/issues/3, it looks like download-artifact is not a solution for caching across workflow invocations.

My use case is a Go project where I would like to preserve `GOCACHE` from one workflow invocation to the next. This particular project takes about 11 minutes to run `go test -race ./...`. If it could use the build cache from a previous run, I would expect it to take less than a minute, as many of the test results would be cached and most of the compilation results would be too.

Not being able to transfer `GOCACHE` between runs is a significant hindrance to adopting GitHub Actions.
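
For reference, a minimal sketch of what this use case looks like with the actions/cache action that eventually shipped (see further down the thread); the cache path assumes the default `GOCACHE` location on a Linux runner, and the key scheme is just one reasonable choice:

```yaml
steps:
  - uses: actions/checkout@v1
  # Restore the Go build cache from a previous run; ~/.cache/go-build
  # is the default GOCACHE location on Linux.
  - uses: actions/cache@v1
    with:
      path: ~/.cache/go-build
      key: ${{ runner.os }}-go-build-${{ hashFiles('**/go.sum') }}
      restore-keys: |
        ${{ runner.os }}-go-build-
  # With a warm cache, most compilation and test results are reused.
  - run: go test -race ./...
```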

2 Likes

We appreciate the feedback; it’s clear to us that this is necessary. We’re working on caching packages and artifacts between workflow executions, and we expect to have it by mid-November.

193 Likes

That’s great to hear! We weren’t comfortable moving over from Circle without this.

Looking forward to it.

8 Likes

Awesome! Thanks :slight_smile:

1 Like

We are also waiting on this before we move over completely. <3

2 Likes

Glad you’re working on this as well. I use GitHub Actions to build some intermediate packages, and caching is the only way to do it, since artifacts won’t really cut it for large objects; they’d take too long to upload.

2 Likes

Looking forward to it! My use case:

On a macOS instance, cache some global Node dependencies: expo-cli, react-native, and @sentry/cli. Possibly also some CocoaPods.

1 Like

That’s cool, man. I just got accepted into the beta and I’m really loving it!

When this is live I might get rid of CircleCI altogether :smile:!

2 Likes

For those who can’t wait for the official cache solution, implementing one is not that hard.

You need external storage. I used an S3 bucket for that.

Run `aws s3 sync <remote> <local>` before the build, then `aws s3 sync <local> <remote>` after the build.

The build must be smart enough to use the files you download from the remote location.

It’s working well for me.
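
A minimal sketch of that approach as workflow steps; the bucket name, cache path, and credential secret names below are placeholders, and it assumes the AWS CLI is available on the runner (it is preinstalled on the hosted runners):

```yaml
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      # Placeholder secret names; store your own credentials in repo secrets.
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_DEFAULT_REGION: us-east-1  # placeholder region
    steps:
      - uses: actions/checkout@v1
      # Pull the previous run's cache down before building.
      - run: aws s3 sync s3://my-cache-bucket/go-build ~/.cache/go-build
      # The build must pick up the restored files (e.g. Go reads GOCACHE).
      - run: go test -race ./...
      # Push the updated cache back up for the next run.
      - run: aws s3 sync ~/.cache/go-build s3://my-cache-bucket/go-build
```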

6 Likes

Any updates on this? Will this be part of the Nov 13 release?
Eagerly waiting for this.
Uploading artifacts is failing for me with this error

Now that there is a release date for the General Availability of GitHub Actions, can we expect caching support to follow soon after, without delay?

I know that, strictly speaking, a truly isolated environment for running tests requires not sharing dependencies per build or even per job, but for private projects it would be a shame to waste billable hosted-runner minutes repeatedly downloading the same files.

4 Likes

Yay! https://github.com/actions/cache

8 Likes

Individual caches are limited to 200MB

So you can cache about a third of a typical `node_modules`?

5 Likes

Can the new cache action be used to cache Docker images, or will that be separate?

1 Like

Not unless they’re under 200 MB :frowning:

The cache limit applies to the compressed tar.gz, so, for example, a 766 MB `node_modules` caches fine.

See https://github.com/actions/cache/issues/6#issuecomment-548535174.
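
If you want to check whether a directory will fit before relying on it, you can measure the compressed size directly; a quick sketch (the directory name is just an example):

```yaml
# Prints the gzipped size in bytes; under ~200 MB should fit the limit.
- name: Measure compressed cache size
  run: tar -czf - node_modules | wc -c
```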

@hugovk wrote:
The cache limit applies to the compressed tar.gz, so, for example, a 766 MB `node_modules` caches fine.

Doesn’t work for us; uncompressed it’s 587 MB:

##[warning]Cache size of 235598459 bytes is over the 200MB limit, not saving cache

Follow-up: we’ve shipped caching. Please see the cache action for information and documentation.
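
For a quick start, here is a minimal example along the lines of the action’s README, caching npm’s cache directory keyed on the lockfile (adjust `path` and `key` for your ecosystem):

```yaml
- uses: actions/cache@v1
  with:
    # Cache npm's download cache, keyed on the lockfile contents.
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    # Fall back to any older cache for this OS if no exact match exists.
    restore-keys: |
      ${{ runner.os }}-node-
```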

6 Likes

We do multi-platform C++ builds that depend on Qt and a variety of other packages fetched via vcpkg. The Windows dependencies alone are over 5 GB. Adding Android and Mac builds on top of that means the 2 GB per-repository limit is completely useless for us.

3 Likes

Exactly.

It would also be nice to have a simple ‘incremental’ workflow switch to get fast incremental builds.

1 Like