Caching files between GitHub Actions runs

Any updates on this? Will this be part of the Nov 13 release? Eagerly waiting for this.
Uploading artifacts is failing for me with an error.

Now that there is a release date for the General Availability of GitHub Actions, can we expect caching support to follow soon after, or will it be delayed?

I know that, strictly speaking, a truly isolated test environment requires not sharing dependencies between builds or even between jobs, but for private projects it would be a shame to waste billable hosted-runner minutes downloading the same files over and over again.

4 Likes

Yay! https://github.com/actions/cache
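
For anyone finding this thread later, here's a minimal usage sketch based on the action's README (the path and key names are illustrative, not prescriptive):

```yaml
steps:
  # Restore (and later save) pip's download cache, keyed on the lockfile.
  # If the exact key misses, restore-keys falls back to the most recent
  # cache whose key matches the prefix.
  - uses: actions/cache@v1
    with:
      path: ~/.cache/pip
      key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
      restore-keys: |
        ${{ runner.os }}-pip-
```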

8 Likes

Individual caches are limited to 200 MB

So you can cache about a third of a typical node_modules?

5 Likes

Can the new cache action be used to cache Docker images, or will that be separate?

1 Like

Not unless they’re under 200 MB :frowning:
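
One workaround people use is to `docker save` the image into a cached directory and `docker load` it on later runs. This is a sketch, not an official feature; the image name and paths are hypothetical, and the saved tarball is still subject to the compressed-size limit:

```yaml
steps:
  # Cache a directory holding `docker save` output; load it on a cache
  # hit, rebuild and re-save it on a miss.
  - uses: actions/cache@v1
    id: docker-cache
    with:
      path: ~/docker-cache
      key: ${{ runner.os }}-docker-myimage
  - name: Load cached image
    if: steps.docker-cache.outputs.cache-hit == 'true'
    run: docker load -i ~/docker-cache/myimage.tar
  - name: Build and save image
    if: steps.docker-cache.outputs.cache-hit != 'true'
    run: |
      docker build -t myimage .
      mkdir -p ~/docker-cache
      docker save myimage -o ~/docker-cache/myimage.tar
```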

The cache limit is for the compressed tar.gz, so for example a 766 MB node_modules caches fine.

See https://github.com/actions/cache/issues/6#issuecomment-548535174.
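
For context, caching node_modules directly looks roughly like this (a sketch; the path and key names are illustrative):

```yaml
steps:
  # The directory is tarred and gzipped before the size limit is checked,
  # which is why a 766 MB node_modules can fit under 200 MB compressed.
  - uses: actions/cache@v1
    with:
      path: node_modules
      key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
      restore-keys: |
        ${{ runner.os }}-node-
```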

@hugovk wrote:
The cache limit is for the compressed tar.gz, so for example a 766 MB node_modules caches fine.

Doesn’t work for us: 587 MB uncompressed still comes out to roughly 235 MB compressed, over the limit:

##[warning]Cache size of 235598459 bytes is over the 200MB limit, not saving cache

Follow-up: we’ve shipped caching. Please see the cache action for information and documentation.

6 Likes

We do multi-platform C++ builds that depend on Qt and a variety of other packages fetched via vcpkg. The Windows dependencies alone are over 5 GB. Adding Android and macOS builds on top of that means the 2 GB per-repository limit is completely useless for us.
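
To make the arithmetic concrete: every platform needs its own cache entry, and all of them share the single per-repository quota. A per-OS key like this hypothetical sketch (the dependency-list filename and install path are assumptions) multiplies entries quickly:

```yaml
steps:
  # One cache entry per OS: each platform produces a distinct key, and
  # every entry counts against the same 2 GB per-repository quota.
  - uses: actions/cache@v1
    with:
      path: vcpkg/installed
      key: ${{ runner.os }}-vcpkg-${{ hashFiles('vcpkg-deps.txt') }}
```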

3 Likes

Exactly.

It would also be nice to have a simple ‘incremental’ workflow switch to get fast incremental builds.
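
In the meantime, one rough approximation of incremental builds is persisting a compiler cache across runs. This ccache sketch is an assumed setup, not a built-in switch:

```yaml
steps:
  # Persist ccache's directory between runs so unchanged translation
  # units are served from cache instead of being recompiled. The
  # restore-keys fallback seeds each run from the most recent cache.
  - uses: actions/cache@v1
    with:
      path: ~/.ccache
      key: ${{ runner.os }}-ccache-${{ github.sha }}
      restore-keys: |
        ${{ runner.os }}-ccache-
  - name: Build with ccache
    run: |
      export CCACHE_DIR=~/.ccache
      cmake -S . -B build -DCMAKE_CXX_COMPILER_LAUNCHER=ccache
      cmake --build build
```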

1 Like