How could I cache downloaded files (or complete directories) between multiple runs of a GitHub action?
I'm looking for something similar to what other CI systems such as Travis CI and Circle CI offer in order to save some time when downloading The Internet™ via Maven.
This would be particularly useful for caching dependency directories such as `node_modules` (npm) and `.m2` (Maven). It would shave a dozen minutes off my build.
This would be good for caching pip downloads in Python, too. Travis CI, for example, has built-in support for this.
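For reference, Travis CI's built-in pip cache is enabled with a single line in `.travis.yml` (this snippet is illustrative and not from this thread):

```yaml
# .travis.yml — enable Travis CI's built-in pip download cache
language: python
cache: pip
```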
Thank you for being here! We recommend using artifacts for this specific case. Artifacts are the files created when you build and test your code. For example, artifacts might include binary or package files, test results, screenshots, or log files. Artifacts are associated with the workflow run where they were created and can be used by another job or deployed.
Our team created an action for uploading artifacts from your workflow, `actions/upload-artifact`, and one for downloading them as well, `actions/download-artifact`.
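A minimal sketch of retrieving an uploaded artifact in a later job of the same workflow (the job name `build` and the artifact name `my-artifact` are illustrative assumptions):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    needs: build  # assumes an earlier job named "build" uploaded the artifact
    steps:
      - uses: actions/download-artifact@master
        with:
          name: my-artifact
      # the files are extracted into a directory named after the artifact
      - run: ls my-artifact
```

Note that this only moves files between jobs of the same workflow run; it is not a cache across runs.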
If you have any specific questions about either of those actions, we ask that you open an issue in the respective repository as our Actions engineering team monitors both repositories.
I think the point of caching is to avoid downloading at all. Downloading an artifact doesn't offer much of an advantage over installing the dependencies directly, and it needs an additional upload step. So artifacts can't speed up dependency installation by a significant margin.
That sounds fine for specific files, but what about Docker cache? Are you suggesting we can upload/download the entire /var/lib/docker directory?
@duncan3dc, you have probably already found this out since you posted but leaving for anyone else who might happen on this discussion.
Both the upload and download of artifacts do work with whole directories. For instance, adding a second file in the upload-artifact example as...
steps:
  - uses: actions/checkout@v1
  - run: mkdir -p path/to/artifact
  - run: echo hello > path/to/artifact/world_1.txt
  - run: echo hello > path/to/artifact/world_2.txt
  - uses: actions/upload-artifact@master
    with:
      name: my-artifact
      path: path/to/artifact
...does upload the whole directory with both files as a single artifact.
I can also verify that you cannot upload /var/lib/docker, as you will receive an "access denied" error when trying to do so.
Regardless, the upload-artifact and download-artifact actions don't satisfy the original request, as artifacts do not persist from one workflow run to the next.
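For the cross-run caching the original question asks about, the dedicated `actions/cache` action addresses this use case directly. A minimal sketch for the Maven `.m2` case (the cache key scheme here is an illustrative assumption, not from this thread):

```yaml
steps:
  - uses: actions/checkout@v2
  # restore ~/.m2/repository from a previous run, keyed on the POM contents
  - uses: actions/cache@v2
    with:
      path: ~/.m2/repository
      key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
      restore-keys: |
        ${{ runner.os }}-maven-
  # Maven reads from (and writes to) the cached local repository
  - run: mvn --batch-mode package
```

On a cache hit, Maven skips re-downloading The Internet™; on a miss, the post-job step saves the repository for the next run.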