Do jobs share artifacts?

Hi,

I have to set up a workflow that uses the macOS VM to install some binaries that are used to compute and generate some files for the remaining jobs.

Are the generated files/artifacts available to the jobs that follow?

Use-cases:

job1 and job2 run on “macos-latest”. In job1, a particular program is installed, for example “rust”. job2 then runs rust to compile a program whose source is cloned from a repository. job1 and job2 are separate because job1 can be cached (it doesn’t change much), while job2 pulls from a repository that changes constantly.

job3 runs on “ubuntu” or a custom “docker container”, but needs the files generated by the program in job2 (not the compiled binary, but let’s say some generated source files).
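
Roughly, the layout I have in mind is something like the sketch below (job names, the install script, and paths are just placeholders, and whether installed tools and files carry over between jobs like this is exactly what I’m asking):

```yaml
jobs:
  job1:
    runs-on: macos-latest
    steps:
      - name: Install toolchain (e.g. rust)
        run: ./scripts/install-rust.sh     # placeholder; rarely changes, could be cached

  job2:
    runs-on: macos-latest
    needs: job1
    steps:
      - uses: actions/checkout@v2          # frequently-changing source
      - name: Compile and generate files
        run: cargo build --release         # produces some generated source files

  job3:
    runs-on: ubuntu-latest
    needs: job2
    steps:
      - name: Consume the generated files
        run: ls generated/                 # needs the files from job2, not the binary
```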

I’m currently waiting for a workflow to complete, so hopefully I’ll see the output and this will work as expected; otherwise, please let me know!

Thank you!

The file system gets wiped (you get a fresh VM for each job), but uploaded artifacts are available as long as the jobs are in one workflow and have appropriate needs relationships. The documentation has an example of doing just that here: Passing data between jobs in a workflow
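
In outline, the pattern from that doc looks like this (artifact name and paths here are just placeholders):

```yaml
jobs:
  job2:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v2
      - name: Generate files
        run: cargo build --release          # or whatever produces the files
      # Upload the generated files so later jobs in this workflow can fetch them
      - uses: actions/upload-artifact@v2
        with:
          name: generated-files
          path: generated/

  job3:
    runs-on: ubuntu-latest
    needs: job2                             # ensures job2 finishes (and uploads) first
    steps:
      # Download the artifact into this job's fresh VM
      - uses: actions/download-artifact@v2
        with:
          name: generated-files
          path: generated/
      - run: ls generated/
```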


Thanks! I ended up finding out about the download and upload actions.

I’m wondering what to do with some binaries; I’ll put that in a different thread.
