How to synchronize workflows or use artifacts across workflows?


I would like to start a workflow when all artifacts from a previous workflow have been published and are publicly available. This looks simple but I have not found any proper solution so far.

The first workflow is a scheduled one, the nightly builds of a project. The second workflow sends a trigger to a server which pulls the artifacts of the first workflow and publishes them on its Web site. Simple request: rebuild a project every night and publish the binaries.

I tried several approaches without success. The current setup is very conservative: schedule the “publish” workflow three hours after the “build” workflow, hoping that the “build” phase would complete in the meantime.

It worked until the last few days, when the “build” workflow was started up to seven hours behind schedule while the “publish” one was only four hours behind schedule. Since the “publish” workflow was actually launched before the “build” one, nothing was published, of course. This happened two days in a row.

Scheduled times:

  • Nightly build workflow (3 jobs): 01:10 AM
  • Publish artifacts: 04:10 AM

Sample start times on May 31st 2021:

  • Nightly build workflow:
    • Windows build job: 08:25 AM
    • Linux build job: 06:35 AM
    • Documentation build job: 06:28 AM
  • Publish artifacts: 07:59 AM

I have no objection to workflows being started late, or even very late. Resources are limited, workflows are queued and the service is free anyway. But I do need a deterministic way to serialize the “build” and “publish” phases.

Initially, I wanted to trigger the publication in the same workflow as the build, after the upload-artifact actions, as a last job depending on all others. But the artifact is actually published only after the completion of the whole workflow (as reported by several users here). So the trigger would occur before the publication of the artifact.

Then, I looked for a way to explicitly synchronize or serialize workflows and found none.

Finally, I used the conservative method, which consists of scheduling the two workflows independently with a sufficiently long delay between them. But this method is not deterministic and no longer works due to unexpectedly huge delays in workflow scheduling.

So, how would you start a job after the actual publication of the artifacts of a workflow, in a deterministic way? This could be done in the same workflow or in two different workflows, I do not really care. I just need to safely download artifacts, when they are ready, from a public server.

Using two jobs in one workflow with a needs relationship between them should do it; in that case the second job will only run after the first completes.

One possible catch concerns the external service: if it takes some time after a successful upload for the artifact to become available, the second job could start before that time and fail. If that’s the case, you should wait for the artifact to become available at the end of the first job (or in an in-between job, if you prefer).
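As a minimal sketch of that needs dependency (the job names, build command and trigger URL are made up for illustration):

```yaml
name: Build and publish
on:
  schedule:
    - cron: '40 0 * * *'
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: make all                     # hypothetical build step
      - uses: actions/upload-artifact@v2
        with:
          name: binaries
          path: build/
  notify:
    needs: build                          # runs only after "build" succeeds
    runs-on: ubuntu-latest
    steps:
      - run: curl -fsS https://example.com/nightly-hook   # hypothetical trigger
```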


No, it does not work. The external URLs of the artifacts become available only when the workflow terminates. So, even if all upload-artifact steps are completed, the final job in the same workflow will send a REST API call to a server; this server will use the GitHub API to query the artifacts of the latest run of the workflow (still in progress) and will find nothing, precisely because the workflow is still in progress. Waiting as long as you want in the last job won’t change anything; it will just delay the availability of the external URL of the artifact by the same amount of time.

This is precisely my problem: GitHub Actions publishes the external URLs of the artifacts only after completion of the entire workflow, all jobs included. What I need is a deterministic way to trigger another workflow after the availability of the external URLs of the artifacts.

Having a way to synchronize another workflow with the first one could be a solution (if there is a way to do so), but any other solution would be fine.


So you’re not actually uploading/publishing the artifact to the external server, but you want the external server to download the artifact? :thinking:

Maybe configuring a webhook to the server would work?


Yes, this is why I wrote “sends a trigger to a server which pulls the artifacts” in the initial problem description. For security reasons, the external server will not accept incoming binaries. The server knows where to look for the binaries it expects (the artifacts of a specific workflow of a specific repo on GitHub but nothing else). It just needs to be notified of their availability.

Thanks for the idea. But the documentation is not very clear. The workflow_run webhook seems interesting, but its description is really vague.

It is amazing that the system proposes external URLs for artifacts and no way to synchronize the use of those URLs.


It is not possible to set a workflow_run webhook in the GitHub UI. In the Settings / Webhooks page, under the option “Let me select individual events”, there is no workflow_run event.

However, I found that a workflow can be triggered on a workflow_run condition.

Main workflow:

name: Nightly build
on:
  schedule:
    - cron: '40 0 * * *'

Second workflow:

name: Nightly build update
on:
  workflow_run:
    workflows:
      - Nightly build
    types:
      - completed

The two workflows ran sequentially last night without problems (delayed by several hours again, but serialized this time).

I simply hope that the availability of the external URLs of the artifacts is not “too asynchronous”, meaning they are sometimes not fully available yet when the second workflow starts. Time will tell…


Well, time has told… It does not work in a reliable fashion. In some runs, the artifacts are not found. Most probably, the asynchronous publication of the artifacts’ public URLs has not completed yet. I will try with a several-minute wait in the second job. Adding a sleep is probably the most horrible hack in computing: it does not work, you don’t know why, you have no solution, you just wait and pray for the problem to happen less frequently…
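Instead of a blind sleep, a bounded polling loop in the second workflow might degrade more gracefully: poll the GitHub REST API until the triggering run’s artifacts are actually listed, then proceed. A sketch (the step name, retry count and interval are arbitrary choices; it assumes the gh CLI preinstalled on GitHub-hosted runners):

```yaml
jobs:
  notify:
    runs-on: ubuntu-latest
    steps:
      - name: Wait until artifacts are listed
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          RUN_ID: ${{ github.event.workflow_run.id }}
        run: |
          # Poll the "list workflow run artifacts" endpoint, up to 30 times.
          for i in $(seq 1 30); do
            count=$(gh api "repos/${{ github.repository }}/actions/runs/$RUN_ID/artifacts" \
                    --jq '.total_count')
            if [ "$count" -gt 0 ]; then
              echo "Found $count artifact(s), proceeding"
              exit 0
            fi
            echo "No artifacts listed yet, retrying in 30 s..."
            sleep 30
          done
          echo "Timed out waiting for artifacts" >&2
          exit 1
```

At least a poll-with-timeout fails loudly when the artifacts never appear, instead of silently racing a fixed delay.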