I have what I feel is a fairly simple situation where I would like to cache my Python dependencies within my `venv/` directory. I might look into `actions/cache@v2` for the `~/.cache/pip` dependencies, but for now, all I care about is caching everything in `venv/`:
```yaml
---
name: deploy

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: 3.9.5
      - run: pip install virtualenv
      - uses: actions/download-artifact@v2
        with:
          name: pip
      - run: virtualenv venv/
      - run: source venv/bin/activate
      - run: pip install -r requirements.txt
      - uses: actions/upload-artifact@v2
        with:
          name: pip
          path: venv/
      - run: ansible --version
      - run: ansible-galaxy install -f -p .ansible/galaxy-roles -r requirements.yml
      - run: ansible-playbook -u ansible -C playbooks/configure.yml
        env:
          ANSIBLE_FORCE_COLOR: "true"
```
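(For reference, my understanding is that the `actions/cache@v2` route I mentioned above would look something like the sketch below, using its documented `path`/`key`/`restore-keys` inputs; the key format is just an illustration, not something I've settled on.)

```yaml
- uses: actions/cache@v2
  with:
    path: ~/.cache/pip
    # rebuild the cache whenever requirements.txt changes
    key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
    restore-keys: |
      ${{ runner.os }}-pip-
```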
In a nutshell, I want to download the `venv` folder if it exists, run `pip install -r requirements.txt` to install/update dependencies if necessary, store the `venv` folder as an artifact, and then continue on to the rest of my work.
I’ll eventually be splitting this up into its own job (maybe I’ll call it `prepare`) that runs before everything else so it can be decoupled, but that’s not urgent right now.
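(The eventual shape I have in mind would use the documented `needs` key so the build waits on the preparation job; just a sketch, the job names are mine:)

```yaml
jobs:
  prepare:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # ... restore/install/upload the venv here
  build:
    needs: prepare  # build only starts once prepare has finished
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # ... download the venv artifact and run the ansible steps here
```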
When I run `actions/download-artifact@v2`, I get an error that the artifact does not exist yet:

```
Error: Unable to find any artifacts for the associated workflow
```
It seems rather arbitrary to have to run my workflow once with `actions/download-artifact@v2` commented out, and then again with it uncommented, just to get the cache populated. And if this build does not run for 90 days, it will presumably fail anyway as the artifacts expire.
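One workaround I’m considering is marking the download step with `continue-on-error`, which is a documented step-level key, plus an `id` so later steps can branch on its `outcome` via the `steps` context. But swallowing *every* failure, not just "artifact missing", feels wrong:

```yaml
- uses: actions/download-artifact@v2
  id: restore-venv
  continue-on-error: true  # don't fail the job when the artifact doesn't exist yet
  with:
    name: pip
# later steps can check whether the restore actually succeeded
- run: echo "venv artifact was restored"
  if: steps.restore-venv.outcome == 'success'
```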
I am new to GitHub Actions: what is the design pattern for doing this? I feel like I’m trying to do something fairly simple but am running into a roadblock, and there’s no `if-no-files-found` directive on `actions/download-artifact@v2` the way there is on `actions/upload-artifact@v2`.
How do I try to restore an artifact and ignore errors if it does not already exist?
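If it helps clarify the behaviour I’m after: `actions/cache@v2` seems to get this right out of the box, since a cache miss is not an error and its documented `cache-hit` output lets you skip the install step. What I want is essentially the artifact equivalent of this sketch (the `id` and key format are mine):

```yaml
- uses: actions/cache@v2
  id: venv-cache
  with:
    path: venv/
    key: ${{ runner.os }}-venv-${{ hashFiles('requirements.txt') }}
# only rebuild the virtualenv when there was no exact cache hit
- run: |
    virtualenv venv/
    source venv/bin/activate
    pip install -r requirements.txt
  if: steps.venv-cache.outputs.cache-hit != 'true'
```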