Uploading an artifact directory is very slow

If the directory is archived into a tar file first, will the upload be faster?

Suggestion: actions/upload-artifact should archive directories automatically, and actions/download-artifact should support decompressing them.

At first the answer seems to be no, it will not be faster, because upload-artifact already compresses artifacts:

GZip is used internally to compress individual files before starting an upload.

Source: actions/toolkit, additional-information.md at 634dc61da2332f1062d1fe800ccc7fe51d3c67bf

But:

The size of the artifact is denoted in bytes. The displayed artifact size denotes the raw uploaded artifact size (the sum of all the individual files uploaded during the workflow run for the artifact), not the compressed size.

Source: actions/upload-artifact (GitHub)

So the answer is yes, archiving the directory first does make the upload faster:

As an example, imagine an artifact with 1000 files (each 10 Kb in size). Without any modification, there would be around 1000 HTTP calls made to upload the artifact. If you zip or archive the artifact beforehand, the number of HTTP calls can be dropped to single digit territory. Measures like this will significantly speed up your upload.

But you need to archive it manually:

  - name: 'Tar files'
    run: tar -cvf my_files.tar /path/to/my/directory

  - name: 'Upload Artifact'
    uses: actions/upload-artifact@v2
    with:
      name: my-artifact
      path: my_files.tar    
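
For completeness, here is a minimal sketch of the download side, assuming actions/download-artifact@v2 and the same placeholder names (my-artifact, my_files.tar) as above. As far as I know, download-artifact does not unpack the tar for you, so the extraction step is also manual:

  - name: 'Download Artifact'
    uses: actions/download-artifact@v2
    with:
      # with a name given, v2 downloads the artifact contents into the working directory
      name: my-artifact

  - name: 'Extract files'
    run: tar -xvf my_files.tar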

Good point, it says “GZip is used internally to compress individual files”, so it doesn’t reduce the number of files but only their size. Compressing all files together not only reduces the number of files but may also improve the compression ratio if they have similar content, as the compression dictionary can be re-used.
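
As a sketch of that idea, the tar step above could also compress the archive (same placeholder path), so repeated content across files ends up in one shared gzip stream:

  - name: 'Tar and compress files'
    # -z gzips the whole archive at once instead of each file separately
    run: tar -czvf my_files.tar.gz /path/to/my/directory

The path: in the upload step would then point at my_files.tar.gz instead of my_files.tar.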

Hi,

Could you please describe this process in more detail?

Where would I type this, in which file?

What is the actual path you are referring to?

How would an Azure App Service handle this file after it is uploaded?

Thanks!