download-artifact's Issues

download artifact matching pattern

Currently it is only possible to download all artifacts or a single artifact by name.

It would be useful to add the ability to download several artifacts whose names match a given pattern.
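A hypothetical sketch of what such an input could look like (the `pattern` input name is an assumption for illustration, not an existing feature of this action):

```yaml
# Hypothetical: a "pattern" input does not exist in this action today
- uses: actions/download-artifact@v2
  with:
    pattern: build-*   # download every artifact whose name matches the glob
```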

Backward compatibility with v1.0.0 - artifact extracted to the current working dir and folder name is not preserved.

I used the artifacts with my Firebase deployment project, which expects the dist folder (configured in the public section of the Firebase config) to contain all the artifacts.

With the following snippet:

name: Deploy Firebase
on:
  push:
    branches:
      - master

jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@master
      - name: Install Dependencies
        run: npm install
      - name: Build
        run: npm run build
      - name: Archive Production Artifact
        uses: actions/upload-artifact@v2
        with:
          name: dist
          path: dist
  deploy:
    name: Deploy
    needs: build
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@master
      - name: Download Artifact
        uses: actions/download-artifact@v2
        with:
          name: dist
      - name: Deploy to Firebase
        uses: w9jds/firebase-action@master
        with:
          args: deploy --only hosting:stage
        env:
          FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }}
          PROJECT_NAME: name

it doesn't work: the dist archive isn't unpacked into a dist folder when downloaded. Instead it is unpacked into the current working directory.

Rolling back to v1.0.0 fixes the problem.
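For what it's worth, v2 extracts an artifact into the current working directory unless an explicit `path` is given, so the v1 layout can apparently be restored by making the `path` match the artifact name:

```yaml
- name: Download Artifact
  uses: actions/download-artifact@v2
  with:
    name: dist
    path: dist   # extract into dist/ instead of the current working directory
```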

Feature request: Wait/block until an artifact is actually available

The idea is that this would introduce better options for parallelization inside a workflow.

If I understand right, currently only jobs can need (wait) for other jobs to finish.

If job B expects an artifact from job A, it must wait for job A to finish completely.

If job B were able to wait (block) for an artifact from job A, both jobs could run at the same time and do their work simultaneously (for example, building Docker images and other preparation, which can take quite a lot of time). Job B could then simply wait until job A uploads the expected artifact, and continue knowing it has everything it needs.

Obviously some kind of timeout configuration should be possible (perhaps mandatory), after which the step would fail if the expected artifact never appeared.

I'm just wondering whether such an idea makes sense to anyone else too? 🤔 Or is there already some way to do this kind of thing?
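In the absence of such a feature, one possible workaround is a polling step that queries the current run's artifact list through the REST API until the expected artifact appears. This is only a sketch: the artifact name my-artifact and the 10-minute budget are placeholders, and it assumes jq is available on the runner (it is preinstalled on the GitHub-hosted Ubuntu images):

```yaml
- name: Wait for artifact from the parallel job (sketch)
  run: |
    # Poll the artifact list of the current workflow run until the
    # expected artifact shows up, giving up after roughly 10 minutes.
    for i in $(seq 1 60); do
      n=$(curl -s -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
        "https://api.github.com/repos/$GITHUB_REPOSITORY/actions/runs/${{ github.run_id }}/artifacts" \
        | jq '[.artifacts[] | select(.name == "my-artifact")] | length')
      [ "$n" -gt 0 ] && exit 0
      sleep 10
    done
    echo "expected artifact never appeared" >&2
    exit 1
```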

path is required

inputs: 
  name:
    description: 'Artifact name'
    required: true
  path:
    description: 'Destination path' 

This indicates that name is required but path is not. But in my workflow:

Run actions/upload-artifact@v1
##[error]Value cannot be null.
Parameter name: path
##[error]Exit code 1 returned from process: file name '/home/runner/runners/2.158.0/bin/Runner.PluginHost', arguments 'action "GitHub.Runner.Plugins.Artifact.PublishArtifact, Runner.Plugins"'.
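The inputs shown above are download-artifact's; the error in the log, however, comes from actions/upload-artifact@v1, whose path input is required. Supplying it explicitly should clear the error (my-artifact and the path below are placeholders):

```yaml
- uses: actions/upload-artifact@v1
  with:
    name: my-artifact
    path: path/to/files   # required for upload-artifact@v1; omitting it yields "Value cannot be null"
```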

Document permissions loss on linux

I was trying to pass a compiled executable from one job to another, only to find it was no longer executable. I understand that under the hood zip files are used, but it would have been nice to have some sort of warning that file permissions are lost once files go through the upload/download cycle.
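A commonly used workaround, sketched below, is to wrap the files in a tarball before uploading, since tar records the mode bits that the zip-based artifact storage drops (my-program and artifact.tar are placeholder names):

```shell
# Create a sample executable to stand in for the compiled binary.
echo 'echo hello' > my-program
chmod +x my-program

# Before actions/upload-artifact: pack it, preserving permissions.
tar -cf artifact.tar my-program

# Simulate the fresh runner in the downstream job.
rm my-program

# After actions/download-artifact: unpack; the executable bit survives.
tar -xf artifact.tar
```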

Bug: Can't Use this Action due to "events.js" exception thrown

Hi,

I am getting the following error every time I try to download an artifact that was previously uploaded. The artifact meant to be downloaded is a directory filled with contents.

Run actions/download-artifact@v2
  with:
    name: je-build
Starting download for je-build
Directory structure has been setup for the artifact
Total number of files that will be downloaded: 40
events.js:187
      throw er; // Unhandled 'error' event
      ^

Error: EISDIR: illegal operation on a directory, open '/home/runner/work/[repo name]/[repo name]/[folder in repo]'
Emitted 'error' event on WriteStream instance at:
    at internal/fs/streams.js:294:12
    at FSReqCallback.oncomplete (fs.js:146:23) {
  errno: -21,
  code: 'EISDIR',
  syscall: 'open',
  path: '/home/runner/work/[repo name]/[repo name]/[folder in repo]'

I've omitted the names of my repo. I tried looking for the events.js file in the repo for actions/download-artifact@v2, but I couldn't find it. Does anyone know how to fix this? It appears to be a backend issue.

Kind Regards,

Rowan
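The EISDIR error suggests the download is trying to open a file at a path where a directory already exists in the checkout. Assuming that diagnosis is right, a possible workaround is to download into a fresh directory via the path input and move the files afterwards:

```yaml
- uses: actions/download-artifact@v2
  with:
    name: je-build
    path: downloaded-artifact   # an empty directory that cannot collide with checked-out files
```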

Feature request: Add a parameter to allow deletion of the downloaded artifact

Since one of the use cases for uploading/downloading an artifact is persisting workflow data, as described in the documentation, it would be great to have a parameter that allows deletion of the downloaded artifact. This is already possible via the API, as the documentation shows.
Otherwise it is easy to quickly run out of storage for Actions and Packages.
Having an additional action just to clean up seems like unnecessary overhead.
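Until such a parameter exists, a run step at the end of the workflow can delete the artifact through the REST API. This is a sketch: my-artifact is a placeholder name, and jq is assumed to be available on the runner:

```yaml
- name: Delete downloaded artifact (sketch)
  run: |
    # Find the artifact id by name, then delete it.
    id=$(curl -s -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
      "https://api.github.com/repos/$GITHUB_REPOSITORY/actions/artifacts" \
      | jq -r '[.artifacts[] | select(.name == "my-artifact")][0].id')
    curl -s -X DELETE -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
      "https://api.github.com/repos/$GITHUB_REPOSITORY/actions/artifacts/$id"
```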

Rename default branch

👋 This issue is to track the move over to using main as the default branch for this repo. We’d love your team's help in completing this transition.

Do not remove your old default branch; customers are going to be using it. We will be sending messages out about these changes, but if you want to post a message in your repository, that's fine as well.

  • Create a main branch.
  • You might need to rebase any pull requests you want to merge before changing the default branch.
  • Change the default branch in settings to main.
  • Update any documentation in this repo to refer to the new branch name, although using the version tag is still preferred.
  • Check that this Action works correctly for users who have a repository with a custom default branch name.
  • Close this issue and celebrate 🎉

We are aiming to complete this work by July 24th (originally July 17th).

The downloaded artifact should have the given ‘name’, not the original ‘path’ from actions/upload-artifact

Here’s a workflow that demonstrates what I’m talking about: https://github.com/leafac/github-actions-download-artifact--name-issue/actions/runs/636138856

There’s a producer job that includes the following:

steps:
  - run: |
      echo TEST > file.txt
      tar -czf file.tgz file.txt
  - uses: actions/upload-artifact@v2
    with:
      name: a-different-name.tgz
      path: file.tgz

Then, a consumer job does the following:

steps:
  - uses: actions/download-artifact@v2
    with:
      name: a-different-name.tgz
  - run: |
      find .
      tar -xzf a-different-name.tgz
      find .

I expected the downloaded artifact to appear as a-different-name.tgz, which is the name I passed to actions/download-artifact, and is also the name that appears on the Artifacts page. Yet, surprisingly, the downloaded artifact appears as file.tgz.

I tried giving the consumer an explicit path, like the following:

steps:
  - uses: actions/download-artifact@v2
    with:
      name: a-different-name.tgz
      path: a-different-name.tgz
  - run: |
      find .
      tar -xzf a-different-name.tgz
      find .

But this doesn’t work either, I just end up with a file at a-different-name.tgz/file.tgz.

The obvious workaround I’m using for now is to make sure the name and the path are always the same in actions/upload-artifact.
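Another workaround, assuming the consumer knows the original file name, is to rename the file right after downloading:

```yaml
- uses: actions/download-artifact@v2
  with:
    name: a-different-name.tgz
- run: mv file.tgz a-different-name.tgz   # restore the name shown on the Artifacts page
```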

Download artifact fails randomly

For context, we have a workflow for which we need a unique identifier across multiple jobs and so we're generating that, storing it into a file and uploading it via actions/upload-artifact. We then use actions/download-artifact to use that identifier in those jobs.

This works fine for the most part, but in maybe 1 in 100 runs the download fails with an error that the artifact can't be found. Re-running the workflow has always passed. This was an issue with v1 of both the upload and download actions, and after upgrading to v2 it is still an issue.

Here is a log from the upload:

2020-05-28T20:47:40.1234231Z Download action repository 'actions/upload-artifact@v2'
2020-05-28T20:47:41.1871363Z ##[group]Run expr 118336741_$(date +%s) > build-id.txt
2020-05-28T20:47:41.1871609Z �[36;1mexpr 118336741_$(date +%s) > build-id.txt�[0m
2020-05-28T20:47:41.1909153Z shell: /bin/bash -e {0}
2020-05-28T20:47:41.1909398Z ##[endgroup]
2020-05-28T20:47:41.2076591Z ##[group]Run actions/upload-artifact@v2
2020-05-28T20:47:41.2076774Z with:
2020-05-28T20:47:41.2076925Z   name: build-id
2020-05-28T20:47:41.2077060Z   path: build-id.txt
2020-05-28T20:47:41.2077202Z ##[endgroup]
2020-05-28T20:47:42.6328016Z With the provided path, there will be 1 files uploaded
2020-05-28T20:47:42.6328921Z Total size of all the files uploaded is 21 bytes
2020-05-28T20:47:42.6330481Z Finished uploading artifact build-id. Reported size is 21 bytes. There were 0 items that failed to upload
2020-05-28T20:47:42.6330828Z Artifact build-id has been successfully uploaded!

And here is a log from the download:

2020-05-28T20:48:00.2206703Z ##[group]Run actions/download-artifact@v2
2020-05-28T20:48:00.2206993Z with:
2020-05-28T20:48:00.2207134Z   name: build-id
2020-05-28T20:48:00.2207271Z ##[endgroup]
2020-05-28T20:48:00.6146654Z ##[error]Unable to find any artifacts for the associated workflow

Here is a simplified version of the workflow.

on:
  push:

jobs:
  build_id:
    runs-on: ubuntu-latest
    steps:
      - name: Generate build id
        run: expr ${{ github.run_id }}_$(date +%s) > build-id.txt
      - name: Upload build-id
        uses: actions/upload-artifact@v2
        with:
          name: build-id
          path: build-id.txt

  test:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        containers: [1, 2, 3, 4]

    steps:
      - name: Download build-id
        uses: actions/download-artifact@v2
        with:
          name: build-id # this was added only after upgrading to v2 of download-artifact
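One thing that stands out in the simplified workflow: the test job has no needs: dependency on build_id, so the matrix jobs can start (and attempt the download) before the upload has finished. Whether or not that explains the 1-in-100 failures, adding the dependency removes the race:

```yaml
  test:
    needs: build_id        # wait for the upload job before attempting any download
    runs-on: ubuntu-latest
    strategy:
      matrix:
        containers: [1, 2, 3, 4]
    steps:
      - name: Download build-id
        uses: actions/download-artifact@v2
        with:
          name: build-id
```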

download artifact not working in docker image

Hi, I am attempting to use the download/upload actions to share data between 2 jobs in a workflow. Both jobs are triggered based on a user-entered pull-request comment. The first job uploads 2 files (successfully), but the second job fails to download the 2 files.

This workflow runs in a Docker image on the GitHub-hosted runners.

[screenshot]

Here is the failed run output:

[screenshot of the failed run output]

I've been iterating on this with no luck. Here is the workflow file:

  kops-apply-changes:
    name: Kops Apply Changes
    runs-on: ubuntu-latest
    if: github.event.comment.body == '/kops-apply-updates'
    
    steps:
      - uses: actions/checkout@v2
      - name: Apply Kops Changes
        id: kops_changes_update_apply
        uses: ./github-actions/kops-update-changes-action
        env:
          GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
        with: 
          githubToken: ${{ secrets.GITHUB_TOKEN }}
          kops_update_action: 'update-apply'
      - run: echo ${{ steps.kops_changes_update_apply.outputs.cluster_name }} > kops_cluster.out
      - run: echo ${{ steps.kops_changes_update_apply.outputs.kops_state }} > kops_state.out
      - uses: actions/upload-artifact@v1
        with:
          name: cluster_name
          path: kops_cluster.out
      - uses: actions/upload-artifact@v1
        with:
          name: kops_state
          path: kops_state.out

  rolling-update:
    name: Kops Apply Rolling Updates
    runs-on: ubuntu-latest
    if: github.event.comment.body == '/kops-apply-rolling-updates'
    
    steps:
      - uses: actions/checkout@v2
      - uses: actions/download-artifact@v1
        with:
          name: cluster_name
          path: ${{ github.workspace }}/kops_cluster.out
      - uses: actions/download-artifact@v1
        with:
          name: kops_state
          path: ${{ github.workspace }}/kops_state.out
      - shell: bash
        run: |
          cluster_name_value=$(cat ${{ github.workspace }}/kops_cluster.out)
          kops_state_value=$(cat ${{ github.workspace }}/kops_state.out)
        env:
          KOPS_CLUSTER: cluster_name_value
          KOPS_STATE: kops_state_value
      - name: Apply Kops Rolling Update
        id: kops_changes_update_apply
        uses: ./github-actions/kops-update-changes-action
        env:
          GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
        with: 
          githubToken: ${{ secrets.GITHUB_TOKEN }}
          kops_update_action: 'rolling-update-apply'
          cluster_name: cluster_name_value
          kops_state: kops_state_value

Any ideas/thoughts welcome

Add support for tilde expansion

The upload-artifact action supports tilde expansion, i.e. it substitutes ~ with $HOME. The download-artifact action, however, does not seem to support it. This is a bit surprising, because I would expect that using the same path for upload and download would restore the file to the same location.

path example bug

I was having trouble understanding the path option and suspected the example was incorrect. So I tried running the example code from upload-artifact and download-artifact in a workflow. The example for the path option fails with an error:

cat: path/to/artifact: Is a directory

Here is the example from the readme:

steps:
- uses: actions/checkout@v1

- uses: actions/download-artifact@v1
  with:
    name: my-artifact
    path: path/to/artifact
    
- run: cat path/to/artifact

Assuming the example uses the artifact from the upload-artifact example (which should probably be stated explicitly in the readme), I think the run step should be:

- run: cat path/to/artifact/world.txt

This bug caused a lot of confusion for me, since I assumed the downloaded artifact would be stored in the folder path/to, with the file in the artifact named artifact instead of keeping its original filename.

Fails to download artifact that hasn't yet been uploaded

My workflow has two jobs that run in parallel. Job A, the artifact uploader, takes about ~40 seconds before it gets to the actions/upload-artifact step. Job B, the artifact downloader, takes about ~20 seconds before it gets to the actions/download-artifact step. It fails immediately with this error:

Downloading artifact 'frontend' to: '/home/runner/work/sshst/sshst/pubsite/build'
##[error]An Artifact with name "frontend" was not found.
##[error]Exit code 1 returned from process: file name '/home/runner/runners/2.163.1/bin/Runner.PluginHost', arguments 'action "GitHub.Runner.Plugins.Artifact.DownloadArtifact, Runner.Plugins"'.

If I put a step with sleep 60 before the download step, it works great! My initial assumption was that the actions/download-artifact action would wait until the corresponding actions/upload-artifact step succeeds/fails before attempting to download.

Is this behaving as intended? Do you see value in adding that functionality? Alternatively, is there a better workaround than sleep?

Thanks!
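download-artifact does not currently wait; the supported way to guarantee ordering is a job-level needs dependency, at the cost of giving up the parallelism. A sketch (job names and the echo step are placeholders; the frontend artifact name comes from the log above):

```yaml
jobs:
  build-frontend:
    runs-on: ubuntu-latest
    steps:
      - run: echo build output > frontend.txt   # stand-in for the real build
      - uses: actions/upload-artifact@v1
        with:
          name: frontend
          path: frontend.txt
  deploy:
    needs: build-frontend   # starts only after the upload step has completed
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v1
        with:
          name: frontend
```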

Permission denied error for downloaded artifact

Hello! I have this repo with the following ci.yml:

name: CI

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: Mono Build
        run: |
          chmod +x build.sh && ./build.sh
        
      - name: Upload Artifacts
        uses: actions/upload-artifact@v2
        with:
          name: library
          path: bin/autodeploytonugettest.dll
          retention-days: 1
  
  lint:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: Markdown Check
        run: |
          sudo chown -R $(whoami) /usr/local/bin /usr/local/lib /usr/local/include /usr/local/share
          npm install -g markdownlint-cli
          markdownlint *.md
      
      - name: Shell Check
        if: always()
        run: |
          sudo apt-get install shellcheck
          shellcheck *.sh
      - name: Download Artifacts
        uses: actions/download-artifact@v2
        if: always()
        with:
          name: library
          path: bin/autodeploytonugettest.dll
      
      - name: CSharp Check
        if: always()
        run: |
          sudo apt update
          sudo apt-get install gendarme
          chmod +r bin/
          cd bin/
          chmod +r autodeploytonugettest.dll
          gendarme -- autodeploytonugettest.dll

Everything works well except the last command, gendarme -- autodeploytonugettest.dll, which fails with:

An uncaught exception occured. Please fill a bug report at https://bugzilla.novell.com/
Stack trace: System.UnauthorizedAccessException: Access to the path '/home/runner/work/CSharp---Exercise---Other---Auto-deploy-to-NuGet/CSharp---Exercise---Other---Auto-deploy-to-NuGet/bin/autodeploytonugettest.dll' is denied.
  at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share, System.Int32 bufferSize, System.Boolean anonymous, System.IO.FileOptions options) [0x000e0] in <533173d24dae460899d2b10975534bb0>:0 
  at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <533173d24dae460899d2b10975534bb0>:0 
  at (wrapper remoting-invoke-with-check) System.IO.FileStream..ctor(string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
  at Mono.Cecil.ModuleDefinition.GetFileStream (System.String fileName, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x0001c] in <c2b12ab5b3544f9088836118a10c93c9>:0 
  at Mono.Cecil.ModuleDefinition.ReadModule (System.String fileName, Mono.Cecil.ReaderParameters parameters) [0x00000] in <c2b12ab5b3544f9088836118a10c93c9>:0 
  at Mono.Cecil.AssemblyDefinition.ReadAssembly (System.String fileName, Mono.Cecil.ReaderParameters parameters) [0x00000] in <c2b12ab5b3544f9088836118a10c93c9>:0 
  at Gendarme.ConsoleRunner.AddAssembly (System.String filename) [0x0001b] in <e8e05298cccb4938b5f01696042669b0>:0 
  at Gendarme.ConsoleRunner.AddFiles (System.String name) [0x000cf] in <e8e05298cccb4938b5f01696042669b0>:0 
  at Gendarme.ConsoleRunner.Execute (System.String[] args) [0x000f7] in <e8e05298cccb4938b5f01696042669b0>:0 
Error: Process completed with exit code 4.

Why does this happen even when I use sudo to run gendarme? How do I correctly set up permissions here?
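A likely cause: in download-artifact@v2 the path input names a destination directory, so path: bin/autodeploytonugettest.dll creates a directory with that name and places the dll inside it, and gendarme then trips over a directory where it expects a file. Pointing path at the directory instead should put the dll where the later steps expect it:

```yaml
- name: Download Artifacts
  uses: actions/download-artifact@v2
  if: always()
  with:
    name: library
    path: bin   # v2 treats path as a directory; the dll lands at bin/autodeploytonugettest.dll
```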

An Internal Error Occurred.

Hi,

I asked GitHub developer support about caching files between builds and they suggested using artifacts. Uploading the cache worked like a charm, but trying to download it again results in an error that doesn't contain much information about what went wrong.

Run actions/download-artifact@master
  with:
    name: gradle-cache
Download artifact 'gradle-cache' to: '/home/runner/work/xxx/xxx/gradle-cache'
##[error]TF400898: An Internal Error Occurred.
##[error]Exit code 1 returned from process: file name '/home/runner/runners/2.157.0/bin/Runner.PluginHost', arguments 'action "GitHub.Runner.Plugins.Artifact.DownloadArtifact, Runner.Plugins"'.

Put all artifacts back.

I use upload-artifact in several jobs to upload various files. Example:

...
- name: uploadapp
  uses: actions/upload-artifact@v2
  with:
    name: app
    path: js/app.js
...

In the final «package» job (which depends on the previous jobs as appropriate) I need all these files to be put back in the same places they were taken from. So, continuing with the example, I want app.js to be found under js and not, for example, under app/js (as would happen if I used download-artifact without arguments). In other words, the result of running the upload-artifact and download-artifact actions across all jobs should be the same as if a single job had been run, with all steps from all jobs performed in sequence and upload-artifact and download-artifact omitted.

I have many artifacts, and they reside in different places. Creating many steps invoking download-artifact with paths wired in by hand is tedious and will lead to errors; at worst, some artifacts may even be forgotten altogether. I would like to use the option of downloading all artifacts in one step, but in a way that also places them back where they were taken from.

How can I achieve this? Possibly a feature or some explanatory documentation can be added?
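One approach worth trying rests on upload-artifact v2's documented behavior that the path hierarchy is preserved after the first wildcard in a pattern: have every job upload into the same artifact using a wildcard path, then restore everything with a single download into the workspace root. This is a sketch under those assumptions; build-output is a placeholder artifact name, and it relies on v2 allowing multiple jobs to append to the same-named artifact:

```yaml
# In each producing job: the leading wildcard keeps the js/ prefix
# inside the artifact instead of flattening it away.
- uses: actions/upload-artifact@v2
  with:
    name: build-output
    path: '**/js/app.js'

# In the package job: one download puts every file back in place.
- uses: actions/download-artifact@v2
  with:
    name: build-output
    path: .
```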


Download from a different workflow

How do you download an artifact from another workflow?

A little bit of background: we have a build workflow that builds an artifact that we publish. Then we have a separate workflow, triggered on deployment, that needs to access this artifact. I tried using the same name for the artifact in download-artifact and upload-artifact, hoping that it would download the files, but that doesn't seem to work.
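The action itself only sees artifacts from the current workflow run, but the REST API can reach artifacts from any run in the repository. A sketch (my-artifact is a placeholder, and jq is assumed to be available on the runner) that grabs the newest matching artifact:

```yaml
- name: Download artifact from another workflow (sketch)
  run: |
    # List the repo's artifacts, take the newest one with the right name,
    # and fetch its zip through archive_download_url.
    url=$(curl -s -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
      "https://api.github.com/repos/$GITHUB_REPOSITORY/actions/artifacts" \
      | jq -r '[.artifacts[] | select(.name == "my-artifact")][0].archive_download_url')
    curl -sL -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -o artifact.zip "$url"
    unzip artifact.zip
```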

Z_BUF_ERROR

Hi! There is an error after upgrade to actions-runner-linux-arm-2.272.0 at actions/download-artifact@v2 stage:

Total file count: 2 ---- Processed file #0 (0.0%)
Total file count: 2 ---- Processed file #0 (0.0%)
Total file count: 2 ---- Processed file #0 (0.0%)
Total file count: 2 ---- Processed file #0 (0.0%)
Total file count: 2 ---- Processed file #0 (0.0%)
Total file count: 2 ---- Processed file #0 (0.0%)
events.js:187
      throw er; // Unhandled 'error' event
      ^

Error: unexpected end of file
    at Zlib.zlibOnError [as onerror] (zlib.js:170:17)
Emitted 'error' event on Gunzip instance at:
    at Zlib.zlibOnError [as onerror] (zlib.js:173:8) {
  errno: -5,
  code: 'Z_BUF_ERROR'
}

Permission denied when downloading artifact

We get the following permission denied error when trying to download the artifact from a previous job:

Run actions/download-artifact@v2
  with:
    name: saucectlbin
/usr/bin/docker exec  b3253aa2b26618a0d6b4b42a2899d7db08193dcc7a46c30efcd8dc18ad1f351f sh -c "cat /etc/*release | grep ^ID"
Starting download for saucectlbin
Directory structure has been setup for the artifact
Total number of files that will be downloaded: 1
events.js:187
      throw er; // Unhandled 'error' event
      ^

Error: EACCES: permission denied, open '/__w/saucectl/saucectl/saucectl'
Emitted 'error' event on WriteStream instance at:
    at internal/fs/streams.js:294:12
    at FSReqCallback.oncomplete (fs.js:146:23) {
  errno: -13,
  code: 'EACCES',
  syscall: 'open',
  path: '/__w/saucectl/saucectl/saucectl'
}

Here's the job definition:

  puppeteer:
    needs: basic
    runs-on: ubuntu-latest
    container:
      image: saucelabs/stt-puppeteer-jest-node:v0.2.0

    steps:
      # appears that checkout@v2 uses javascript which is not compatible
      # with the included node version in the container image.
      - name: Checkout Code
        uses: actions/checkout@v1

      - name: Download saucectl binary
        uses: actions/download-artifact@v2
        with:
          name: saucectlbin

      - name: Workaround for container permissions
        run: |
          sudo chown -R $USER:$(id -gn $USER) /github/home

      - name: Run Sauce Pipeline Test
        run: |
          ./saucectl run -c ./.sauce/puppeteer.yml
        env:
          BUILD_ID: ${{ github.run_id }}

Link to pipeline: https://github.com/saucelabs/saucectl/runs/1004528103?check_suite_focus=true

Is there a workaround available? Is our setup wrong somewhere?

Limitations on downloading artifacts?

What are the restrictions on downloading artifacts: are they accessible across builds within a repository, within a branch, within a particular workflow (a single YAML file), or only within a single job? What is the rationale for limiting the context in which an artifact can be obtained? I saw mention of this, but not a really clear explanation.

Specifically, the documentation page Storing workflow data as artifacts says: "Note: You can only download artifacts in a workflow that were uploaded during the same workflow run." What does "during the same workflow run" mean?

My use-case is that I have a heavy dependency code that I need to compile before I can build my main code. That dependency code (specifically, DAKOTA, https://dakota.sandia.gov) is not hosted on Github, and does not provide convenient binaries for the platform I am interested in (specifically, Ubuntu). I thought of building DAKOTA using a separate manually-triggered workflow, but that is not a supported use-case.

I will have to examine other approaches, I guess.

Is it possible to not fail the workflow if the artifact doesn't exist?

We've been having intermittent issues uploading our coverage reports from our GitHub Action. These show up as 403s, which cause really, really long jobs. So that this doesn't block the steps that come after test coverage, we are moving the coverage upload to a final step after all the others.

The issue we have is that coverage is not always generated. In those cases, download artifact fails that final step and the entire workflow is marked as failed.

[screenshot of the failed workflow run]

Our flow:

  • check whether any packages have changed since the last run
  • if changed, run tests
  • upload coverage (if no tests are run, no coverage exists, but the uploader does gracefully exit)
  • all other steps
  • attempt to download coverage (fails if no coverage is found)

Is there a way to fail gracefully if the artifact hasn't been uploaded, since in some cases this is not a failure from our standpoint?


Ninja edit: I did see the continue-on-error option, but the overall workflow is still shown with the red 'x', so at a glance it appears to consumers that our entire release is failing.
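One way to keep the check green without continue-on-error is to test for the artifact first and guard the download with an if condition. A sketch, where coverage is a placeholder artifact name and jq is assumed to be available on the runner:

```yaml
- name: Check whether coverage was uploaded
  id: coverage-check
  run: |
    # Count artifacts named "coverage" in the current run.
    count=$(curl -s -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
      "https://api.github.com/repos/$GITHUB_REPOSITORY/actions/runs/${{ github.run_id }}/artifacts" \
      | jq '[.artifacts[] | select(.name == "coverage")] | length')
    echo "::set-output name=exists::$count"
- name: Download coverage
  if: steps.coverage-check.outputs.exists != '0'
  uses: actions/download-artifact@v2
  with:
    name: coverage
```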

Try out v2 Preview

The v2-preview of download-artifact is out and we need your help!

You can try it out by using actions/download-artifact@v2-preview.
The associated code and documentation can be found here: https://github.com/actions/download-artifact/tree/v2-preview

This issue is for general feedback and for reporting any bugs/issues encountered during the preview.

There is also a v2-preview of upload-artifact, see: actions/upload-artifact#62

Warning: At any time during the preview, there may be unannounced changes that can cause things to break. It is recommended not to use the preview of this action in critical workflows


Historical Context around v2-preview and v1

The v1 versions of download-artifact (and upload-artifact ) are plugins that are executed by the runner. The code for v1 can be found here: https://github.com/actions/runner/tree/master/src/Runner.Plugins/Artifact

The v1 code is written in C# and is tightly coupled to the runner; it also uses special APIs that only the runner can use to interact with artifacts. If any changes or updates related to artifacts had to be made, they had to be done on the runner, and rolling out a new release would take a significant amount of time. With v2 there is no dependency on the runner, so it will be much easier and faster to make changes and accept community contributions (previously this was pretty much impossible).

The v2-preview of download-artifact has been rewritten from scratch in TypeScript with a new set of APIs that allow it to interact with artifacts (previously only the runner could do this). There is a new NPM package called @actions/artifact that contains the core functionality for interacting with artifacts (which this action uses for the most part). This NPM package is hosted in the actions/toolkit repo, so anyone can use it to interact with artifacts when developing actions. You can find the package here (lots of documentation and extra info):
https://www.npmjs.com/package/@actions/artifact
https://github.com/actions/toolkit/tree/master/packages/artifact

Further work

Since the v2-preview is effectively a total rewrite of v1, there is huge potential for bugs, so it needs to be tested thoroughly before an actual v2 release is created. We need help testing the core functionality, which includes:

  • Downloading a single artifact
  • Downloading multiple artifacts at once

There will be no new features added as part of the v2-preview; we need to test the core functionality first. Once a v2 release is out, we can start chipping away at the issues/features that we have been unable to address with v1 🙂 For the moment, please don't submit PRs for any new features to the v2-preview branch.

Download artifact from another repository

How do we download an artifact from another repository?
Or even from another branch? Say I'm on a branch and would like to pull an artifact from main to compare/test against.

In gitlab:

needs:
  - project: ext_project/proj_name
    job: build
    ref: master
    artifacts: true

In GitHub, the only option seems to be downloading from within the existing workflow run.

- uses: actions/download-artifact@v2
  with:
    name: my-artifact
    path: path/to/artifact

I see this API: https://docs.github.com/en/rest/reference/actions#list-artifacts-for-a-repository
But I'm not sure whether it's possible to list by git ref (master, main, a branch name, etc.), or whether it just shows all artifacts?

Don't use actions/download-artifact@master in your workflow

During the beta of GitHub Actions, a lot of documentation (including the README for this action) showed example YAML with master being referenced:

uses: actions/download-artifact@master

We currently encourage users to not use master as a reference whenever an action is being used. Instead, one of the available tags should be used such as:

uses: actions/download-artifact@v1

Tags are used alongside semantic versioning to provide a stable experience: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-actions#versioning-your-action

The master branch can abruptly change without notice during development which can cause workflows to unexpectedly fail. If you want to avoid unexpected changes or failures, you should use a tag when referring to a specific version of an action. Tags are added to stable versions that have undergone significant testing and should not change.

The v2 versions of download-artifact and upload-artifact are currently in development. Expect changes to start showing up in a releases/v2-beta branch around mid-late January. These new changes will eventually be merged into master (we will communicate about this in the future) and that will have the potential to break your workflows if you are using @master so you will have to react by updating your YAML to use a tag.

Our telemetry indicates a significant number of users are using @master. Good practice for all actions is to use a tag instead of referencing @master.

Z_BUF_ERROR on large Docker images after two minutes of being stalled

Using GitHub Actions to build Docker images. Until now it was just Alpine images, but I also want to create Debian-based images as well. Those images are significantly bigger, and as such they take longer to upload as artifacts. These images are nearing 4 GB in size and at this point have a 100% failure rate.

Link to the logs: https://github.com/WyriHaximusNet/docker-php/runs/973243081?check_suite_focus=true#step:4:217
Raw logs:

2020-08-11T21:39:57.0777951Z shell: /bin/bash -e {0}
2020-08-11T21:39:57.0778133Z env:
2020-08-11T21:39:57.0778273Z   DOCKER_IMAGE: wyrihaximusnet/php
2020-08-11T21:39:57.0778401Z   DOCKER_BUILDKIT: 1
2020-08-11T21:39:57.0778533Z ##[endgroup]
2020-08-11T21:39:57.1053333Z   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
2020-08-11T21:39:57.1055891Z                                  Dload  Upload   Total   Spent    Left  Speed
2020-08-11T21:39:57.1058114Z 
2020-08-11T21:39:57.1339680Z   0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
2020-08-11T21:39:57.1342375Z 100   636  100   636    0     0  28909      0 --:--:-- --:--:-- --:--:-- 28909
2020-08-11T21:39:57.2466546Z 
2020-08-11T21:39:57.2471653Z 100 9631k  100 9631k    0     0  66.7M      0 --:--:-- --:--:-- --:--:-- 66.7M
2020-08-11T21:39:57.2644972Z ##[group]Run actions/download-artifact@v2
2020-08-11T21:39:57.2645158Z with:
2020-08-11T21:39:57.2645277Z   name: docker-image-zts-zts-7.2-debian-buster-buster
2020-08-11T21:39:57.2645392Z   path: ./docker-image
2020-08-11T21:39:57.2645629Z env:
2020-08-11T21:39:57.2645739Z   DOCKER_IMAGE: wyrihaximusnet/php
2020-08-11T21:39:57.2646008Z   DOCKER_BUILDKIT: 1
2020-08-11T21:39:57.2646096Z ##[endgroup]
2020-08-11T21:39:57.3162022Z Starting download for docker-image-zts-zts-7.2-debian-buster-buster
2020-08-11T21:39:57.4899696Z Directory structure has been setup for the artifact
2020-08-11T21:39:57.4902004Z Total number of files that will be downloaded: 2
2020-08-11T21:39:58.5764113Z Total file count: 2 ---- Processed file #1 (50.0%)
2020-08-11T21:40:00.0779400Z Total file count: 2 ---- Processed file #1 (50.0%)
2020-08-11T21:40:00.4904782Z Total file count: 2 ---- Processed file #1 (50.0%)
2020-08-11T21:40:01.4906038Z Total file count: 2 ---- Processed file #1 (50.0%)
[... the same progress line repeats roughly once per second with no further progress ...]
2020-08-11T21:43:13.6336348Z Total file count: 2 ---- Processed file #1 (50.0%)
2020-08-11T21:43:14.6349482Z Total file count: 2 ---- Processed file #1 (50.0%)
2020-08-11T21:43:15.3322944Z events.js:187
2020-08-11T21:43:15.3323908Z       throw er; // Unhandled 'error' event
2020-08-11T21:43:15.3324079Z       ^
2020-08-11T21:43:15.3324122Z 
2020-08-11T21:43:15.3324563Z Error: unexpected end of file
2020-08-11T21:43:15.3324845Z     at Zlib.zlibOnError [as onerror] (zlib.js:170:17)
2020-08-11T21:43:15.3325319Z Emitted 'error' event on Gunzip instance at:
2020-08-11T21:43:15.3325605Z     at Zlib.zlibOnError [as onerror] (zlib.js:173:8) {
2020-08-11T21:43:15.3325824Z   errno: -5,
2020-08-11T21:43:15.3326024Z   code: 'Z_BUF_ERROR'
2020-08-11T21:43:15.3326106Z }
2020-08-11T21:43:15.3441578Z ##[group]Run exit 0
2020-08-11T21:43:15.3441749Z exit 0

download-artifact loses permissions on download

The baseline behavior of the zip utility on Linux and macOS is to retain permissions.

However, when the download-artifact pipeline zips a directory, it loses those permissions. With permissions lost in the resulting asset zipfile, the artifacts are subsequently broken for users and for downstream automated scripts.

Yes, one option is to write a "github-download-permission-fixer" script that users must run after extracting the assets. However, this could be a lot of work to write and maintain, and hard to justify if it simply works around broken behavior in GitHub. Likewise, if you're dealing with multiple platforms (Windows, macOS, Linux), the script will need to be cross-platform and conform to the lowest common denominator available on all platforms (such as bash 3.x on macOS). The alternative is to ask users to install prerequisites (such as bash 4.x) before they can run the script.

Expected behavior: permissions, as defined and applied by prior steps in the workflow to the resulting asset files and directories, should not be dropped or ignored by the download-artifact zipper, and should be present in the resulting asset zip file.
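A widely used workaround, pending a fix, is to tar the files before uploading so the permissions travel inside an archive the artifact pipeline never inspects. A sketch under that assumption (the artifact and directory names are illustrative):

```yaml
# In the producing job: wrap the output in a tarball first.
- run: tar -cvf dist.tar dist/
- uses: actions/upload-artifact@v2
  with:
    name: dist
    path: dist.tar

# In the consuming job: unpack to restore permissions.
- uses: actions/download-artifact@v2
  with:
    name: dist
- run: tar -xvf dist.tar
```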

Hi @tarcila, the bug is due to the fact that there are two folders whose paths differ only in their casing:
/installed/x64-linux/share/Catch2
/installed/x64-linux/share/catch2

It seems that the download-artifact action is creating only one of the folders, thinking the other one is the same. We're looking into how best to fix this issue.

Originally posted by @lkillgore in #12 (comment)

spurious 404s during downloads

In one of my workflows, I have one job that makes an artifact and uploads it using actions/upload-artifact, then a dependent job downloads it using actions/download-artifact. Most of the time, this works well enough. However, I've seen multiple spurious 404 errors such as what I've pasted below. Rerunning the workflow typically works.

Starting download for build-artifacts
Directory structure has been setup for the artifact
Total number of files that will be downloaded: 156
##### Begin Diagnostic HTTP information #####
Status Code: 404
Status Message: Not Found
Header Information: {
  "cache-control": "no-store,no-cache",
  "pragma": "no-cache",
  "content-length": "348",
  "content-type": "application/json; charset=utf-8",
  "strict-transport-security": "max-age=2592000",
  "x-tfs-processid": "af0933a7-c1b0-4385-8a55-6aac682d6c86",
  "activityid": "b842aaf9-4b84-4c10-8352-6988ba2c577e",
  "x-tfs-session": "b842aaf9-4b84-4c10-8352-6988ba2c577e",
  "x-vss-e2eid": "b842aaf9-4b84-4c10-8352-6988ba2c577e",
  "x-vss-senderdeploymentid": "13a19993-c6bc-326c-afb4-32c5519f46f0",
  "x-frame-options": "SAMEORIGIN",
  "x-msedge-ref": "Ref A: 24E9DF59C6DF41AEB0B4F4ADD82669E6 Ref B: BLUEDGE0213 Ref C: 2020-10-29T15:55:07Z",
  "date": "Thu, 29 Oct 2020 15:55:07 GMT"
}
###### End Diagnostic HTTP information ######
Error: Unable to download the artifact: Error: Unexpected http 404 during download for https://pipelines.actions.githubusercontent.com/5VR8pVAJWOch570pQRgQ7EctEDmSm2LFZLvggBFEZdahLnzvWU/_apis/resources/Containers/3686237?itemPath=build-artifacts%2Fservice%2Fapi-service.zip

Example failing run: https://github.com/microsoft/onefuzz/pull/229/checks?check_run_id=1327596964

Have flag to create `path`

Support a flag to create the destination directory.

For comparison, Info-ZIP 6.0's unzip -d creates the specified directory.
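Until such a flag exists, the destination can be created in a preceding step if your version of the action does not create it for you (the path shown is illustrative):

```yaml
- run: mkdir -p path/to/dest
- uses: actions/download-artifact@v2
  with:
    name: my-artifact
    path: path/to/dest
```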

download multiple files at once?

Could we download multiple folders at once?

    - name: Download results 
      continue-on-error: true
      uses: actions/download-artifact@v1
      with:
        name: functional_test_docker_ubuntu
        name: barmetal_test_docker_ubuntu

Or do I have to repeat this block if I want to download 8 folders?
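With v1 each step takes a single name, so the block does have to be repeated per artifact. A sketch of both options (v2 downloads every artifact when name is omitted, as noted elsewhere on this page):

```yaml
# Option 1: one download step per artifact (works with v1).
- uses: actions/download-artifact@v1
  with:
    name: functional_test_docker_ubuntu
- uses: actions/download-artifact@v1
  with:
    name: barmetal_test_docker_ubuntu

# Option 2: with v2, omit "name" to download all artifacts,
# each into its own sub-directory under "path".
- uses: actions/download-artifact@v2
  with:
    path: artifacts
```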

Feature request: retry policy

In my pipelines I see artifact download problems related to the network from time to time. These failures are not very frequent, but they do happen.

Here's an example of the problem I see (log excerpt below):

Starting download for ispc_llvm11_linux
##### Begin Diagnostic HTTP information #####
Status Code: 503
Status Message: Service Unavailable
Header Information: {
  "cache-control": "no-store",
  "content-length": "228",
  "content-type": "text/html",
  "server": "Microsoft-IIS/10.0",
  "x-msedge-ref": "Ref A: F6F0AF70C73C4DA5BA5EF5FA16A46967 Ref B: CO1EDGE0910 Ref C: 2020-12-07T08:15:57Z",
  "date": "Mon, 07 Dec 2020 08:16:15 GMT",
  "connection": "close"
}
###### End Diagnostic HTTP information ######
Error: Unable to get ContainersItems from https://pipelines.actions.githubusercontent.com/vRWS246tQliPVeJkfd7cOfXg6DKlc5d3gGpSXpb3WtbhcjV7Xt/_apis/resources/Containers/8374578?itemPath=ispc_llvm11_linux

The artifact does exist and other jobs are able to download it. The most logical way to fight this kind of failure is to retry with a delay. Something like 3 retries with a 15-second delay between them sounds like a reasonable thing to do in this case.

Could you please support retries?
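Until the action retries internally, a crude workaround is to mark the first attempt continue-on-error and repeat it conditionally. A sketch (the step id, delay, and artifact name are illustrative):

```yaml
- uses: actions/download-artifact@v2
  id: download
  continue-on-error: true
  with:
    name: ispc_llvm11_linux

# Wait a bit, then retry only if the first attempt failed.
- if: steps.download.outcome == 'failure'
  run: sleep 15
- uses: actions/download-artifact@v2
  if: steps.download.outcome == 'failure'
  with:
    name: ispc_llvm11_linux
```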

support for "workflow_run" event triggers

Support for the newly introduced GitHub Actions event "workflow_run" seems broken.

see my test environment: https://github.com/jkowalleck/playground_workflow_run

playground_workflow_run

Playground for the GitHub event "workflow_run".

For example, if your pull_request workflow generates build artifacts,
you can create a new workflow that uses workflow_run to analyze the
results and add a comment to the original pull request.

The goal of this playground is to see if downloading and processing
artifacts in a "workflow_run" event works as expected.

expectation

In a workflow triggered by the "workflow_run" event,
I can download artifacts from the workflow run that triggered it.

conclusion

The action "download-artifact" seems to be broken on the "workflow_run" event.

observations

how i tested

  1. action "TESTS" is triggered on push.
    • it creates an artifact "reports".
    • it downloads that exact same artifact "reports".
  2. action "TESTS DONE" is triggered
    after the action "TESTS" completed.
    • it downloads that exact same artifact "reports" from the "TESTS" action.

Feature request: output downloaded paths in log

Similar to #11, but I would like to be able to see the paths to the downloaded files in the log. Perhaps the variable output implemented for #11 would show up in the log and satisfy this request. My problem is that I regularly get the path parameter wrong and end up with situations like the one below, where the file is not found in a subsequent step, and I then have to add a tmate action or other debug commands, or just start guessing...

[screenshot of the failing step]

Feature request: wildcard support (or download-all)

One of the really useful things about uploading artifacts is caching logs, coverage reports, et cetera. Those often have timestamps or UUIDs as names.

You can hack around it with a compress-to-a-known-name step, but uploading and downloading artifacts by wildcard (or even just a download-all) would be a huge benefit for post-processing, e.g. coverage merging.

Feature request: an output containing the path to extracted artifacts

Given that 'path' is an optional input (it may come from the upload action), it would be nice if this action provided an output 'path'. This would reduce duplication of data in subsequent steps, which have no knowledge of the inputs passed to the step containing the upload action.
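Until the action exposes such an output, one workaround is to pin the destination explicitly and reuse the same expression in later steps, so no step has to guess (the artifact name is illustrative):

```yaml
- uses: actions/download-artifact@v2
  with:
    name: my-artifact
    path: ${{ runner.temp }}/my-artifact

# Later steps can reference the same expression.
- run: ls "${{ runner.temp }}/my-artifact"
```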

Fail to download an artifact previously uploaded by `actions/upload-artifact`

Hi,

I am discovering GitHub Actions and am running into an issue with artifacts. When trying to download a specific artifact, it fails with the following message:

##[warning]Fail to download 'test/installed/x64-linux/share/Catch2/gdbinit', error: Could not find a part of the path '/home/runner/work/gha-download-bug/gha-download-bug/test/installed/x64-linux/share/Catch2/gdbinit'. (Downloader 1)
##[warning]Back off 25.301 seconds before retry. (Downloader 1)
##[warning]Fail to download 'test/installed/x64-linux/share/Catch2/lldbinit', error: Could not find a part of the path '/home/runner/work/gha-download-bug/gha-download-bug/test/installed/x64-linux/share/Catch2/lldbinit'. (Downloader 0)
##[warning]Back off 22.004 seconds before retry. (Downloader 0)

Not sure what makes this artifact problematic. For a reproducer, see:

Strange percentages and file counts when downloading artifacts

For example:

Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1929 (3858.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1981 (3001.5%)
Full log:
Run actions/download-artifact@v2
  with:
    path: artifacts
No artifact name specified, downloading all artifacts
Creating an extra directory for each artifact that is being downloaded
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #25 (50.0%)
Total file count: 50 ---- Processed file #51 (102.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #82 (124.2%)
Total file count: 66 ---- Processed file #108 (163.6%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #149 (413.8%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #193 (386.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #243 (368.1%)
Total file count: 66 ---- Processed file #272 (412.1%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #307 (852.7%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #349 (698.0%)
Total file count: 50 ---- Processed file #367 (734.0%)
Total file count: 50 ---- Processed file #371 (742.0%)
(same line repeated 8 more times)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #401 (607.5%)
Total file count: 66 ---- Processed file #418 (633.3%)
Total file count: 66 ---- Processed file #433 (656.0%)
Total file count: 66 ---- Processed file #437 (662.1%)
(same line repeated 6 more times)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #465 (1291.6%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #505 (1010.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #558 (845.4%)
Total file count: 66 ---- Processed file #583 (883.3%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #623 (1730.5%)
Total file count: 36 ---- Processed file #631 (1752.7%)
(same line repeated 16 more times)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #665 (1330.0%)
Total file count: 50 ---- Processed file #687 (1374.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #714 (1081.8%)
Total file count: 66 ---- Processed file #744 (1127.2%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #782 (2172.2%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #824 (1648.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #874 (1324.2%)
Total file count: 66 ---- Processed file #902 (1366.6%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #934 (2594.4%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #981 (1962.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1024 (1551.5%)
Total file count: 66 ---- Processed file #1044 (1581.8%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #1095 (3041.6%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1139 (2278.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1184 (1793.9%)
Total file count: 66 ---- Processed file #1212 (1836.3%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #1253 (3480.5%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1297 (2594.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1347 (2040.9%)
Total file count: 66 ---- Processed file #1375 (2083.3%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #1414 (3927.7%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1456 (2912.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1506 (2281.8%)
Total file count: 66 ---- Processed file #1537 (2328.7%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #1573 (4369.4%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1613 (3226.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1664 (2521.2%)
Total file count: 66 ---- Processed file #1694 (2566.6%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #1731 (4808.3%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1771 (3542.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1818 (2754.5%)
Total file count: 66 ---- Processed file #1845 (2795.4%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #1886 (5238.8%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1929 (3858.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1981 (3001.5%)
Total file count: 66 ---- Processed file #2011 (3046.9%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #2045 (5680.5%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #2087 (4174.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #2139 (3240.9%)
Total file count: 66 ---- Processed file #2168 (3284.8%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #2205 (6125.0%)
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #2247 (4494.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #2295 (3477.2%)
Total file count: 66 ---- Processed file #2318 (3512.1%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #2363 (6563.8%)
There were 60 artifacts downloaded

Example IRL: https://github.com/TWiStErRob/net.twisterrob.gradle/pull/110/checks?check_run_id=2595895174
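For context, the log above comes from invoking the action with no `name` input; in that mode download-artifact fetches every artifact of the run, each into its own sub-directory. A minimal sketch of that invocation (the `path` value is illustrative, behaviour as of the v2 action):

```yaml
    - name: Download all artifacts
      uses: actions/download-artifact@v2
      with:
        # With no `name`, every artifact of the run is downloaded,
        # each into all-artifacts/<artifact-name>/
        path: all-artifacts
```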

Download fails with TF400898

I am playing around with upload and download of artifacts and have so far not managed to perform a download, though the upload seems to succeed. It's very likely just me using it incorrectly, but the error generated is not very clear, to say the least:

(screenshot of the TF400898 error)

Here is one version of my workflow file causing this (as you might guess from the commented stuff, i have been trying a few different things):

name: ASP.NET Core CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    
    steps:
    - uses: actions/checkout@v1
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 2.2.108
    - name: Build with dotnet
      run: dotnet build --configuration Release
    - name: Test with dotnet
      run: dotnet test -c Release --no-build
    - name: Publish with dotnet
      run: dotnet publish -c Release --no-build -o output
    - name: Upload
      uses: actions/upload-artifact@master
      with:
        name: output
        path: output

  process:
    runs-on: windows-latest
    steps:
    - name: Mkdir
      run: mkdir output
    - name: Download
      uses: actions/download-artifact@master
      with:
        name: output
#        path: output

  #   - name: List download
  #     run: dir ./output
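The TF400898 error itself is opaque, but one likely cause is visible in the workflow: the `process` job declares no `needs: build`, so both jobs start in parallel and the `output` artifact may not exist yet when the download runs. A minimal sketch of the dependent job, reusing the job and artifact names from the workflow above (the action creates the target directory itself, so the `mkdir` step is not needed):

```yaml
  process:
    runs-on: windows-latest
    needs: build              # wait for the build job (and its upload) to finish
    steps:
    - name: Download
      uses: actions/download-artifact@master
      with:
        name: output
        path: output          # download-artifact creates this directory
    - name: List download
      run: dir ./output
```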
