actions / download-artifact
License: MIT License
Currently it is only possible to download all artifacts or a single artifact. We need to add the possibility to download several artifacts whose names match a given pattern.
I used the artifacts with my Firebase deployment project, which expects the dist folder configured in the public section of the Firebase config to contain all the artifacts. With the following snippet:
name: Deploy Firebase
on:
  push:
    branches:
      - master
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@master
      - name: Install Dependencies
        run: npm install
      - name: Build
        run: npm run build
      - name: Archive Production Artifact
        uses: actions/upload-artifact@v2
        with:
          name: dist
          path: dist
  deploy:
    name: Deploy
    needs: build
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@master
      - name: Download Artifact
        uses: actions/download-artifact@v2
        with:
          name: dist
      - name: Deploy to Firebase
        uses: w9jds/firebase-action@master
        with:
          args: deploy --only hosting:stage
        env:
          FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }}
          PROJECT_NAME: name
it doesn't work: the dist archive isn't unpacked into the dist folder when downloaded; instead it is unpacked into the current working directory. Rolling back to v1.0.0 fixes the problem.
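If I understand the v2 behavior correctly (v2 extracts into the current working directory by default, while v1 created a folder named after the artifact), the old layout can be restored by passing an explicit path. A sketch, assuming the artifact is still named dist:

```yaml
- name: Download Artifact
  uses: actions/download-artifact@v2
  with:
    name: dist
    path: dist   # without this, v2 unpacks into the working directory
```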
Add cache to workflows using actions/setup-node.
The setup-node GitHub Action just released a new option to add caching to steps that use it. You can find the details here: https://github.blog/changelog/2021-07-02-github-actions-setup-node-now-supports-dependency-caching/
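For reference, the new option is a cache input on the setup-node step; a minimal sketch, assuming an npm project with a lockfile:

```yaml
- uses: actions/setup-node@v2
  with:
    node-version: '14'
    cache: 'npm'   # caches the package manager's cache directory between runs
```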
Please document which permissions are required for a workflow to be able to use this action.
The idea is that this would introduce better options for parallelization inside a workflow.
If I understand right, currently only jobs can need (wait for) other jobs to finish.
If job B expects an artifact from job A, it must wait for job A to finish completely.
If job B was able to wait (block) for some artifact from job A, it would allow us to run both jobs
at the same time and do their stuff (for example build Docker images and prepare other stuff, which can take quite a lot of time) simultaneously. Then, job B could simply wait until job A uploads expected artifact - and then job B could continue, being sure that it has everything it needs.
Obviously some kind of timeout configuration should be possible (mandatory?), after which the step would fail if the expected artifact never appeared.
I'm just wondering if such idea makes sense to anyone else, too? 🤔 Or is there already some way how to do this kind of thing?
inputs:
  name:
    description: 'Artifact name'
    required: true
  path:
    description: 'Destination path'
This indicates that name is required but path is not. But in my action:
Run actions/upload-artifact@v1
##[error]Value cannot be null.
Parameter name: path
##[error]Exit code 1 returned from process: file name '/home/runner/runners/2.158.0/bin/Runner.PluginHost', arguments 'action "GitHub.Runner.Plugins.Artifact.PublishArtifact, Runner.Plugins"'.
Is there a way to download the artifact of the latest GitHub Action run for a specific repository?
I need to download them to my local computer. Is there a script that offers such a feature?
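There is no built-in script in this repo for that, but the GitHub REST API exposes the artifacts of a repository. A sketch using curl and jq; OWNER, REPO, and the GITHUB_TOKEN environment variable are placeholders you must supply:

```shell
# Build the REST endpoint for a repository's artifacts (newest first in the response).
artifacts_url() {
  echo "https://api.github.com/repos/$1/$2/actions/artifacts"
}

# List artifacts and grab the newest one's download URL, e.g.:
#   curl -s -H "Authorization: token $GITHUB_TOKEN" "$(artifacts_url OWNER REPO)" \
#     | jq -r '.artifacts[0].archive_download_url'
# Then fetch it as a zip (the endpoint answers with a redirect, hence -L):
#   curl -sL -H "Authorization: token $GITHUB_TOKEN" -o artifact.zip "<that URL>"
artifacts_url octocat hello-world
```

The GitHub CLI also offers `gh run download` for pulling artifacts of a run to your local machine, which may be more convenient than raw API calls.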
I was trying to pass a compiled executable from one job to another, only to find it was no longer executable. I understand that under the hood zip files are used, but it would have been nice to have some sort of warning that you will lose all the permissions once files go through the upload/download cycle.
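A common workaround is to wrap the file in a tarball before uploading, since tar preserves mode bits while the zip round-trip does not. A sketch, using a throwaway script (`mybinary` is a stand-in name) to demonstrate:

```shell
# Stand-in for the compiled executable.
printf '#!/bin/sh\necho ok\n' > mybinary
chmod +x mybinary
tar -cf artifact.tar mybinary    # upload artifact.tar instead of the bare file
rm mybinary
tar -xf artifact.tar             # what the downloading job would run
./mybinary                       # the executable bit survived the round trip; prints "ok"
```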
Hi,
I am getting the following error every time I try to download an artifact that was previously uploaded. The artifact to be downloaded is a directory filled with contents.
Run actions/download-artifact@v2
with:
name: je-build
Starting download for je-build
Directory structure has been setup for the artifact
Total number of files that will be downloaded: 40
events.js:187
throw er; // Unhandled 'error' event
^
Error: EISDIR: illegal operation on a directory, open '/home/runner/work/[repo name]/[repo name]/[folder in repo]'
Emitted 'error' event on WriteStream instance at:
at internal/fs/streams.js:294:12
at FSReqCallback.oncomplete (fs.js:146:23) {
errno: -21,
code: 'EISDIR',
syscall: 'open',
path: '/home/runner/work/[repo name]/[repo name]/[folder in repo]'
I've omitted the names of my repo. I tried looking for the events.js file in the repo for actions/download-artifact@v2, but I couldn't find it. Does anyone know how to fix this? It appears to be some backend issue.
Kind Regards,
Rowan
Since one of the use cases for uploading/downloading an artifact is persisting workflow data, as described in the documentation, it would be great to have a parameter that allows deletion of the downloaded artifact. This is possible via the API, as seen in the documentation.
Otherwise it is easy to quickly run out of storage for Actions and Packages.
Having an additional action just to clean up seems like unnecessary overhead.
👋 This issue is to track the move to using main as the default branch for this repo. We'd love your team's help in completing this transition.
Do not remove your old default branch; customers are going to be using it. We will be sending messages out about these changes, but if you want to post a message in your repository, that's fine as well.
We are aiming to complete this work by July 24th (originally July 17th).
Here’s a workflow that demonstrates what I’m talking about: https://github.com/leafac/github-actions-download-artifact--name-issue/actions/runs/636138856
There’s a producer job that includes the following:
steps:
  - run: |
      echo TEST > file.txt
      tar -czf file.tgz file.txt
  - uses: actions/upload-artifact@v2
    with:
      name: a-different-name.tgz
      path: file.tgz
Then, a consumer job does the following:
steps:
  - uses: actions/download-artifact@v2
    with:
      name: a-different-name.tgz
  - run: |
      find .
      tar -xzf a-different-name.tgz
      find .
I expected the downloaded artifact to appear as a-different-name.tgz, which is the name I passed to actions/download-artifact, and is also the name that appears on the Artifacts page. Yet, surprisingly, the downloaded artifact appears as file.tgz.
I tried giving the consumer an explicit path, like the following:
steps:
  - uses: actions/download-artifact@v2
    with:
      name: a-different-name.tgz
      path: a-different-name.tgz
  - run: |
      find .
      tar -xzf a-different-name.tgz
      find .
But this doesn’t work either; I just end up with a file at a-different-name.tgz/file.tgz.
The obvious workaround I’m using for now is to make sure the name and the path are always the same in actions/upload-artifact.
For context, we have a workflow for which we need a unique identifier across multiple jobs, so we're generating that, storing it in a file, and uploading it via actions/upload-artifact. We then use actions/download-artifact to use that identifier in those jobs.
This works fine for the most part, but maybe around 1 in 100 runs this fails during the download with an error that the artifact can't be found. When re-running the workflow it has always passed. This was an issue with v1 of both upload and download actions and after upgrading to v2 this is still an issue.
Here is a log from the upload:
2020-05-28T20:47:40.1234231Z Download action repository 'actions/upload-artifact@v2'
2020-05-28T20:47:41.1871363Z ##[group]Run expr 118336741_$(date +%s) > build-id.txt
2020-05-28T20:47:41.1871609Z expr 118336741_$(date +%s) > build-id.txt
2020-05-28T20:47:41.1909153Z shell: /bin/bash -e {0}
2020-05-28T20:47:41.1909398Z ##[endgroup]
2020-05-28T20:47:41.2076591Z ##[group]Run actions/upload-artifact@v2
2020-05-28T20:47:41.2076774Z with:
2020-05-28T20:47:41.2076925Z name: build-id
2020-05-28T20:47:41.2077060Z path: build-id.txt
2020-05-28T20:47:41.2077202Z ##[endgroup]
2020-05-28T20:47:42.6328016Z With the provided path, there will be 1 files uploaded
2020-05-28T20:47:42.6328921Z Total size of all the files uploaded is 21 bytes
2020-05-28T20:47:42.6330481Z Finished uploading artifact build-id. Reported size is 21 bytes. There were 0 items that failed to upload
2020-05-28T20:47:42.6330828Z Artifact build-id has been successfully uploaded!
And here is a log from the download:
2020-05-28T20:48:00.2206703Z ##[group]Run actions/download-artifact@v2
2020-05-28T20:48:00.2206993Z with:
2020-05-28T20:48:00.2207134Z name: build-id
2020-05-28T20:48:00.2207271Z ##[endgroup]
2020-05-28T20:48:00.6146654Z ##[error]Unable to find any artifacts for the associated workflow
Here is a simplified version of the workflow.
on:
  push:
jobs:
  build_id:
    runs-on: ubuntu-latest
    steps:
      - name: Generate build id
        run: expr ${{ github.run_id }}_$(date +%s) > build-id.txt
      - name: Upload build-id
        uses: actions/upload-artifact@v2
        with:
          name: build-id
          path: build-id.txt
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        containers: [1, 2, 3, 4]
    steps:
      - name: Download build-id
        uses: actions/download-artifact@v2
        with:
          name: build-id
          name: build-id # this was added only after upgrading to v2 of download-artifact
Hi, I am attempting to use the download/upload actions to share data between 2 jobs in a workflow. Both jobs are triggered based on a user-entered pull-request comment. The first job uploads 2 files (successfully), but the second job fails to download the 2 files.
This workflow runs in a Docker image using the GitHub-hosted runners.
Here is the failed run output:
I've been iterating on this with no luck. Here is the workflow file:
kops-apply-changes:
  name: Kops Apply Changes
  runs-on: ubuntu-latest
  if: github.event.comment.body == '/kops-apply-updates'
  steps:
    - uses: actions/checkout@v2
    - name: Apply Kops Changes
      id: kops_changes_update_apply
      uses: ./github-actions/kops-update-changes-action
      env:
        GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
      with:
        githubToken: ${{ secrets.GITHUB_TOKEN }}
        kops_update_action: 'update-apply'
    - run: echo ${{ steps.kops_changes_update_apply.outputs.cluster_name }} > kops_cluster.out
    - run: echo ${{ steps.kops_changes_update_apply.outputs.kops_state }} > kops_state.out
    - uses: actions/upload-artifact@v1
      with:
        name: cluster_name
        path: kops_cluster.out
    - uses: actions/upload-artifact@v1
      with:
        name: kops_state
        path: kops_state.out
rolling-update:
  name: Kops Apply Rolling Updates
  runs-on: ubuntu-latest
  if: github.event.comment.body == '/kops-apply-rolling-updates'
  steps:
    - uses: actions/checkout@v2
    - uses: actions/download-artifact@v1
      with:
        name: cluster_name
        path: ${{ github.workspace }}/kops_cluster.out
    - uses: actions/download-artifact@v1
      with:
        name: kops_state
        path: ${{ github.workspace }}/kops_state.out
    - shell: bash
      run: |
        cluster_name_value=$(cat ${{ github.workspace }}/kops_cluster.out)
        kops_state_value=$(cat ${{ github.workspace }}/kops_state.out)
      env:
        KOPS_CLUSTER: cluster_name_value
        KOPS_STATE: kops_state_value
    - name: Apply Kops Rolling Update
      id: kops_changes_update_apply
      uses: ./github-actions/kops-update-changes-action
      env:
        GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
      with:
        githubToken: ${{ secrets.GITHUB_TOKEN }}
        kops_update_action: 'rolling-update-apply'
        cluster_name: cluster_name_value
        kops_state: kops_state_value
Any ideas/thoughts are welcome.
The upload-artifact action supports tilde expansion, i.e. it substitutes ~ with $HOME. The download-artifact action, however, does not seem to support it. This is a bit surprising, because I would expect that using the same path for upload and download would restore the file to the same location.
I was having trouble understanding the path option and suspected the example was incorrect, so I tried running the example code from upload-artifact and download-artifact in a workflow. The example for the path option fails with an error:
cat: path/to/artifact: Is a directory
Here is the example from the readme:
steps:
  - uses: actions/checkout@v1
  - uses: actions/download-artifact@v1
    with:
      name: my-artifact
      path: path/to/artifact
  - run: cat path/to/artifact
Assuming the example is using the artifact from the upload-artifact example (which should probably be stated explicitly in the readme), I think the run step should be:
- run: cat path/to/artifact/world.txt
This bug caused a lot of confusion for me, since I assumed the downloaded artifact would be stored in the folder path/to and the file in the artifact named artifact instead of the original filename.
My workflow has two jobs that run in parallel. Job A, the artifact uploader, takes about ~40 seconds before it gets to the actions/upload-artifact step. Job B, the artifact downloader, takes about ~20 seconds before it gets to the actions/download-artifact step. It fails immediately with this error:
Downloading artifact 'frontend' to: '/home/runner/work/sshst/sshst/pubsite/build'
##[error]An Artifact with name "frontend" was not found.
##[error]Exit code 1 returned from process: file name '/home/runner/runners/2.163.1/bin/Runner.PluginHost', arguments 'action "GitHub.Runner.Plugins.Artifact.DownloadArtifact, Runner.Plugins"'.
If I put a step with sleep 60 before the download step, it works great! My initial assumption was that the actions/download-artifact action would wait until the corresponding actions/upload-artifact step succeeds/fails before attempting to download.
Is this behaving as intended? Do you see value in adding that functionality? Alternatively, is there a better workaround than sleep?
Thanks!
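For what it's worth, the usual way to guarantee the upload has finished before the download starts is to serialize the jobs with needs rather than sleep. A sketch, with job and artifact names borrowed from the log above:

```yaml
jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/upload-artifact@v2
        with:
          name: frontend
          path: pubsite/build
  download:
    needs: upload   # blocks until the uploading job has completed
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: frontend
          path: pubsite/build
```

This trades some parallelism for correctness; the feature request above (blocking on a single artifact rather than a whole job) would avoid that trade-off.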
Here's a job that errored out when this action hit HTTP 500: https://github.com/ansible/pylibssh/runs/629834790?check_suite_focus=true#step:5:61
How about implementing a sort of exponential backoff retry strategy to possibly make the download process more robust?
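The backoff idea could look something like the following generic sketch, where the command passed to `retry` is whatever performs the download (the function name and structure are illustrative, not part of the action):

```shell
# Retry a command up to $1 times, doubling the delay between attempts.
retry() {
  max_attempts=$1; shift
  delay=1
  attempt=1
  while true; do
    "$@" && return 0                               # success: stop retrying
    [ "$attempt" -ge "$max_attempts" ] && return 1 # out of attempts: give up
    sleep "$delay"
    delay=$((delay * 2))                           # exponential backoff: 1s, 2s, 4s, ...
    attempt=$((attempt + 1))
  done
}

retry 3 true && echo "succeeded"   # prints "succeeded"
```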
I would like to download artifacts from other repositories, is this possible?
Ideally I would like to get the artifacts of the last finished build of a branch in another repository.
Hello! I have this repo with the following ci.yml:
name: CI
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Mono Build
        run: |
          chmod +x build.sh && ./build.sh
      - name: Upload Artifacts
        uses: actions/upload-artifact@v2
        with:
          name: library
          path: bin/autodeploytonugettest.dll
          retention-days: 1
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Markdown Check
        run: |
          sudo chown -R $(whoami) /usr/local/bin /usr/local/lib /usr/local/include /usr/local/share
          npm install -g markdownlint-cli
          markdownlint *.md
      - name: Shell Check
        if: always()
        run: |
          sudo apt-get install shellcheck
          shellcheck *.sh
      - name: Download Artifacts
        uses: actions/download-artifact@v2
        if: always()
        with:
          name: library
          path: bin/autodeploytonugettest.dll
      - name: CSharp Check
        if: always()
        run: |
          sudo apt update
          sudo apt-get install gendarme
          chmod +r bin/
          cd bin/
          chmod +r autodeploytonugettest.dll
          gendarme -- autodeploytonugettest.dll
Everything works well except the last command, gendarme -- autodeploytonugettest.dll, which fails with:
An uncaught exception occured. Please fill a bug report at https://bugzilla.novell.com/
Stack trace: System.UnauthorizedAccessException: Access to the path '/home/runner/work/CSharp---Exercise---Other---Auto-deploy-to-NuGet/CSharp---Exercise---Other---Auto-deploy-to-NuGet/bin/autodeploytonugettest.dll' is denied.
at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share, System.Int32 bufferSize, System.Boolean anonymous, System.IO.FileOptions options) [0x000e0] in <533173d24dae460899d2b10975534bb0>:0
at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <533173d24dae460899d2b10975534bb0>:0
at (wrapper remoting-invoke-with-check) System.IO.FileStream..ctor(string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
at Mono.Cecil.ModuleDefinition.GetFileStream (System.String fileName, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x0001c] in <c2b12ab5b3544f9088836118a10c93c9>:0
at Mono.Cecil.ModuleDefinition.ReadModule (System.String fileName, Mono.Cecil.ReaderParameters parameters) [0x00000] in <c2b12ab5b3544f9088836118a10c93c9>:0
at Mono.Cecil.AssemblyDefinition.ReadAssembly (System.String fileName, Mono.Cecil.ReaderParameters parameters) [0x00000] in <c2b12ab5b3544f9088836118a10c93c9>:0
at Gendarme.ConsoleRunner.AddAssembly (System.String filename) [0x0001b] in <e8e05298cccb4938b5f01696042669b0>:0
at Gendarme.ConsoleRunner.AddFiles (System.String name) [0x000cf] in <e8e05298cccb4938b5f01696042669b0>:0
at Gendarme.ConsoleRunner.Execute (System.String[] args) [0x000f7] in <e8e05298cccb4938b5f01696042669b0>:0
Error: Process completed with exit code 4.
Why does this happen even when I use sudo to run gendarme? How do I correctly set up permissions here?
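One possibility, assuming the downloaded file came back without read permission (or owned by another user from an earlier container step), is to repair ownership and mode bits before gendarme runs. A small demo of the mode-bit part; the file contents here are a stand-in:

```shell
# Simulate a downloaded file that lost its read permission.
mkdir -p bin
printf 'dll' > bin/autodeploytonugettest.dll
chmod a-r bin/autodeploytonugettest.dll

# The fix: re-grant read/write to the current user before running gendarme.
chmod -R u+rw bin
test -r bin/autodeploytonugettest.dll && echo "readable"
```

If the files are owned by root (common when a previous step ran inside a container), `sudo chown -R "$(whoami)" bin` may also be needed, since sudo on the gendarme command alone does not change what gendarme's child processes can open once it drops back to your user.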
Hi,
I asked GitHub developer support about caching files between builds and they suggested using artifacts. Uploading the cache worked like a charm, but trying to download it again results in an error that doesn't contain a lot of information about what went wrong.
Run actions/download-artifact@master
with:
name: gradle-cache
Download artifact 'gradle-cache' to: '/home/runner/work/xxx/xxx/gradle-cache'
##[error]TF400898: An Internal Error Occurred.
##[error]Exit code 1 returned from process: file name '/home/runner/runners/2.157.0/bin/Runner.PluginHost', arguments 'action "GitHub.Runner.Plugins.Artifact.DownloadArtifact, Runner.Plugins"'.
I use upload-artifact in several jobs to upload various files. Example:
...
- name: uploadapp
  uses: actions/upload-artifact@v2
  with:
    name: app
    path: js/app.js
...
In the final «package» job (which depends on previous jobs as appropriate) I need all these files to be put in the same places they were taken from. So, continuing with the example, I want app.js to be found under js and not, for example, under app/js (as would happen if I used download-artifact without arguments). In other words, the result of running the upload-artifact and download-artifact actions across all jobs should be the same as if only a single job was run, with all steps from all jobs performed in sequence but upload-artifact and download-artifact omitted.
I have many artifacts and they reside in different places. Creating many steps invoking download-artifact with paths wired in by hand is tedious and will lead to errors; at worst, some artifacts may even be forgotten altogether. I would like to use the option of downloading all artifacts in one step, but one that also puts them back where they were taken from.
How can I achieve this? Possibly a feature or some explanatory documentation can be added?
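One pattern that may help, assuming v2 of both actions: upload everything as a single artifact with a multi-line path, which preserves each file's directory relative to the paths' common ancestor, then download that one artifact with path: . to restore the layout. A sketch (the second file name is hypothetical):

```yaml
- uses: actions/upload-artifact@v2
  with:
    name: everything
    path: |
      js/app.js
      css/site.css

# in the final «package» job:
- uses: actions/download-artifact@v2
  with:
    name: everything
    path: .   # js/app.js and css/site.css reappear in place
```

This only works when one job produces all the files; across many producing jobs you would still need one artifact per job, so a pattern-based "download all, in place" option would be genuinely useful.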
How do you download an artifact from another workflow?
A little bit of background: we have a build workflow that builds an artifact that we publish. Then we have a separate workflow, triggered on deployment, that needs to access this artifact. I tried to just use the same name for the artifact in download-artifact and upload-artifact, hoping that it would download the files, but that doesn't seem to work.
Hi! There is an error after upgrading to actions-runner-linux-arm-2.272.0, at the actions/download-artifact@v2 stage:
Total file count: 2 ---- Processed file #0 (0.0%)
... (the same progress line repeats with no further progress)
events.js:187
throw er; // Unhandled 'error' event
^
Error: unexpected end of file
at Zlib.zlibOnError [as onerror] (zlib.js:170:17)
Emitted 'error' event on Gunzip instance at:
at Zlib.zlibOnError [as onerror] (zlib.js:173:8) {
errno: -5,
code: 'Z_BUF_ERROR'
}
We get the following permission denied error when trying to download the artifact from a previous job:
Run actions/download-artifact@v2
with:
name: saucectlbin
/usr/bin/docker exec b3253aa2b26618a0d6b4b42a2899d7db08193dcc7a46c30efcd8dc18ad1f351f sh -c "cat /etc/*release | grep ^ID"
Starting download for saucectlbin
Directory structure has been setup for the artifact
Total number of files that will be downloaded: 1
events.js:187
throw er; // Unhandled 'error' event
^
Error: EACCES: permission denied, open '/__w/saucectl/saucectl/saucectl'
Emitted 'error' event on WriteStream instance at:
at internal/fs/streams.js:294:12
at FSReqCallback.oncomplete (fs.js:146:23) {
errno: -13,
code: 'EACCES',
syscall: 'open',
path: '/__w/saucectl/saucectl/saucectl'
}
Here's the job definition:
puppeteer:
needs: basic
runs-on: ubuntu-latest
container:
image: saucelabs/stt-puppeteer-jest-node:v0.2.0
steps:
# appears that checkout@v2 uses javascript which is not compatible
# with the included node version in the container image.
- name: Checkout Code
uses: actions/checkout@v1
- name: Download saucectl binary
uses: actions/download-artifact@v2
with:
name: saucectlbin
- name: Workaround for container permissions
run: |
sudo chown -R $USER:$(id -gn $USER) /github/home
- name: Run Sauce Pipeline Test
run: |
./saucectl run -c ./.sauce/puppeteer.yml
env:
BUILD_ID: ${{ github.run_id }}
Link to pipeline: https://github.com/saucelabs/saucectl/runs/1004528103?check_suite_focus=true
Is there a workaround available? Is our setup wrong somewhere?
What is the restriction on downloading artifacts: are they accessible across builds within a repository, within a branch, within a particular workflow (a single YAML file), or only within a single job? What is the rationale for limiting the context in which an artifact can be obtained? I saw mention of this, but not a really clear explanation.
Specifically, "Note: You can only download artifacts in a workflow that were uploaded during the same workflow run." is mentioned in the documentation page Storing workflow data as artifacts. What does "during the same workflow run" mean?
My use-case is that I have a heavy dependency code that I need to compile before I can build my main code. That dependency code (specifically, DAKOTA, https://dakota.sandia.gov) is not hosted on Github, and does not provide convenient binaries for the platform I am interested in (specifically, Ubuntu). I thought of building DAKOTA using a separate manually-triggered workflow, but that is not a supported use-case.
I will have to examine other approaches, I guess.
We've been having intermittent issues uploading our coverage reports from our GitHub Actions workflow. These issues show up as 403s, which cause really, really long jobs. So that this does not block steps that come after test coverage, we are moving the coverage upload to a final step after all others.
The issue we have is that coverage is not always generated. In those cases, download artifact fails that final step and the entire workflow is marked as failed.
Our flow:
a) check if any packages have changes since last run
b) if changed, run tests
c) upload coverage (if no test are run, no coverage exists but the uploader does gracefully exit)
d) all other steps
e) attempt to download coverage (fails if no coverage artifact is found)
Is there a way to gracefully skip the step if the artifact hasn't been uploaded, since in some cases this is not a failure from our standpoint?
Ninja edit: I did see the continue-on-error option, but the overall workflow is still shown with a red 'x', so at a glance it appears to consumers that our entire release is failing.
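For reference, the continue-on-error workaround looks like this (step and artifact names are illustrative); it lets the job carry on but, as noted, still flags the step:

```yaml
- name: Download coverage if present
  uses: actions/download-artifact@v2
  continue-on-error: true   # job continues, but the step is still marked as failed
  with:
    name: coverage
```

An alternative is to query the run's artifacts via the REST API in a preceding step and set an output flag, so the download step only runs when the artifact actually exists.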
The v2-preview of download-artifact is out and we need your help!
You can try it out by using actions/download-artifact@v2-preview
Any associated code and documentation can be found here: https://github.com/actions/download-artifact/tree/v2-preview
This issue is for general feedback and to report any bugs/issues encountered during the preview.
There is also a v2-preview of upload-artifact; see actions/upload-artifact#62.
Warning: At any time during the preview, there may be unannounced changes that can cause things to break. It is recommended not to use the preview of this action in critical workflows
The v1 versions of download-artifact (and upload-artifact) are plugins that are executed by the runner. The code for v1 can be found here: https://github.com/actions/runner/tree/master/src/Runner.Plugins/Artifact
The v1 code is written in C# and is tightly coupled to the runner; it also uses special APIs that only the runner can use to interact with artifacts. If any changes or updates had to be made related to artifacts, they had to be done on the runner, and a new release had to roll out, which would take a significant amount of time. With v2 there is no dependency on the runner, so it will be much easier and faster to make changes and accept community contributions (until now that was pretty much impossible).
The v2-preview of download-artifact has been rewritten from scratch in TypeScript with a new set of APIs that allow it to interact with artifacts (previously only the runner could do this). There is a new NPM package called @actions/artifact that contains the core functionality for interacting with artifacts (which this action uses for the most part). This NPM package is hosted in the actions/toolkit repo, so anyone can use it to interact with artifacts when developing actions. You can find the package here (lots of documentation and extra info):
https://www.npmjs.com/package/@actions/artifact
https://github.com/actions/toolkit/tree/master/packages/artifact
Since v2-preview is effectively a total rewrite of v1, there is a huge potential for bugs, so it needs to be tested thoroughly before creating an actual v2 release. We need help testing the core functionality.
There will be no new features added as part of the v2-preview; we need to test the core functionality first. Once a v2 release is out, we can start chipping away at the issues/features that we have been unable to address with v1 🙂 For the moment, please don't submit PRs for any new features to the v2-preview branch.
How do we download an artifact from another repository?
Or even another branch? Say I'm on a branch and would like to pull an artifact from main to compare/test against
In GitLab:
needs:
  - project: ext_project\proj_name
    job: build
    ref: master
    artifacts: true
In GitHub, the only option seems to be downloading from within the existing workflow.
- uses: actions/download-artifact@v2
  with:
    name: my-artifact
    path: path/to/artifact
I see this API: https://docs.github.com/en/rest/reference/actions#list-artifacts-for-a-repository
But I'm not sure if it's possible to list by git ref (master, main, a branch name, etc.), or whether it just shows all artifacts.
During the beta of GitHub Actions, a lot of documentation (including the README for this action) showed example YAML referencing master:
uses: actions/download-artifact@master
We currently encourage users not to use master as a reference whenever an action is being used. Instead, one of the available tags should be used, such as:
uses: actions/download-artifact@v1
Tags are used alongside semantic versioning to provide a stable experience: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-actions#versioning-your-action
The master branch can abruptly change without notice during development which can cause workflows to unexpectedly fail. If you want to avoid unexpected changes or failures, you should use a tag when referring to a specific version of an action. Tags are added to stable versions that have undergone significant testing and should not change.
The v2 versions of download-artifact and upload-artifact are currently in development. Expect changes to start showing up in a releases/v2-beta branch around mid-to-late January. These new changes will eventually be merged into master (we will communicate about this in the future), and that has the potential to break your workflows if you are using @master, so you will have to react by updating your YAML to use a tag.
Our telemetry indicates that a significant number of users are using @master. Good practice for all actions is to use a tag instead of referencing @master.
I'm using GitHub Actions to build Docker images. Until now it was just alpine images, but I also want to create Debian-based images as well. Those images are significantly bigger, and as such it takes longer to upload them as an artifact. These images are around 4GB in size, and at this point the download has a 100% fail rate.
Link to the logs: https://github.com/WyriHaximusNet/docker-php/runs/973243081?check_suite_focus=true#step:4:217
Raw logs:
2020-08-11T21:39:57.0777951Z shell: /bin/bash -e {0}
2020-08-11T21:39:57.0778133Z env:
2020-08-11T21:39:57.0778273Z DOCKER_IMAGE: wyrihaximusnet/php
2020-08-11T21:39:57.0778401Z DOCKER_BUILDKIT: 1
2020-08-11T21:39:57.0778533Z ##[endgroup]
2020-08-11T21:39:57.1053333Z % Total % Received % Xferd Average Speed Time Time Time Current
2020-08-11T21:39:57.1055891Z Dload Upload Total Spent Left Speed
2020-08-11T21:39:57.1058114Z
2020-08-11T21:39:57.1339680Z 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
2020-08-11T21:39:57.1342375Z 100 636 100 636 0 0 28909 0 --:--:-- --:--:-- --:--:-- 28909
2020-08-11T21:39:57.2466546Z
2020-08-11T21:39:57.2471653Z 100 9631k 100 9631k 0 0 66.7M 0 --:--:-- --:--:-- --:--:-- 66.7M
2020-08-11T21:39:57.2644972Z ##[group]Run actions/download-artifact@v2
2020-08-11T21:39:57.2645158Z with:
2020-08-11T21:39:57.2645277Z name: docker-image-zts-zts-7.2-debian-buster-buster
2020-08-11T21:39:57.2645392Z path: ./docker-image
2020-08-11T21:39:57.2645629Z env:
2020-08-11T21:39:57.2645739Z DOCKER_IMAGE: wyrihaximusnet/php
2020-08-11T21:39:57.2646008Z DOCKER_BUILDKIT: 1
2020-08-11T21:39:57.2646096Z ##[endgroup]
2020-08-11T21:39:57.3162022Z Starting download for docker-image-zts-zts-7.2-debian-buster-buster
2020-08-11T21:39:57.4899696Z Directory structure has been setup for the artifact
2020-08-11T21:39:57.4902004Z Total number of files that will be downloaded: 2
2020-08-11T21:39:58.5764113Z Total file count: 2 ---- Processed file #1 (50.0%)
[... the same progress line repeats roughly once per second for the next three minutes ...]
2020-08-11T21:43:14.6349482Z Total file count: 2 ---- Processed file #1 (50.0%)
2020-08-11T21:43:15.3322944Z events.js:187
2020-08-11T21:43:15.3323908Z throw er; // Unhandled 'error' event
2020-08-11T21:43:15.3324079Z ^
2020-08-11T21:43:15.3324122Z
2020-08-11T21:43:15.3324563Z Error: unexpected end of file
2020-08-11T21:43:15.3324845Z at Zlib.zlibOnError [as onerror] (zlib.js:170:17)
2020-08-11T21:43:15.3325319Z Emitted 'error' event on Gunzip instance at:
2020-08-11T21:43:15.3325605Z at Zlib.zlibOnError [as onerror] (zlib.js:173:8) {
2020-08-11T21:43:15.3325824Z errno: -5,
2020-08-11T21:43:15.3326024Z code: 'Z_BUF_ERROR'
2020-08-11T21:43:15.3326106Z }
2020-08-11T21:43:15.3441578Z ##[group]Run exit 0
2020-08-11T21:43:15.3441749Z exit 0
The baseline behavior of the zip utility on Linux and macOS is to retain file permissions. However, when the download-artifact action zips a directory, those permissions are lost. With the permissions broken and missing from the resulting asset zipfile, the artifacts are subsequently broken for users and for downstream automated scripts.
Yes, one option is to write a "github-download-permission-fixer" script that users must run after extracting the assets. However, that could be a lot of work to write and maintain, and hard to justify when it merely works around broken behavior in GitHub. Likewise, if you're dealing with multiple platforms (Windows, macOS, Linux), the script needs to be cross-platform and conform to the lowest common denominator available on all of them (such as bash 3.x on macOS). The alternative is to ask users to install prerequisites (such as bash 4.x) before they can run the permission-fixer script.
Expected behavior: permissions applied by prior workflow steps to the asset files and directories should not be dropped or ignored by the download-artifact zipper, and should be present in the resulting asset zip file.
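A common community workaround, until permissions are preserved natively, is to pack the files with tar before uploading and unpack after downloading, since tar retains permission bits through the artifact round-trip. A sketch (the artifact and path names are illustrative):

```yaml
# In the producing job: pack with tar so permission bits survive
- name: Pack artifact
  run: tar -cvf dist.tar dist
- name: Upload
  uses: actions/upload-artifact@v2
  with:
    name: dist-tar
    path: dist.tar
# In the consuming job: download and unpack, restoring the permissions
- name: Download
  uses: actions/download-artifact@v2
  with:
    name: dist-tar
- name: Unpack artifact
  run: tar -xvf dist.tar
```

The zip round-trip inside the action never sees the individual files this way, so the executable bits set by earlier steps come back intact.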
Hi @tarcila, the bug is caused by two folders whose names differ only in their casing:
/installed/x64-linux/share/Catch2
/installed/x64-linux/share/catch2
It seems the download-artifact action creates only one of the folders, treating the other as the same path. We're looking into how best to fix this issue.
Originally posted by @lkillgore in #12 (comment)
In one of my workflows, one job builds an artifact and uploads it using actions/upload-artifact, then a dependent job downloads it using actions/download-artifact. Most of the time this works well enough. However, I've seen multiple spurious 404 errors such as the one pasted below. Re-running the workflow typically succeeds.
Starting download for build-artifacts
Directory structure has been setup for the artifact
Total number of files that will be downloaded: 156
##### Begin Diagnostic HTTP information #####
Status Code: 404
Status Message: Not Found
Header Information: {
"cache-control": "no-store,no-cache",
"pragma": "no-cache",
"content-length": "348",
"content-type": "application/json; charset=utf-8",
"strict-transport-security": "max-age=2592000",
"x-tfs-processid": "af0933a7-c1b0-4385-8a55-6aac682d6c86",
"activityid": "b842aaf9-4b84-4c10-8352-6988ba2c577e",
"x-tfs-session": "b842aaf9-4b84-4c10-8352-6988ba2c577e",
"x-vss-e2eid": "b842aaf9-4b84-4c10-8352-6988ba2c577e",
"x-vss-senderdeploymentid": "13a19993-c6bc-326c-afb4-32c5519f46f0",
"x-frame-options": "SAMEORIGIN",
"x-msedge-ref": "Ref A: 24E9DF59C6DF41AEB0B4F4ADD82669E6 Ref B: BLUEDGE0213 Ref C: 2020-10-29T15:55:07Z",
"date": "Thu, 29 Oct 2020 15:55:07 GMT"
}
###### End Diagnostic HTTP information ######
Error: Unable to download the artifact: Error: Unexpected http 404 during download for https://pipelines.actions.githubusercontent.com/5VR8pVAJWOch570pQRgQ7EctEDmSm2LFZLvggBFEZdahLnzvWU/_apis/resources/Containers/3686237?itemPath=build-artifacts%2Fservice%2Fapi-service.zip
Example failing run: https://github.com/microsoft/onefuzz/pull/229/checks?check_run_id=1327596964
Support a flag to create the destination directory.
For comparison, Info-ZIP 6.0's unzip -d
creates the specified directory if it does not already exist.
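In the meantime, a workaround is to create the directory in a separate step and point the action's path input at it. A sketch (the artifact and directory names are illustrative):

```yaml
- name: Create destination directory
  run: mkdir -p dest
- name: Download into it
  uses: actions/download-artifact@v2
  with:
    name: my-artifact
    path: dest
```

This mirrors what an `unzip -d`-style flag would do automatically.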
Could we download multiple folders at once? Something like:
- name: Download results
  continue-on-error: true
  uses: actions/download-artifact@v1
  with:
    name: functional_test_docker_ubuntu
    name: barmetal_test_docker_ubuntu
Or do I have to repeat this block if I want to download 8 folders?
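Until a multi-name input exists, two workarounds are possible with v2: omit name entirely to download every artifact into its own sub-directory, or fan the step out with a matrix. A sketch (job and artifact names are illustrative):

```yaml
jobs:
  # Option 1: omit "name" (v2 only) to download every artifact,
  # each into its own sub-directory under "results"
  fetch-all:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          path: results

  # Option 2: fan out with a matrix, one parallel job per artifact name
  fetch-some:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        artifact: [functional_test_docker_ubuntu, barmetal_test_docker_ubuntu]
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: ${{ matrix.artifact }}
```

Option 1 is the simplest when you want everything; option 2 avoids copy-pasting eight near-identical steps.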
Rarely and seemingly at random, this action throws ETIMEDOUT and crashes the entire build. So far this has happened exactly once, here: https://github.com/BlockProject3D/Framework/pull/95/checks?check_run_id=912239366.
For now the only affected OS is macOS.
Run actions/download-artifact@v2
1m 16s
##[error]connect ETIMEDOUT 13.107.42.16:443
Run actions/download-artifact@v2
Starting download for Snapshot-macos-latest-Debug-x64
##[error]connect ETIMEDOUT 13.107.42.16:443
In my pipelines I see network-related artifact download failures from time to time. They are not very frequent, but they do happen.
Here's an example of the problem I see (you can see the logs here:
Starting download for ispc_llvm11_linux
##### Begin Diagnostic HTTP information #####
Status Code: 503
Status Message: Service Unavailable
Header Information: {
"cache-control": "no-store",
"content-length": "228",
"content-type": "text/html",
"server": "Microsoft-IIS/10.0",
"x-msedge-ref": "Ref A: F6F0AF70C73C4DA5BA5EF5FA16A46967 Ref B: CO1EDGE0910 Ref C: 2020-12-07T08:15:57Z",
"date": "Mon, 07 Dec 2020 08:16:15 GMT",
"connection": "close"
}
###### End Diagnostic HTTP information ######
Error: Unable to get ContainersItems from https://pipelines.actions.githubusercontent.com/vRWS246tQliPVeJkfd7cOfXg6DKlc5d3gGpSXpb3WtbhcjV7Xt/_apis/resources/Containers/8374578?itemPath=ispc_llvm11_linux
The artifact does exist, and other jobs are able to download it. The most logical way to fight this kind of failure is to retry after a delay; something like 3 retries with a 15-second pause between attempts sounds reasonable here.
Could you please support retries?
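Until retries are built in, a crude two-attempt fallback can be expressed with step outcomes, which workflow expressions support. A sketch (the artifact name is taken from the log above; the 15-second pause is illustrative):

```yaml
- name: Download artifact (first attempt)
  id: dl1
  continue-on-error: true      # do not fail the job yet
  uses: actions/download-artifact@v2
  with:
    name: ispc_llvm11_linux
- name: Wait before retrying
  if: steps.dl1.outcome == 'failure'
  run: sleep 15
- name: Download artifact (retry)
  if: steps.dl1.outcome == 'failure'
  uses: actions/download-artifact@v2
  with:
    name: ispc_llvm11_linux
```

Ugly compared to a built-in retry input, but it turns a transient 503 into a recoverable step.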
Support for the newly introduced GitHub Actions event "workflow_run" seems broken.
This repository is a playground for the "workflow_run" GitHub event. For example, if your
pull_request
workflow generates build artifacts,
you can create a new workflow that uses workflow_run
to analyze the
results and add a comment to the original pull request.
The goal of this playground is to see whether downloading and processing
artifacts in a "workflow_run" event works as expected, i.e. whether,
in a "workflow_run"-triggered run,
I can download artifacts from the workflow that fired the "workflow_run" trigger.
The "download-artifact" action seems to be broken on the "workflow_run" event.
Similar to #11, but I would like to be able to see the paths to the downloaded files in the log. Perhaps the output variable implemented for #11 would show up in the log and satisfy this request. My issue is that I regularly get the path parameter wrong and end up with situations like the one below, where the file is not found in a subsequent step; I then have to add a tmate action or other debug commands, or just start guessing...
Example:
https://github.com/oleg-andreyev/MinkPhpWebDriver/runs/1606431104
vs
https://github.com/oleg-andreyev/MinkPhpWebDriver/runs/1606431104?check_suite_focus=true
Notice that the "Artifacts" section is missing.
The only difference is check_suite_focus
; this makes downloading artifacts quite inconsistent.
One of the really useful things about uploading artifacts is preserving logs, coverage reports, et cetera. Those often have timestamps or UUIDs in their names.
You can hack around it with a compress-to-a-known-name step, but uploading and downloading artifacts by wildcard (or even just download-all) would be a huge benefit for post-processing, e.g. coverage merging.
Given that 'path' is an optional input (it may come from the upload action), it would be nice if this action provided a 'path' output. This would reduce duplication of data in subsequent steps, which have no knowledge of the inputs passed to the step containing the upload action.
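In the meantime, duplication can be reduced by defining the path once as a workflow-level environment variable and referencing it from both the download step and later steps. A sketch (the job, artifact, and directory names are illustrative):

```yaml
env:
  ARTIFACT_PATH: build-output   # defined once, reused below

jobs:
  consume:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: my-artifact
          path: ${{ env.ARTIFACT_PATH }}
      - name: Use the downloaded files
        run: ls "$ARTIFACT_PATH"
```

A real 'path' output would make even this single definition unnecessary.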
Hi,
I am discovering gh actions, and am running into an issue with artifacts. When trying to download a specific artifact, it fails with the following message:
##[warning]Fail to download 'test/installed/x64-linux/share/Catch2/gdbinit', error: Could not find a part of the path '/home/runner/work/gha-download-bug/gha-download-bug/test/installed/x64-linux/share/Catch2/gdbinit'. (Downloader 1)
##[warning]Back off 25.301 seconds before retry. (Downloader 1)
##[warning]Fail to download 'test/installed/x64-linux/share/Catch2/lldbinit', error: Could not find a part of the path '/home/runner/work/gha-download-bug/gha-download-bug/test/installed/x64-linux/share/Catch2/lldbinit'. (Downloader 0)
##[warning]Back off 22.004 seconds before retry. (Downloader 0)
Not sure what makes this artifact problematic. For a reproducer, see:
For example:
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #1929 (3858.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #1981 (3001.5%)
Run actions/download-artifact@v2
with:
path: artifacts
No artifact name specified, downloading all artifacts
Creating an extra directory for each artifact that is being downloaded
Total number of files that will be downloaded: 6
Total number of files that will be downloaded: 50
Total file count: 50 ---- Processed file #25 (50.0%)
Total file count: 50 ---- Processed file #51 (102.0%)
Total number of files that will be downloaded: 66
Total file count: 66 ---- Processed file #82 (124.2%)
Total file count: 66 ---- Processed file #108 (163.6%)
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #149 (413.8%)
[... the pattern repeats for each remaining group of artifacts, with the cumulative "Processed file" counter climbing steadily past 2000 files ...]
Total number of files that will be downloaded: 36
Total file count: 36 ---- Processed file #2363 (6563.8%)
There were 60 artifacts downloaded
Example IRL: https://github.com/TWiStErRob/net.twisterrob.gradle/pull/110/checks?check_run_id=2595895174
I am playing around with uploading and downloading artifacts, and so far I have not managed to perform a download, though the upload seems to succeed. It's very likely just me using it incorrectly, but the error generated is not very clear, to say the least.
Here is one version of my workflow file that causes this (as you might guess from the commented-out parts, I have been trying a few different things):
name: ASP.NET Core CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 2.2.108
      - name: Build with dotnet
        run: dotnet build --configuration Release
      - name: Test with dotnet
        run: dotnet test -c Release --no-build
      - name: Publish with dotnet
        run: dotnet publish -c Release --no-build -o output
      - name: Upload
        uses: actions/upload-artifact@master
        with:
          name: output
          path: output
  process:
    runs-on: windows-latest
    steps:
      - name: Mkdir
        run: mkdir output
      - name: Download
        uses: actions/download-artifact@master
        with:
          name: output
          # path: output
      # - name: List download
      #   run: dir ./output
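One likely cause, assuming standard GitHub Actions behavior (this is a guess, not a confirmed diagnosis): the `process` job never declares `needs: build`, so the two jobs can run in parallel and the download may start before the artifact has been uploaded. A sketch of the `process` job with that dependency and an explicit download path added:

```yaml
# Sketch of a possible fix (assumes the same job and artifact names as above):
process:
  runs-on: windows-latest
  needs: build                 # wait for the build job so the artifact exists
  steps:
    - name: Download
      uses: actions/download-artifact@master
      with:
        name: output
        path: output           # download into ./output explicitly
    - name: List download
      run: dir ./output
```

With `needs: build` in place, the separate `mkdir output` step should also be unnecessary, since the download step creates the target directory itself.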
The link in the issue template to the GitHub Community Forum's Actions board (https://github.community/t5/GitHub-Actions/bd-p/actions) is not reachable for regular users like me. Maybe it changed to https://github.community/c/github-actions/41?
(I was about to open an issue here, but I finally opened a question there instead; I hope to understand what I might be missing.)