
benchmarks's Introduction

Coiled Benchmarks


A set of Dask benchmarks run daily at scale on Coiled clusters.

Test Locally (for developers)

The Coiled Benchmarks test suite can be run locally with the following steps:

  1. Ensure your local machine is authenticated to use the dask-benchmarks Coiled account and the Coiled Dask Engineering AWS S3 account.

  2. Create a new Python environment with mamba env create -n test-env -f ci/environment.yml. You can alternatively use different packages, e.g. if you are testing feature branches of dask or distributed. This test suite is configured to use Coiled's package_sync feature, so your local environment will be copied to the cluster.

  3. Activate the environment with conda activate test-env

  4. Some tests require additional dependencies:

    • For snowflake: mamba env update -f ci/environment-snowflake.yml
    • For non-dask TPCH: mamba env update -f ci/environment-tpch-nondask.yml

    Look at ci/environment-*.yml for more options.

  5. Upgrade dask to the git tip with mamba env update -f ci/environment-git-tip.yml

  6. Add test-specific packages with mamba env update -f ci/environment-test.yml

  7. Run tests with python -m pytest tests. You may instead consider running individual tests or categories of tests, e.g. python -m pytest tests -m shuffle_p2p. Look at setup.cfg for all available test markers.

Additionally, tests are automatically run on pull requests to this repository. See the section below on creating pull requests.

Benchmarking

The coiled-benchmarks test suite contains a series of pytest fixtures that enable benchmarking metrics to be collected and stored for historical and regression analysis. By default, these metrics are not collected and stored; collection can be enabled by including the --benchmark flag in your pytest invocation.

At a high level, here is how the benchmarking works:

  • Data from individual test runs are collected and stored in a local sqlite database. The schema for this database is stored in benchmark_schema.py
  • The local sqlite databases are appended to a global historical record, stored in S3.
  • The historical data can be analyzed using any of a number of tools. dashboard.py creates a set of static HTML documents showing historical data for the tests.
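
For example, you can inspect the local results directly with Python. This is a minimal sketch, assuming the test_run table and the name, dask_version, and average_memory columns used in the query example further down this README; check benchmark_schema.py for the authoritative schema:

import sqlite3

import pandas as pd

# Load local benchmark results into a DataFrame and summarize memory usage per test.
con = sqlite3.connect("benchmark.db")
df = pd.read_sql("SELECT name, dask_version, average_memory FROM test_run", con)
print(df.groupby("name").average_memory.describe())
con.close()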

Running the benchmarks locally

You can collect benchmarking data by running pytest with the --benchmark flag. This will create a local benchmark.db sqlite file in the root of the repository. If you run the test suite multiple times with benchmarking enabled, the data will be appended to the database.

You can compare with historical data by downloading the global database from S3 first:

aws s3 cp s3://coiled-runtime-ci/benchmarks/benchmark.db ./benchmark.db
pytest --benchmark

TPC-H Benchmarks

To get going with the TPC-H benchmarks, check out the README here.

Changing the benchmark schema

You can add, remove, or modify columns by editing the SQLAlchemy schema in benchmark_schema.py. However, if you have a database of historical data, the schemas of the new and old data will not match. To account for this, you must provide a migration for the data and commit it to the repository. We use alembic to manage SQLAlchemy migrations. In the simple case of adding or removing a column, you can do the following:

# First, edit the `benchmark_schema.py` and commit the changes.

alembic revision --autogenerate -m "Description of migration"
git add alembic/versions/name_of_new_migration.py
git commit -m "Added a new migration"

Migrations are automatically applied in the pytest runs, so you needn't run them yourself.

Deleting old test data

At times you might change a specific test in a way that makes older benchmarking data irrelevant. In that case, you can discard the old benchmarking data for that test by applying a data migration that removes it:

alembic revision -m "Declare bankruptcy for <test-name>"
# Edit the migration here to do what you want.
git add alembic/versions/name_of_new_migration.py
git commit -m "Bankruptcy migration for <test-name>"

An example of a migration that does this is here.
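
For reference, the body of such a data migration might look roughly like the sketch below. The revision identifiers are placeholders for the values alembic generates, and the test name is only an example:

from alembic import op

# Placeholder revision identifiers; `alembic revision` generates real ones.
revision = "0123456789ab"
down_revision = "ba9876543210"


def upgrade():
    # Drop all historical rows for the reworked test.
    op.execute("DELETE FROM test_run WHERE name = 'test_write_wide_data'")


def downgrade():
    # The deleted rows cannot be restored.
    pass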

Using the benchmark fixtures

We have a number of pytest fixtures defined which can be used to automatically track certain metrics in the benchmark database. The most relevant ones are summarized here:

benchmark_time: Record wall clock time duration.

benchmark_memory: Record memory usage.

benchmark_task_durations: Record time spent computing, transferring data, spilling to disk, and deserializing data.

get_cluster_info: Record cluster id, name, etc.

benchmark_all: Record all available metrics.

For more information on all available fixtures and examples on how to use them, please refer to their documentation.
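
As a rough example, a test requests a fixture in its signature and the corresponding metric is recorded when the suite runs with --benchmark. The workload below is illustrative and not taken from this repository, and the cluster/client setup used by the real tests is omitted:

import dask.array as da


def test_sum_wall_clock(benchmark_time):
    # benchmark_time records the wall clock duration of this test body.
    arr = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))
    arr.sum().compute()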

Writing a new benchmark fixture would generally look like:

  1. Requesting the test_run_benchmark fixture, which yields an ORM object.
  2. Doing whatever setup work you need.
  3. Yielding to the test.
  4. Collecting whatever information you need after the test is done.
  5. Setting the appropriate attributes on the ORM object.

The benchmark_time fixture provides a fairly simple example.
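
As a rough illustration of those steps, a custom fixture might look something like the following. The custom_metric attribute is hypothetical and would also need a matching column in benchmark_schema.py (plus a migration):

import time

import pytest


@pytest.fixture
def benchmark_custom_metric(test_run_benchmark):
    # 1. Request the test_run_benchmark fixture (yields an ORM object).
    # 2. Do whatever setup work you need.
    start = time.monotonic()
    # 3. Yield to the test.
    yield
    # 4. Collect whatever information you need after the test is done.
    elapsed = time.monotonic() - start
    # 5. Set the appropriate attributes on the ORM object.
    if test_run_benchmark is not None:  # assumption: may be None when --benchmark is off
        test_run_benchmark.custom_metric = elapsed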

Investigating performance regressions

It is not always obvious what the cause of an apparent performance regression is. It could be due to a new version of a direct or transitive dependency, or due to a change in the Coiled platform. But often it is due to a change in dask or distributed. If you suspect that is the case, Coiled's package_sync feature combines well with the benchmarking infrastructure here and git bisect.

The following is an example workflow which could be used to identify a specific commit in dask which introduced a performance regression. This workflow presumes you have two terminal windows open, one with the coiled-runtime test suite, and one with a dask repository with which to drive bisecting.

Create your software environment

You should create a software environment which can run this test suite, but with an editable install of dask. You can do this in any of a number of ways, but one approach could be:

mamba env create -n test-env --file ci/environment.yml  # Create a test environment
conda activate test-env  # Activate your test environment
mamba env update --file ci/environment-test.yml
(cd <your-dask-dir> && pip install -e .)  # Do an editable install for dask

Start bisecting

Let's say the current HEAD of dask is known to be bad, and $REF is known to be good. If you are looking at an upstream run where you have access to the static page, you can check the dates reported for each run and do a git log with the corresponding dates to get a list of commits to use in the bisecting process.

git log --since='2022-08-15 14:15' --until='2022-08-18 14:15' --pretty=oneline

In the terminal opened to your dask repository, you can initialize a bisect workflow with:

cd <your-dask-dir>
git bisect start
git bisect bad
git bisect good $REF

Test for regressions

Now that you are bisecting your editable dask install, run a test or subset of tests that demonstrates the regression in your coiled-runtime terminal. Suppose that tests/benchmarks/test_parquet.py::test_write_wide_data is such a test:

pytest tests/benchmarks/test_parquet.py::test_write_wide_data --benchmark

Once the test is done, it will have written a new entry to the local sqlite file benchmark.db. You will want to check whether that entry displays the regression. Exactly what that check looks like will depend on the test and the regression. You might have a script that builds a chart from benchmark.db similar to dashboard.py, or a script that performs some kind of statistical analysis. But let's assume a simpler case where you can recognize the regression from the average_memory column. You can query that with:

sqlite3 benchmark.db "select dask_version, average_memory from test_run where name = 'test_write_wide_data';"

If the last entry displays the regression, mark the commit in your dask terminal as bad:

git bisect bad

If the last entry doesn't display the regression, mark the commit in your dask terminal as good:

git bisect good

Proceed with this process until you have narrowed it down to a single commit. Congratulations! You've identified the source of your regression.

A/B testing

It's possible to run the Coiled Runtime benchmarks for A/B comparisons. Read the full documentation.

Contribute

This repository uses GitHub Actions secrets for managing authentication tokens used to access resources like Coiled clusters, S3 buckets, etc. However, because GitHub Actions doesn't grant access to secrets for forked repositories, please submit pull requests directly from the coiled/benchmarks repository, not a personal fork.

License

BSD-3

benchmarks's People

Contributors

crusaderky, dependabot[bot], fjetter, gjoseph92, hayesgb, hendrikmakait, ian-r-rose, j-bennet, jrbourbeau, marcosmoyano, milesgranger, mrocklin, ncclementi, ntabris, phobson, phofl, ritchie46, sjperkins


benchmarks's Issues

Allow for optional dependencies

The current implementation of the coiled-runtime metapackage has the user install all dependencies, and all packages are also installed in the [Coiled software environments](https://github.com/coiled/software-environments).

This arrangement benefits the user by ensuring all packages are available. However, it also creates a larger-than-necessary Docker image, which needs to be uploaded to workers when starting a cluster, since packages like Jupyter or matplotlib are not required by the worker or scheduler.

One way to resolve this issue would be to optionally install only specific packages when building the Docker image, reducing the image size and thereby improving the Coiled UX.

cc: @dchudz

Invalid Coiled token

It looks like the token we've been using in CI is no longer valid

Run # If testing the latest coiled-runtime, we need to build a Coiled software environment
Creating Coiled software environment for dask-engineering/coiled-runtime-pull_request-47-d679a56bbeb127140cd56d477d6a7bd9e8032ad2-ubuntu-latest-py38
Invalid Coiled token encountered. You can create new tokens and manage your
existing ones at https://cloud.coiled.io/profile.

Please login to https://cloud.coiled.io/profile to get your token
/usr/share/miniconda3/envs/test/lib/python3.8/getpass.py:91: GetPassWarning: Can not control echo on the terminal.
passwd = fallback_getpass(prompt, stream)
Token:
Warning: Password input may be echoed.
Aborted!

See this CI build for an example.

cc @ncclementi any thoughts on why this might be?

CI shows green but there were errors

I'm seeing in some cases that the CI is showing green but the dropdown has an ERROR, see screenshot. Not sure what's happening, but this shouldn't show as green.

[Screenshot from 2022-04-13: CI run showing green while the dropdown contains an ERROR]

Stop testing windows and mac?

Since most of this code just calls Coiled and asks it to do things, it's not clear to me that there is much value in cross-platform testing here. Really, we're mostly just testing that the client APIs work, and that seems more relevant for the tests in each project.

I propose dropping windows/osx testing because it probably doesn't bring in much additional signal (Coiled is running Linux regardless). Thoughts?

Create GHA to automatically run & record Coiled Distribution H2O.ai benchmark Performance

Users and maintainers of dask want to understand how dask performs in a variety of benchmarks against a variety of competitive technologies. One such benchmark is the H2O.ai benchmark.

I propose using the H2O.ai repository to create datasets of size 0.5 GB, 5 GB, and 50 GB. These CSV files should be stored in s3://coiled-data/h2o-ai/XYZ-SIZE-CSVFILES/*.csv. Alternatively, MPowers has worked with the H2O.ai team and may have access to the test data.

The benchmark should be executed on the Coiled Cloud infrastructure using default dask worker configurations, and the sample environment described in the benchmark.

I would propose we record and report the following:

  • Time to spin up infrastructure
  • Mean time to load the data from S3
  • Mean time to execute each test
  • Failure mode, if/when it occurs

Make sure there's a notification for each CI workflow failure on `main`

Our current logic around opening up an issue when there's a test failure on main

https://github.com/coiled/coiled-runtime/blob/89838518a21c401eccb7a6bd6e7397c1172d1ec8/.github/workflows/tests.yml#L164-L204

will check to see if there's already an issue opened by github-actions[bot] and, if there is, update the initial issue content to point to the latest failing GHA workflow. Unfortunately, this almost always goes unnoticed, as we don't get notifications when the content of an existing issue is updated.

We should update our issue reporting logic so that we don't miss when there's a new CI failure on main. We could (for example):

  • If there's already an issue, leave a new comment instead of updating the original comment
  • Open a new issue every time the test suite doesn't pass on main
  • ...some other option. Suggestions are welcome

Regardless of what we do, the goal here is to make sure folks watching this repo get a notification on GitHub every time there's a CI workflow failure on main

Add usage instructions

Here's my understanding of the current usage instructions. Should we add these to the README?

Create a software environment with the coiled-runtime.

import coiled

coiled.create_software_environment(
    name="my_env",
    conda={
        "channels": ["conda-forge", "coiled"],
        "dependencies": ["coiled-runtime", "python=3.9.12"],
    },
)

Add the conda-forge and coiled channels if you haven't done so already:

conda config --add channels conda-forge
conda config --add channels coiled

Create a local environment:

conda create -n crt -y python=3.9.12 coiled-runtime

Spin up Coiled clusters like this:

cluster = coiled.Cluster(
    name="github-json",
    software="matthew-powers/my_env", 
    n_workers=10,
)

This will ensure that the local software versions and cluster software versions are the same.

Environment export in CI not working

In CI we have a step that dumps out the current environment being used right before we run tests

https://github.com/coiled/coiled-runtime/blob/0667395184944e17968a1211672de3e8b24c497a/.github/workflows/tests.yml#L60-L64

This is useful for debugging CI builds. Unfortunately, when I look at the output on CI builds it's unexpectedly empty

--
--Conda Environment (re-create this with `conda env create --name <name> -f <output_file>`)
--
name: test
channels:
  - defaults

EDIT: low priority, but would be good to fix at some point

Avoid version mismatch warning

When users connect their client to a cluster, a warning is emitted if there is a version mismatch found for any of these packages on any node throughout the cluster. We should ensure that if a specific version of coiled-runtime is used, that we never emit a version mismatch warning.

Today we pin some of the relevant packages

https://github.com/coiled/coiled-runtime/blob/0667395184944e17968a1211672de3e8b24c497a/recipe/meta.yaml#L21-L44

like dask, distributed, numpy, lz4, etc. but not all (e.g. cloudpickle, tornado, etc). This means that, for example, if someone installs coiled-runtime=0.0.3 locally today and then tomorrow a new cloudpickle release comes out, a version mismatch warning will be emitted when that local environment connects to a fresh Coiled cluster using a coiled-runtime=0.0.3-based software environment.

I suggest we pin all packages in coiled-runtime which can lead to a Dask version mismatch warning, though thoughts from others are certainly welcome.

Python version pinning

If I install an environment with the environment file below (python is not pinned):

name: test_coiled
channels:
- conda-forge
dependencies:
- python
- coiled-runtime=0.0.3

Then Python 3.10 is installed on my machine (macOS). However, coiled later asks for Python 3.9.

Steps to reproduce:

  • Save the above YAML as environment.yml
  • conda env create -f environment.yml
  • conda activate test_coiled
  • python --version

The above prints 3.10.4.

Periodically test development `dask` and `distributed` versions

Following #90, we can now test against the main branch of dask and distributed. However, this is an opt-in process, triggered by including test-upstream in a commit message. It would be good for us to also run against the main branch of dask and distributed periodically (e.g. every Monday morning) to make sure we're regularly testing against main.

Add Coiled quickstart

I'd like us to add a simple test which runs the Coiled quickstart against coiled-runtime

import coiled
cluster = coiled.Cluster(n_workers=10, account="dask-engineering")

from dask.distributed import Client
client = Client(cluster)

import dask.dataframe as dd

df = dd.read_csv(
    "s3://nyc-tlc/trip data/yellow_tripdata_2019-*.csv",
    dtype={
        "payment_type": "UInt8",
        "VendorID": "UInt8",
        "passenger_count": "UInt8",
        "RatecodeID": "UInt8",
    },
    storage_options={"anon": True},
    blocksize="16 MiB",
).persist()

df.groupby("passenger_count").tip_amount.mean().compute()

Push Coiled software environments to `coiled` account

Currently we're building Coiled software environments for each coiled-runtime release and pushing them to our dask-engineering Coiled account

https://github.com/coiled/coiled-runtime/blob/ab53f78fedad1454ef5e97beda4d5f09b4b2b183/.github/workflows/software-environments.yml#L43

Let's start pushing those to the coiled account instead. This should require:

  1. Making sure the corresponding CI job that builds software envs has the appropriate permissions to push to the coiled account
  2. Updating the coiled env create ... command linked above
  3. Updating our test suite to use the new software environment location

Document testing/CI setup

We should document how tests are run in CI and how developers can run tests locally. This is just a placeholder for now -- I'll add more details later

`read_parquet` + `shuffle` + `to_parquet` performance observation

Over in #67, @sjperkins discovered that using compute=False + dask.compute(...) in the following workload:

    df_shuffled = dd.read_parquet(dataset_url, storage_options=s3_storage_options)
    df_shuffled = df_shuffled.shuffle(on="x")
    result = df_shuffled.to_parquet(
        shuffled_url, compute=False, storage_options=s3_storage_options
    )
    dask.compute(result)

ran much faster than using the default compute=True in the to_parquet call:

    df_shuffled = dd.read_parquet(dataset_url, storage_options=s3_storage_options)
    df_shuffled = df_shuffled.shuffle(on="x")
    df_shuffled.to_parquet(shuffled_url, storage_options=s3_storage_options)

The compute=False + dask.compute(...) snippet took ~200 seconds to run, while the compute=True case took ~1800 seconds to run. This was surprising as I would have expected both approaches to take roughly the same amount of time. If anything I would have expected the delayed compute to be slower due to somehow losing blockwise task fusion.

We should investigate a bit more and open an upstream issue in dask/dask. Opening this issue so we don't lose track of it.

Use `conda`/`mamba` only for `upstream` build

As a follow-up to #90, we should switch to using only conda/mamba for installing dev versions of dask and distributed in our upstream build. See the relevant code changes outlined in #90 (comment). Currently anaconda is experiencing some issues where packages sometimes can't be downloaded (see dask/dask#8984 (comment)). After those are resolved, we should be good to move forward on this ticket.

installing coiled-runtime defaults to old `coiled` package

I expected that when I install the coiled-runtime, I would get the latest coiled package version, which right now is 0.0.73 (https://pypi.org/project/coiled/).

What happened: When I installed coiled-runtime, I got coiled 0.0.71.

OSX, Python 3.9, conda

Details

Last login: Mon Apr 11 18:13:41 on ttys001
(base)  /Users/davidchudzicki $
conda create -n crt
WARNING: A conda environment already exists at '/Users/davidchudzicki/miniconda3/envs/crt'
Remove existing environment (y/[n])? ^C
CondaSystemExit:
Operation aborted.  Exiting.

(base)  /Users/davidchudzicki $
conda create -n crt2
Collecting package metadata (current_repodata.json): done
Solving environment: done


==> WARNING: A newer version of conda exists. <==
  current version: 4.11.0
  latest version: 4.12.0

Please update conda by running

    $ conda update -n base conda



## Package Plan ##

  environment location: /Users/davidchudzicki/miniconda3/envs/crt2



Proceed ([y]/n)?

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate crt2
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base)  /Users/davidchudzicki $
 conda activate crt2
(crt2)  /Users/davidchudzicki $
conda install -c coiled coiled-runtime
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: done


==> WARNING: A newer version of conda exists. <==
  current version: 4.11.0
  latest version: 4.12.0

Please update conda by running

    $ conda update -n base conda



## Package Plan ##

  environment location: /Users/davidchudzicki/miniconda3/envs/crt2

  added / updated specs:
    - coiled-runtime


The following NEW packages will be INSTALLED:

  abseil-cpp         conda-forge/osx-64::abseil-cpp-20210324.2-he49afe7_0
  aiobotocore        conda-forge/noarch::aiobotocore-2.2.0-pyhd8ed1ab_0
  aiohttp            conda-forge/osx-64::aiohttp-3.8.1-py39h63b48b0_1
  aioitertools       conda-forge/noarch::aioitertools-0.10.0-pyhd8ed1ab_0
  aiosignal          conda-forge/noarch::aiosignal-1.2.0-pyhd8ed1ab_0
  anyio              conda-forge/osx-64::anyio-3.5.0-py39h6e9494a_0
  appnope            conda-forge/noarch::appnope-0.1.3-pyhd8ed1ab_0
  argon2-cffi        conda-forge/noarch::argon2-cffi-21.3.0-pyhd8ed1ab_0
  argon2-cffi-bindi~ conda-forge/osx-64::argon2-cffi-bindings-21.2.0-py39h63b48b0_2
  arrow-cpp          conda-forge/osx-64::arrow-cpp-7.0.0-py39h13a4272_4_cpu
  asttokens          conda-forge/noarch::asttokens-2.0.5-pyhd8ed1ab_0
  async-timeout      conda-forge/noarch::async-timeout-4.0.2-pyhd8ed1ab_0
  atk-1.0            conda-forge/osx-64::atk-1.0-2.36.0-he69c4ee_4
  attrs              conda-forge/noarch::attrs-21.4.0-pyhd8ed1ab_0
  aws-c-cal          conda-forge/osx-64::aws-c-cal-0.5.11-hd2e2f4b_0
  aws-c-common       conda-forge/osx-64::aws-c-common-0.6.2-h0d85af4_0
  aws-c-event-stream conda-forge/osx-64::aws-c-event-stream-0.2.7-hb9330a7_13
  aws-c-io           conda-forge/osx-64::aws-c-io-0.10.5-h35aa462_0
  aws-checksums      conda-forge/osx-64::aws-checksums-0.1.11-h0010a65_7
  aws-sdk-cpp        conda-forge/osx-64::aws-sdk-cpp-1.8.186-h766a74d_3
  babel              conda-forge/noarch::babel-2.9.1-pyh44b312d_0
  backcall           conda-forge/noarch::backcall-0.2.0-pyh9f0ad1d_0
  backoff            conda-forge/noarch::backoff-1.11.1-pyhd8ed1ab_0
  backports          conda-forge/noarch::backports-1.0-py_2
  backports.functoo~ conda-forge/noarch::backports.functools_lru_cache-1.6.4-pyhd8ed1ab_0
  beautifulsoup4     conda-forge/noarch::beautifulsoup4-4.11.1-pyha770c72_0
  bleach             conda-forge/noarch::bleach-5.0.0-pyhd8ed1ab_0
  blinker            conda-forge/noarch::blinker-1.4-py_1
  bokeh              conda-forge/osx-64::bokeh-2.4.2-py39h6e9494a_0
  boto3              conda-forge/noarch::boto3-1.21.21-pyhd8ed1ab_0
  botocore           conda-forge/noarch::botocore-1.24.21-pyhd8ed1ab_1
  brotlipy           conda-forge/osx-64::brotlipy-0.7.0-py39h63b48b0_1004
  bzip2              conda-forge/osx-64::bzip2-1.0.8-h0d85af4_4
  c-ares             conda-forge/osx-64::c-ares-1.18.1-h0d85af4_0
  ca-certificates    conda-forge/osx-64::ca-certificates-2021.10.8-h033912b_0
  cachetools         conda-forge/noarch::cachetools-5.0.0-pyhd8ed1ab_0
  cairo              conda-forge/osx-64::cairo-1.16.0-h1680b09_1011
  certifi            conda-forge/osx-64::certifi-2021.10.8-py39h6e9494a_2
  cffi               conda-forge/osx-64::cffi-1.15.0-py39he338e87_0
  charset-normalizer conda-forge/noarch::charset-normalizer-2.0.12-pyhd8ed1ab_0
  click              conda-forge/osx-64::click-8.0.4-py39h6e9494a_0
  cloudpickle        conda-forge/noarch::cloudpickle-2.0.0-pyhd8ed1ab_0
  coiled             conda-forge/noarch::coiled-0.0.71-pyhd8ed1ab_0
  coiled-runtime     coiled/noarch::coiled-runtime-0.0.3-py_1

Add ability to test against `dask` and `distributed` `main` branches

In conversation with @fjetter he mentioned it would be useful for us to be able to test the runtime against development versions of dask and distributed. A couple of options come to mind:

  1. We check commit messages on PRs for the phrase test-upstream. This is similar to what we do today in dask's upstream CI build. We could probably reuse some logic from that workflow here.
  2. We could take a label-based approach (similar to what cloudpickle does) where if a PR has an upstream label, we install the dev versions of dask and distributed

Both options seem viable to me. I have a slight preference for the commit message option, as it seems a bit more targeted.

Stability Testing Report

The distributed team wishes to create a Stability Test for tracking how changes to the distributed server affect stability. The following ideas might be useful in scoping:

Initial conda-forge release

@hayesgb, @ncclementi mentioned it was decided that we should do a public release on conda-forge of the Coiled Runtime. There was a discussion on Slack around how to go about this technically, which I'll summarize below:

  • You cannot upload a package to conda-forge that isn't built by the conda-forge bots (i.e. going through the normal staged-recipes -> package-feedstock route)
  • The only additional content in the coiled-runtime repository is the test suite we run against the pinned package versions in the runtime. These are run today in this repository with GitHub actions, but we could package the tests up and also run them during the conda-forge build process. I see this providing value in the future, but not as a requirement for an initial release (feedback on this from others is welcome)
  • Next steps would be to issue a release (on GitHub) of the Coiled Runtime (at a minimum we need a license file included in the release) and then go through the normal staged-recipes -> coiled-runtime-feedstock process to get the runtime onto conda-forge.

Questions

  • Are the existing version pins what we want to include in the initial public release?

https://github.com/coiled/coiled-runtime/blob/96b876dbd1e950a42e50d6890ad72986d05e8611/continuous_integration/recipe/meta.yaml#L13-L44

My guess is we want to pin to an earlier release of dask + distributed from January, since there have been deadlocks reported with more recent versions.

  • Packages on conda-forge need a license. What license should we use for the Coiled Runtime? Dask itself uses BSD-3 -- is that appropriate here too?
  • ...

Strategies for evaluating cluster stability during stability tests

In #84, the following strategy was applied for checking whether workers had deadlocked during computation

    # Excerpt; `client` and `ddf` come from the surrounding test.
    from time import sleep, time

    from distributed import wait

    fs = client.compute((ddf.x + ddf.y).mean())
    INTERVAL = 5
    start = time()

    while fs.status == "pending" and time() - start < 2 * 60:
        # We'd like to have seen all workers in the last INTERVAL seconds
        workers = list(client.scheduler_info()["workers"].values())
        assert all(time() - w["last_seen"] < INTERVAL for w in workers)
        sleep(INTERVAL)

    wait(fs, timeout=0.1)  # Reify success or error

Alternative strategies might include:

  • Checking the distribution of worker heartbeats
  • Within a SchedulerPlugin for instance
  • It's probably worth doing a cluster dump if the test fails and producing this as a test artefact.

/cc @mrocklin, @fjetter, @jrbourbeau, @gjoseph92 if you have further ideas.
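
As a rough illustration of the heartbeat-distribution idea above, the same client.scheduler_info() data used in the snippet from #84 could be summarized like this (the threshold in the comment is arbitrary):

from time import time

import numpy as np


def heartbeat_ages(client):
    """Seconds since each worker was last seen by the scheduler."""
    now = time()
    workers = client.scheduler_info()["workers"].values()
    return np.array([now - w["last_seen"] for w in workers])


# Example check with an arbitrary threshold: no worker silent for over 30 seconds.
# assert heartbeat_ages(client).max() < 30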

Reduce number of macos jobs

macOS runners are limited, and the current test matrix [["3.7", "3.8", "3.9"], ["0.0.3", "latest"]] provisions six of them, which slows down test runs.

Create GHA workflow for managing Coiled software environments

As we start testing coiled-runtime on Coiled clusters, we'll need to have Coiled software environments with various versions of coiled-runtime installed. We should create a GHA workflow which manages these software environments. This workflow should be separate from our existing workflows that run on PRs to avoid software environments being unintentionally changed when someone is working on a PR

Broken conda upload

Looking at the CI build for the latest commit (3d37618), we're failing to upload the coiled-runtime metapackage because the auth token we're using lacks the appropriate permissions:

...
Using Anaconda API: https://api.anaconda.org/
Error:  ('Authentication token does not have the sufficient scope to perform this action expected: api:read', 401)
Error: Process completed with exit code 1.

cc @hayesgb who I believe made recent changes related to this

my env has wrong pandas version

This is probably user-error of some form, but since our other users will encounter it too I figured I'd file an issue.

I'm using this environment file:

name: integration-tests
channels:
  - coiled
  - conda-forge
dependencies:
  - coiled-runtime=0.0.3
  - pip
  - pip:
      - junitparser~=2.0.0
      - slack_sdk
      - coiled>=0.0.47
      - ddtrace
      - pytest-repeat~=0.9.1
      - pytest-json-report~=1.4.1

mamba env create -f integration-tests/environment.yml -n integration_tests13 gets me pandas 1.4.2, when I expected 1.3.5 (given the coiled-runtime pin).

There's not even a warning about something weird happening.

Conda behaves the same way, but at some point I stopped being able to repro the issue there.

mamba env create -f integration-tests/environment.yml -n integration_tests13
conda-forge/osx-64       Using cache
conda-forge/noarch       Using cache
pkgs/main/noarch         [====================] (00m:00s) No change
pkgs/r/osx-64            [====================] (00m:00s) No change
pkgs/main/osx-64         [====================] (00m:00s) No change
pkgs/r/noarch            [====================] (00m:00s) No change
coiled/osx-64            [====================] (00m:00s) No change
coiled/noarch            [====================] (00m:00s) No change


Looking for: ['altair', 'altair_saver', 'beautifulsoup4~=4.9.3', 'boto3', "s3fs[version='>=2021.8.0']", 'flaky~=3.7.0', 'git', 'nodejs~=14.15.1', "pandas[version='>=1.4.0']", 'pip', 'pre-commit', 'pytest-asyncio~=0.18.0', 'pytest-selenium~=2.0.1', 'pytest-timeout~=2.1.0', 'pytest-xdist~=2.5.0', 'pytest~=7.0.1', 'python-dotenv~=0.15.0', 'python=3.8', 'requests~=2.24.0', 'semver~=2.13.0', 'structlog~=20.2.0', 'yarn~=1.22.10', 'click', 'python-blosc=1.10.2', 'lz4=3.1.10', 'numpy=1.21.1', "cloudpickle[version='>=1.5.0']", "msgpack-python[version='>=0.6.0']", 'filelock']


Transaction

  Prefix: /Users/davidchudzicki/miniconda3/envs/integration_tests13

  Updating specs:

   - altair
   - altair_saver
   - beautifulsoup4 ~=4.9.3
   - boto3
   - s3fs[version='>=2021.8.0']
   - flaky ~=3.7.0
   - git
   - nodejs ~=14.15.1
   - pandas[version='>=1.4.0']
   - pip
   - pre-commit
   - pytest-asyncio ~=0.18.0
   - pytest-selenium ~=2.0.1
   - pytest-timeout ~=2.1.0
   - pytest-xdist ~=2.5.0
   - pytest ~=7.0.1
   - python-dotenv ~=0.15.0
   - python=3.8
   - requests ~=2.24.0
   - semver ~=2.13.0
   - structlog ~=20.2.0
   - yarn ~=1.22.10
   - click
   - python-blosc=1.10.2
   - lz4=3.1.10
   - numpy=1.21.1
   - cloudpickle[version='>=1.5.0']
   - msgpack-python[version='>=0.6.0']
   - filelock


  Package                           Version  Build               Channel                  Size
────────────────────────────────────────────────────────────────────────────────────────────────
  Install:
────────────────────────────────────────────────────────────────────────────────────────────────

  + aiobotocore                       2.2.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + aiohttp                           3.8.1  py38hed1de0f_1      conda-forge/osx-64     Cached
  + aioitertools                     0.10.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + aiosignal                         1.2.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + altair                            4.2.0  pyhd8ed1ab_1        conda-forge/noarch     Cached
  + altair_data_server                0.4.1  py_0                conda-forge/noarch     Cached
  + altair_saver                      0.5.0  py_0                conda-forge/noarch     Cached
  + altair_viewer                     0.4.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + async-timeout                     4.0.2  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + async_generator                    1.10  py_0                conda-forge/noarch     Cached
  + attrs                            21.4.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + beautifulsoup4                    4.9.3  pyhb0f4dca_0        conda-forge/noarch     Cached
  + blosc                            1.21.0  he49afe7_0          conda-forge/osx-64     Cached
  + boto3                           1.21.21  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + botocore                        1.24.21  pyhd8ed1ab_1        conda-forge/noarch     Cached
  + brotlipy                          0.7.0  py38hed1de0f_1004   conda-forge/osx-64     Cached
  + bzip2                             1.0.8  h0d85af4_4          conda-forge/osx-64     Cached
  + c-ares                           1.18.1  h0d85af4_0          conda-forge/osx-64     Cached
  + ca-certificates               2021.10.8  h033912b_0          conda-forge/osx-64     Cached
  + cairo                            1.16.0  he43a7df_1008       conda-forge/osx-64     Cached
  + certifi                       2021.10.8  py38h50d1736_2      conda-forge/osx-64     Cached
  + cffi                             1.15.0  py38h1a44b6c_0      conda-forge/osx-64     Cached
  + cfgv                              3.3.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + chardet                           3.0.4  py38h5347e94_1008   conda-forge/osx-64     Cached
  + charset-normalizer               2.0.12  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + click                             8.1.2  py38h50d1736_0      conda-forge/osx-64     Cached
  + cloudpickle                       2.0.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + cryptography                     36.0.2  py38h826b3c8_1      conda-forge/osx-64     Cached
  + curl                             7.82.0  h9f20792_0          conda-forge/osx-64     Cached
  + distlib                           0.3.4  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + entrypoints                         0.4  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + execnet                           1.9.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + expat                             2.4.8  h96cf925_0          conda-forge/osx-64     Cached
  + filelock                          3.6.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + flaky                             3.7.0  pyh9f0ad1d_0        conda-forge/noarch     Cached
  + font-ttf-dejavu-sans-mono          2.37  hab24e00_0          conda-forge/noarch     Cached
  + font-ttf-inconsolata              3.000  h77eed37_0          conda-forge/noarch     Cached
  + font-ttf-source-code-pro          2.038  h77eed37_0          conda-forge/noarch     Cached
  + font-ttf-ubuntu                    0.83  hab24e00_0          conda-forge/noarch     Cached
  + fontconfig                      2.13.94  h10f422b_0          conda-forge/osx-64     Cached
  + fonts-conda-ecosystem                 1  0                   conda-forge/noarch     Cached
  + fonts-conda-forge                     1  0                   conda-forge/noarch     Cached
  + freetype                         2.10.4  h4cff582_1          conda-forge/osx-64     Cached
  + fribidi                          1.0.10  hbcb3906_0          conda-forge/osx-64     Cached
  + frozenlist                        1.3.0  py38hed1de0f_1      conda-forge/osx-64     Cached
  + fsspec                         2022.3.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + gettext                        0.19.8.1  hd1a6beb_1008       conda-forge/osx-64     Cached
  + giflib                            5.2.1  hbcb3906_2          conda-forge/osx-64     Cached
  + git                              2.35.3  pl5321h33a4a8a_0    conda-forge/osx-64     Cached
  + graphite2                        1.3.13  h2e338ed_1001       conda-forge/osx-64     Cached
  + h11                              0.13.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + harfbuzz                          3.1.1  h159f659_0          conda-forge/osx-64     Cached
  + icu                                68.2  he49afe7_0          conda-forge/osx-64     Cached
  + identify                         2.4.12  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + idna                               2.10  pyh9f0ad1d_0        conda-forge/noarch     Cached
  + importlib-metadata               4.11.3  py38h50d1736_1      conda-forge/osx-64     Cached
  + importlib_resources               5.7.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + iniconfig                         1.1.1  pyh9f0ad1d_0        conda-forge/noarch     Cached
  + jinja2                            3.1.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + jmespath                          1.0.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + jpeg                                 9e  h0d85af4_0          conda-forge/osx-64     Cached
  + jsonschema                        4.4.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + krb5                             1.19.3  hb49756b_0          conda-forge/osx-64     Cached
  + libblas                           3.9.0  14_osx64_openblas   conda-forge/osx-64     Cached
  + libcblas                          3.9.0  14_osx64_openblas   conda-forge/osx-64     Cached
  + libcurl                          7.82.0  h9f20792_0          conda-forge/osx-64     Cached
  + libcxx                           13.0.1  hc203e6f_0          conda-forge/osx-64     Cached
  + libedit                    3.1.20191231  h0678c8f_2          conda-forge/osx-64     Cached
  + libev                              4.33  haf1e3a3_1          conda-forge/osx-64     Cached
  + libffi                            3.4.2  h0d85af4_5          conda-forge/osx-64     Cached
  + libgfortran                       5.0.0  9_3_0_h6c81a4c_23   conda-forge/osx-64     Cached
  + libgfortran5                      9.3.0  h6c81a4c_23         conda-forge/osx-64     Cached
  + libglib                          2.70.2  hf1fb8c0_4          conda-forge/osx-64     Cached
  + libiconv                           1.16  haf1e3a3_0          conda-forge/osx-64     Cached
  + liblapack                         3.9.0  14_osx64_openblas   conda-forge/osx-64     Cached
  + libnghttp2                       1.47.0  h942079c_0          conda-forge/osx-64     Cached
  + libopenblas                      0.3.20  openmp_hb3cd9ec_0   conda-forge/osx-64     Cached
  + libpng                           1.6.37  h7cec526_2          conda-forge/osx-64     Cached
  + libssh2                          1.10.0  h52ee1ee_2          conda-forge/osx-64     Cached
  + libuv                            1.41.1  h0d85af4_0          conda-forge/osx-64     Cached
  + libxml2                          2.9.12  h93ec3fd_0          conda-forge/osx-64     Cached
  + libzlib                          1.2.11  h6c3fc93_1014       conda-forge/osx-64     Cached
  + llvm-openmp                      13.0.1  hcb1a161_1          conda-forge/osx-64     Cached
  + lz4                              3.1.10  py38h5bcf07e_0      conda-forge/osx-64     Cached
  + lz4-c                             1.9.3  he49afe7_1          conda-forge/osx-64     Cached
  + markupsafe                        2.1.1  py38hed1de0f_1      conda-forge/osx-64     Cached
  + msgpack-python                    1.0.3  py38h8b7791e_1      conda-forge/osx-64     Cached
  + multidict                         6.0.2  py38hed1de0f_1      conda-forge/osx-64     Cached
  + ncurses                             6.3  h96cf925_1          conda-forge/osx-64     Cached
  + nodeenv                           1.6.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + nodejs                          14.15.4  hb529b34_1          conda-forge/osx-64     Cached
  + numpy                            1.21.1  py38had91d27_0      conda-forge/osx-64     Cached
  + openssl                          1.1.1n  h6c3fc93_0          conda-forge/osx-64     Cached
  + outcome                           1.1.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + packaging                          21.3  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pandas                            1.4.2  py38hb872667_1      conda-forge/osx-64     Cached
  + pango                           1.48.10  h056538c_2          conda-forge/osx-64     Cached
  + pcre                               8.45  he49afe7_0          conda-forge/osx-64     Cached
  + pcre2                             10.37  ha16e1b2_0          conda-forge/osx-64     Cached
  + perl                             5.32.1  2_h0d85af4_perl5    conda-forge/osx-64     Cached
  + pip                              22.0.4  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pixman                           0.40.0  hbcb3906_0          conda-forge/osx-64     Cached
  + platformdirs                      2.5.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pluggy                            1.0.0  py38h50d1736_3      conda-forge/osx-64     Cached
  + portpicker                        1.5.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pre-commit                       2.18.1  py38h50d1736_1      conda-forge/osx-64     Cached
  + psutil                            5.9.0  py38hed1de0f_1      conda-forge/osx-64     Cached
  + py                               1.11.0  pyh6c4a22f_0        conda-forge/noarch     Cached
  + pycparser                          2.21  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pyopenssl                        22.0.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pyparsing                         3.0.8  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pyrsistent                       0.18.1  py38hed1de0f_1      conda-forge/osx-64     Cached
  + pysocks                           1.7.1  py38h50d1736_5      conda-forge/osx-64     Cached
  + pytest                            7.0.1  py38h50d1736_0      conda-forge/osx-64     Cached
  + pytest-asyncio                   0.18.3  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pytest-base-url                   1.4.1  py_1                conda-forge/noarch     Cached
  + pytest-forked                     1.4.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pytest-html                       3.1.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pytest-metadata                   2.0.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pytest-selenium                   2.0.1  py_0                conda-forge/noarch     Cached
  + pytest-timeout                    2.1.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pytest-variables                  1.7.1  py_1                conda-forge/noarch     Cached
  + pytest-xdist                      2.5.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + python                           3.8.13  h394c593_0_cpython  conda-forge/osx-64     Cached
  + python-blosc                     1.10.2  py38ha53d530_2      conda-forge/osx-64     Cached
  + python-dateutil                   2.8.2  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + python-dotenv                    0.15.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + python_abi                          3.8  2_cp38              conda-forge/osx-64     Cached
  + pytz                             2022.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + pyyaml                              6.0  py38hed1de0f_4      conda-forge/osx-64     Cached
  + readline                            8.1  h05e3726_0          conda-forge/osx-64     Cached
  + requests                         2.24.0  pyh9f0ad1d_0        conda-forge/noarch     Cached
  + s3fs                           2022.3.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + s3transfer                        0.5.2  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + selenium                          4.1.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + semver                           2.13.0  pyh9f0ad1d_0        conda-forge/noarch     Cached
  + setuptools                       62.1.0  py38h50d1736_0      conda-forge/osx-64     Cached
  + six                              1.16.0  pyh6c4a22f_0        conda-forge/noarch     Cached
  + sniffio                           1.2.0  py38h50d1736_3      conda-forge/osx-64     Cached
  + sortedcontainers                  2.4.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + soupsieve                         2.3.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + sqlite                           3.38.2  hb516253_0          conda-forge/osx-64     Cached
  + structlog                        20.2.0  pyhd3deb0d_0        conda-forge/noarch     Cached
  + tenacity                          6.3.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + tk                               8.6.12  h5dbffcc_0          conda-forge/osx-64     Cached
  + toml                             0.10.2  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + tomli                             2.0.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + toolz                            0.11.2  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + tornado                             6.1  py38hed1de0f_3      conda-forge/osx-64     Cached
  + trio                             0.20.0  py38h50d1736_1      conda-forge/osx-64     Cached
  + trio-websocket                    0.9.2  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + typing-extensions                 4.2.0  hd8ed1ab_0          conda-forge/noarch     Cached
  + typing_extensions                 4.2.0  pyha770c72_0        conda-forge/noarch     Cached
  + ukkonen                           1.0.1  py38h8b7791e_2      conda-forge/osx-64     Cached
  + urllib3                         1.25.11  py_0                conda-forge/noarch     Cached
  + vega-cli                         5.17.0  ha93a217_3          conda-forge/osx-64     Cached
  + vega-lite-cli                    4.17.0  hf542fc5_1          conda-forge/osx-64     Cached
  + virtualenv                      20.14.1  py38h50d1736_0      conda-forge/osx-64     Cached
  + wheel                            0.37.1  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + wrapt                            1.14.0  py38hed1de0f_1      conda-forge/osx-64     Cached
  + wsproto                           1.1.0  py38h50d1736_0      conda-forge/osx-64     Cached
  + xz                                5.2.5  haf1e3a3_1          conda-forge/osx-64     Cached
  + yaml                              0.2.5  h0d85af4_2          conda-forge/osx-64     Cached
  + yarl                              1.7.2  py38hed1de0f_2      conda-forge/osx-64     Cached
  + yarn                            1.22.18  h694c41f_0          conda-forge/osx-64     Cached
  + zipp                              3.8.0  pyhd8ed1ab_0        conda-forge/noarch     Cached
  + zlib                             1.2.11  h6c3fc93_1014       conda-forge/osx-64     Cached

  Summary:

  Install: 167 packages

  Total download: 0  B

────────────────────────────────────────────────────────────────────────────────────────────────

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: | Ran pip subprocess with arguments:
['/Users/davidchudzicki/miniconda3/envs/integration_tests13/bin/python', '-m', 'pip', 'install', '-U', '-r', '/Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt']
Pip subprocess output:
Collecting junitparser~=2.0.0
  Using cached junitparser-2.0.0-py2.py3-none-any.whl (10 kB)
Collecting slack_sdk
  Using cached slack_sdk-3.15.2-py2.py3-none-any.whl (261 kB)
Collecting coiled>=0.0.47
  Using cached coiled-0.0.74-py3-none-any.whl (139 kB)
Collecting ddtrace
  Using cached ddtrace-1.0.1-cp38-cp38-macosx_10_9_x86_64.whl (1.4 MB)
Collecting pytest-repeat~=0.9.1
  Using cached pytest_repeat-0.9.1-py2.py3-none-any.whl (4.3 kB)
Collecting pytest-json-report~=1.4.1
  Using cached pytest_json_report-1.4.1-py3-none-any.whl (12 kB)
Collecting future
  Using cached future-0.18.2-py3-none-any.whl
Collecting backoff>=1.10.0
  Using cached backoff-1.11.1-py2.py3-none-any.whl (13 kB)
Requirement already satisfied: aiobotocore>=1.1.1 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (2.2.0)
Requirement already satisfied: pandas>=1.1.0 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (1.4.2)
Requirement already satisfied: jmespath in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (1.0.0)
Collecting ipython
  Using cached ipython-8.2.0-py3-none-any.whl (750 kB)
Collecting click<=8.0,>=7.1
  Using cached click-8.0.0-py3-none-any.whl (96 kB)
Requirement already satisfied: aiohttp in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (3.8.1)
Collecting distributed>=2.23.0
  Using cached distributed-2022.4.1-py3-none-any.whl (855 kB)
Requirement already satisfied: typing-extensions in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (4.2.0)
Requirement already satisfied: boto3 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (1.21.21)
Requirement already satisfied: jinja2 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (3.1.1)
Requirement already satisfied: s3fs in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from coiled>=0.0.47->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 3)) (2022.3.0)
Collecting dask[complete]>=2.23.0
  Using cached dask-2022.4.1-py3-none-any.whl (1.1 MB)
Collecting rich>=11.2.0
  Using cached rich-12.2.0-py3-none-any.whl (229 kB)
Requirement already satisfied: six>=1.12.0 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from ddtrace->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 4)) (1.16.0)
Requirement already satisfied: packaging>=17.1 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from ddtrace->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 4)) (21.3)
Requirement already satisfied: attrs>=19.2.0 in /Users/davidchudzicki/miniconda3/envs/integration_tests13/lib/python3.8/site-packages (from ddtrace->-r /Users/davidchudzicki/integration-tests/condaenv.y095ykln.requirements.txt (line 4)) (21.4.0)
Collecting protobuf>=3
  Using cached protobuf-3.20.0-cp38-cp38-macosx_10_9_x86_64.whl (962 kB)
[... remaining pip dependency-resolution output ("Requirement already satisfied" / "Collecting" lines) trimmed; the resolved packages and versions appear in the "Successfully installed" line below ...]
Installing collected packages: wcwidth, pure-eval, ptyprocess, pickleshare, heapdict, executing, commonmark, backcall, appnope, zict, traitlets, tblib, slack_sdk, pygments, protobuf, prompt-toolkit, pillow, pexpect, parso, locket, future, decorator, click, backoff, asttokens, stack-data, rich, partd, matplotlib-inline, junitparser, jedi, ddtrace, bokeh, pytest-repeat, ipython, dask, pytest-json-report, distributed, coiled
  Attempting uninstall: click
    Found existing installation: click 8.1.2
    Uninstalling click-8.1.2:
      Successfully uninstalled click-8.1.2
Successfully installed appnope-0.1.3 asttokens-2.0.5 backcall-0.2.0 backoff-1.11.1 bokeh-2.4.2 click-8.0.0 coiled-0.0.74 commonmark-0.9.1 dask-2022.4.1 ddtrace-1.0.1 decorator-5.1.1 distributed-2022.4.1 executing-0.8.3 future-0.18.2 heapdict-1.0.1 ipython-8.2.0 jedi-0.18.1 junitparser-2.0.0 locket-0.2.1 matplotlib-inline-0.1.3 parso-0.8.3 partd-1.2.0 pexpect-4.8.0 pickleshare-0.7.5 pillow-9.1.0 prompt-toolkit-3.0.29 protobuf-3.20.0 ptyprocess-0.7.0 pure-eval-0.2.2 pygments-2.11.2 pytest-json-report-1.4.1 pytest-repeat-0.9.1 rich-12.2.0 slack_sdk-3.15.2 stack-data-0.2.0 tblib-1.7.0 traitlets-5.1.1 wcwidth-0.2.5 zict-2.1.0

done
#
# To activate this environment, use
#
#     $ conda activate integration_tests13
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(integration_tests12)  /Users/davidchudzicki $

Raise issue when CI on `main` fails

If CI fails for some reason on a push to `main`, it'd be good for us to make that visible (today there's just a red X that most folks won't notice). Dask's upstream CI opens an issue when there's a failure on `main` and leaves a comment linking to the corresponding CI build. We can reuse that GitHub Actions (GHA) workflow here.
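As a rough sketch only (this uses the GitHub CLI rather than dask's exact workflow, and the issue title/body are placeholders), a workflow step that runs when the build on `main` fails could open an issue like this:

# Hypothetical step run only when an earlier step failed on a push to main.
# GITHUB_* variables are the standard environment variables GitHub Actions
# provides; the step would also need GH_TOKEN set so that gh can authenticate.
if [ "$GITHUB_REF_NAME" = "main" ]; then
    gh issue create \
        --title "CI failed on main" \
        --body "See the failed run: $GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID"
fi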

Pre-release versions of coiled-runtime

It would be very helpful for us to have pre-releases of coiled-runtime available, either cut directly from dask's main branch or from a hand-selected, more recent version.

We could also use this internally at Coiled on our staging deployment to test things before promoting a pre-release to a full release.
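For illustration only: if pre-releases were published under a separate label on the coiled conda channel (the "dev" label below is an assumption, not an existing label), installing one might look like:

# Hypothetical: install a coiled-runtime pre-release from a "dev" label on the
# coiled channel. The label name is an assumption; no such label exists today.
conda install -c coiled/label/dev -c conda-forge coiled-runtime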

Cannot install coiled-runtime on Ubuntu 20.04 using Conda/Mamba

I've been trying to install coiled-runtime on Ubuntu 20.04, but I run into the following issues when installing it into a Python 3.8 conda environment.

Conda

$ conda create -n crt
$ conda activate crt
(crt) $ conda install -c coiled coiled-runtime
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: - 
Found conflicts! Looking for incompatible packages.
This can take several minutes.  Press CTRL-C to abort.
failed                                                                                                                                                                                                     

UnsatisfiableError: 

(crt) $ python
Python 3.8.10 (default, Mar 15 2022, 12:22:08) 
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import coiled
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'coiled'
>>> 

Mamba

$ mamba create -n crt
$ mamba activate crt
(crt) $ mamba install -c coiled coiled-runtime

        [mamba ASCII-art banner]

        mamba (0.15.3) supported by @QuantStack

        GitHub:  https://github.com/mamba-org/mamba
        Twitter: https://twitter.com/QuantStack


Looking for: ['coiled-runtime']

pkgs/r/linux-64          [====================] (00m:00s) No change
pkgs/main/linux-64       [====================] (00m:00s) No change
pkgs/main/noarch         [====================] (00m:00s) No change
pkgs/r/noarch            [====================] (00m:00s) No change
coiled/linux-64          [====================] (00m:00s) No change
coiled/noarch            [====================] (00m:00s) No change
Encountered problems while solving:
  - nothing provides coiled needed by coiled-runtime-0.0.3-py_0

(crt) simon@sjp-tpe15:~$ python
Python 3.8.10 (default, Mar 15 2022, 12:22:08) 
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import coiled
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'coiled'
>>> 
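
A possible workaround to try (this is an assumption on my part: the solver may simply not see a channel that provides the coiled dependency, so the actual fix may be different):

# Speculative workaround: add conda-forge so the solver can find the packages
# that coiled-runtime depends on (e.g. the coiled package itself).
mamba install -c coiled -c conda-forge coiled-runtime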
