pixi's People

Contributors

bahugo, baszalmstra, chawyehsu, dependabot[bot], hadim, haozeke, hofer-julian, inbinder, jiaxiyang, nichmor, orhun, pablovela5620, partrita, pavelzw, raulpl, ruben-arts, seaotocinclus, spenserblack, sumanth-manchala, tdejager, traversaro, travishathaway, trueleo, tylere, vlad-ivanov-name, wackyator, williamjamir, wolfv, yarikoptic, yyyasin19

pixi's Issues

Add convenience command to add something to `[commands]`

Not sure about this feature request, maybe it's stupid, just an idea.

It would be nice to have something like an add-command that you can use once you're happy with something you prototyped with run:

px run super dooper
# OK this is cool, let's add it to the commands
px add-command mycmd super dooper
px run mycmd

Add "commands"/"scripts"

Just like with npm, add a section in a project manifest that lists several shortcut commands. E.g.

[project]
name = "myproject"

[commands]
test = "pytest ."

Invoking the command

pax run test

will execute the command pytest . in the environment.

Add `update` command

Add a command to update one or more dependencies in the pixi.lock file without updating the project file. If the lock file is missing, one will be created.

Support specifying channel per package (e.g. `<channel>::<package>`)

It would be nice if we could pull certain packages from specific channels using the conda-canary/label/dev::conda syntax.

Perhaps a better syntax for us would be:

conda = { version = "*", channel = "conda-canary/label/dev" }

This requires that we parse dependencies a little differently.

Add `metadata` command

Add a command to get metadata information from the project in a machine-readable format.

The extractable data should include the following:

  • Package metadata like the version, name, and authors.
  • The dependencies of the project.
  • The locked dependencies of the project.
  • The location of the conda environment.

We could either format the data simply as JSON, or we could use Go templates to make it really customizable (there is a crate for that: https://github.com/fiji-flo/gtmpl-rust). Go templates are used by, for instance, Docker and Kubernetes.
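
A rough sketch of what a machine-readable JSON output could look like, assuming serde and serde_json as dependencies; the struct and field names are illustrative, not a settled schema:

use serde::Serialize;

// Illustrative output shape only; the real schema is still to be decided.
#[derive(Serialize)]
struct ProjectMetadata {
    name: String,
    version: String,
    authors: Vec<String>,
    dependencies: Vec<String>,
    locked_dependencies: Vec<String>,
    environment_dir: String,
}

fn main() {
    let meta = ProjectMetadata {
        name: "myproject".into(),
        version: "0.1.0".into(),
        authors: vec!["Jane Doe".into()],
        dependencies: vec!["python 3.11.*".into()],
        locked_dependencies: vec!["python 3.11.4".into()],
        environment_dir: ".pixi/env".into(),
    };
    // JSON keeps the output trivial to consume from CMake, a CI stage, etc.
    println!("{}", serde_json::to_string_pretty(&meta).unwrap());
}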

Why?

To make it easy for other tools to ingest information about the project. For instance, one might want to extract the version of the project to use during a build step in CMake, or extract the information in a CI stage.

Expose pixi metadata as environment variables

It would be nice if there were environment variables exposed about the configuration that you are running in. For instance:

$PIXI_PACKAGE_VERSION
$PIXI_PACKAGE_NAME
$PIXI_PACKAGE_CHANNELS
$PIXI_PACKAGE_ROOT_DIR

Support virtual packages

Virtual packages are currently not supported at all. This is a problem when you want to work with GPU-based packages.

Ideally, the tool provides a baseline set of virtual packages per platform that are used as the virtual packages for the solver. The user should be able to override these by adding entries to the project manifest. Something along the lines of:

[project]
name = "myproject"

[virtual_packages]
__cuda = "12.1"

The tool also needs to check whether the user's system actually meets the minimum requirements specified in the project manifest, and it should probably error out if that is not the case.
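
A minimal sketch of that compatibility check, assuming both the detected virtual package version and the manifest minimum are available as plain version strings; the function name and parsing are illustrative, not pixi's actual implementation:

// Illustrative only: compare a detected virtual package version (e.g. __cuda)
// against the minimum declared under [virtual_packages] in the manifest.
fn meets_minimum(detected: &str, required_min: &str) -> bool {
    let parse = |v: &str| -> Vec<u64> {
        v.split('.').filter_map(|part| part.parse().ok()).collect()
    };
    // Lexicographic comparison of numeric components: [12, 1] >= [11, 8].
    parse(detected) >= parse(required_min)
}

fn main() {
    assert!(meets_minimum("12.1", "11.8"));  // system is new enough
    assert!(!meets_minimum("11.2", "12.1")); // tool should error out here
}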

Check the lock file for multiple platforms

Currently, when checking whether the lock file is up to date, we simply check if any of the packages in the lock file match the matchspec of a package in the project manifest. This is incorrect; the check should be performed for each supported platform individually.
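
A rough sketch of what the per-platform check could look like; LockedPackage and the matchspec evaluation are placeholders here, not the actual pixi/rattler types:

// Placeholder type; the real code would use the lock file and matchspec types from rattler.
struct LockedPackage {
    platform: String,
    name: String,
}

impl LockedPackage {
    fn satisfies(&self, spec: &str) -> bool {
        // Real matchspec evaluation is elided in this sketch; only the name is compared.
        self.name == spec.split_whitespace().next().unwrap_or_default()
    }
}

// The lock file only counts as up to date if the check holds for *every*
// supported platform, not just the one we happen to be running on.
fn lock_file_up_to_date(platforms: &[&str], specs: &[&str], locked: &[LockedPackage]) -> bool {
    platforms.iter().all(|platform| {
        specs.iter().all(|spec| {
            locked
                .iter()
                .filter(|pkg| pkg.platform == *platform)
                .any(|pkg| pkg.satisfies(spec))
        })
    })
}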

Add "dev-dependencies" and more

Besides "dependencies" we should also add "dev-dependencies". These packages are also installed in the environment but they are not added to the conda package as dependencies.

One tricky thing with dev-dependencies is that they can add additional constraints on the normal dependencies. This could result in a lower version of a dependency being installed during development, while the released conda package would end up with a higher version. This could cause all sorts of "works-on-my-machine" issues.

We could choose to first resolve the normal dependencies and pin them, followed by resolving the dev-dependencies. If a dev-dependency required a lower version of a pinned package, the resolution would fail. That is also not ideal, but it might be more transparent than the alternative.
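
A sketch of that two-stage idea; solve() is a stand-in for the actual conda solver and all of the names here are illustrative:

struct Record {
    name: String,
    version: String,
}

// Stand-in for the real solver call.
fn solve(_specs: &[String]) -> Result<Vec<Record>, String> {
    unimplemented!("the real implementation would call the conda solver")
}

fn resolve_with_dev(deps: &[String], dev_deps: &[String]) -> Result<Vec<Record>, String> {
    // 1. Resolve the regular dependencies on their own and pin them exactly.
    let normal = solve(deps)?;
    let mut specs: Vec<String> = normal
        .iter()
        .map(|record| format!("{}=={}", record.name, record.version))
        .collect();

    // 2. Resolve the dev-dependencies together with those pins. If a dev-dependency
    //    needs a lower version of a pinned package, this solve fails loudly instead
    //    of silently changing the regular dependencies.
    specs.extend(dev_deps.iter().cloned());
    solve(&specs)
}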

Animated SVG slows down Safari

(Screenshot: 2023-06-14 at 09:40:48)

This is with the latest Safari on the latest macOS, viewing the repo home/README on a MacBook Pro (M2 Pro).

It really is unbearable; I resorted to reading the Markdown source without rendering.

Add a `remove` command

Add a command to remove a dependency from the project file. Basically the opposite of the add command.

cargo install from source fails

Hey,

using the pre-built binary worked perfectly, but running cargo install --git https://github.com/prefix-dev/pixi.git produced the following output:

   Compiling pixi v0.0.2 (/Users/ytatar/.cargo/git/checkouts/pixi-2602b418e59cbe0c/3e10b5a)
error[E0308]: mismatched types
   --> src/repodata.rs:129:9
    |
127 |     let result = fetch::fetch_repo_data(
    |                  ---------------------- arguments to this function are incorrect
128 |         channel.platform_url(platform),
129 |         client,
    |         ^^^^^^ expected struct `rattler_networking::AuthenticatedClient`, found struct `reqwest::Client`
    |
note: function defined here
   --> /Users/ytatar/.cargo/git/checkouts/rattler-3fdeff0ba0214908/a697958/crates/rattler_repodata_gateway/src/fetch/mod.rs:249:14
    |
249 | pub async fn fetch_repo_data(
    |              ^^^^^^^^^^^^^^^

error[E0308]: mismatched types
   --> src/environment.rs:538:21
    |
535 |                 .get_or_fetch_from_url(
    |                  --------------------- arguments to this function are incorrect
...
538 |                     download_client.clone(),
    |                     ^^^^^^^^^^^^^^^^^^^^^^^ expected struct `rattler_networking::AuthenticatedClient`, found struct `reqwest::Client`
    |
note: associated function defined here
   --> /Users/ytatar/.cargo/git/checkouts/rattler-3fdeff0ba0214908/a697958/crates/rattler/src/package_cache.rs:179:18
    |
179 |     pub async fn get_or_fetch_from_url(
    |                  ^^^^^^^^^^^^^^^^^^^^^

For more information about this error, try `rustc --explain E0308`.
error: could not compile `pixi` due to 2 previous errors
error: failed to compile `pixi v0.0.2 (https://github.com/prefix-dev/pixi.git#3e10b5a5)`, intermediate artifacts can be found at `/var/folders/jb/1wt_13hj6hnfb8zry_121wcc0000gn/T/cargo-installAiuAH0`

Am I maybe misunderstanding something regarding the installation instructions?

Platform specific configuration

Multi-platform projects will require platform-specific dependencies or configuration.

Cargo supports this through the [target] section.

How to format them in TOML?

[dependencies]
bla = "3.1.*"
windows_specific = { version = "3.1.*", only_windows = true }

or Cargo's approach:

[dependencies]
bla = "3.1.*"

[target.'sel(win)'.dependencies]
windows_specific = "3.1"

I like the latter the most because it's clearer that this is an exception that only applies in a certain scenario.

Selectors or platforms?

Cargo uses the cfg(..) syntax in its selectors. I think this is pretty cool because it allows you to do more complex things, like selecting for two platforms at once or for specific architectures (ARM vs. x86).

We could also simplify things a little and only use platform names as selectors, e.g. osx-arm64 instead of sel(osx). Cargo also supports specifying the target directly.

I would opt to go for the full selector experience because it provides a lot of flexibility.

Todo

  • Figure out our approach
  • Implement it

Reachable variables in the commands

I was thinking of a custom command to update the version in all files in the project, but for that I would need the version from pax.toml.

It would be nice if you could access values from the project file, and possibly some other things, in your commands, e.g.

[project]
name = "pax"
version = "0.1.0"
description = "Add a short description here"
authors = ["Ruben Arts <[email protected]>"]
channels = ["conda-forge"]
platforms = ["linux-64", "win-64", "osx-64", "osx-arm64"]

[commands]
custom_command = "echo project_name: $PROJECT_NAME version: $PROJECT_VERSION desc: $PROJECT_DESCRIPTION"

and possibly more information about the configuration you provide.

Add a proper README and LICENSES

We need to have a proper readme with licenses!

  • If we have a logo it should be in there.
  • There should be build and docs badges.
  • A good text explaining what this is
  • Examples
  • Should we use the same license as rattler? (BSD-3)

Support `pyproject.toml` as an alternative to `pixi.toml`

In the Python world it would be far from ideal to need both a pyproject.toml and a pixi.toml. Ideally, pixi would also be able to read a pyproject.toml file and use that as the project file.

Some care needs to be taken with the distinction between conda and PyPI packages. Since pyproject.toml originally specifies PyPI packages, I think we would need to add another section for the conda packages.

[tool.pixi.dependencies]
some_conda_package = "*"

Reconstruct `RepoDataRecord` from lock file

We might want to rethink the conda-lock file format a little, because by itself it is not quite enough to reproduce the same environment that plain conda would create. In fact, some crucial information required during installation is missing (spoiler: noarch).

Ideally, we would be able to recreate the RepoDataRecord struct for every package in the conda-lock file. This struct contains all the information you would expect in the conda-meta folder of an environment, and a number of these fields are required to properly install packages. However, the current conda-lock file format doesn't include all the fields present in either the RepoDataRecord or the PackageRecord (i.e. the thing that resides in repodata.json). Fields like license, arch, noarch, platform, and subdir are missing. Although we can probably reconstruct some of this information from the subdir name, we can't reconstruct all of it. The missing noarch in particular might be a problem (is it a Python noarch package or not, for instance).

See also this related issue: conda/conda-lock#363 (comment)

Ideas for improving the CLI output

  1. There are a couple of progress bars that I would like to see removed once their operation finishes. For instance, when fetching repodata there are multiple progress bars, one per repodata location. I would prefer to see them while the download is in progress, but once all repodata has been fetched successfully they should be removed. I don't think this information is very useful to the average user when the download succeeds.

In general, reducing the total output of the tool should be something to strive for.
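
Assuming the progress bars are driven by something like the indicatif crate (an assumption on my part, not a statement about the current implementation), removing a bar once its operation finishes is straightforward; a minimal sketch:

use indicatif::ProgressBar;
use std::{thread, time::Duration};

fn main() {
    let bar = ProgressBar::new(100);
    for _ in 0..100 {
        bar.inc(1);
        thread::sleep(Duration::from_millis(5));
    }
    // Instead of leaving a completed bar on screen, remove it entirely once the
    // repodata fetch (or whatever operation it tracks) has finished successfully.
    bar.finish_and_clear();
    println!("repodata fetched");
}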

Get rid of the "sync" command

Currently, when you want to do something with the environment created by the tool, you have to run pax sync. The sync command updates the lock file and the conda environment.

It has the following steps:

  1. Read the current lockfile
  2. Read the project manifest
  3. Check if the dependencies in the lockfile match the specs in the project manifest.
  4. If any of the matchspecs in the project manifest do not have a package in the lockfile that satisfies it, we update the lockfile.
  5. Read the packages in the environment
  6. Create a transaction to update the environment to the packages in the lockfile (for the current platform). This might be empty.
  7. Apply the transaction.

The sync command is fragile: if you forget to run it, you might run a command in an "old" environment, which can be hard to notice.

Ideally, we get rid of the command completely and execute the above steps when working with the environment. The following commands should be affected.

  • pax add
  • pax run

Special care has to be taken with the run command, because we should strive to keep its overhead as small as possible. If you have programs like formatters or linters that themselves only run for a short time, adding a large overhead would kill productivity. We should strive to keep the overhead under 100 ms.

Add pixi to homebrew

For a project to be added to homebrew-core, it must be over 2 months(?) old and have >= 75 stars.
Until then I could build a recipe, host it on my own tap, and migrate it when all requirements are met.

Add an `info` command

As a user I would like to see the information the tool reads from my system, so that I can easily debug issues. This would also help with future issue triage, since users could paste the output and make problems easier to investigate.

conda info is a good example:

     active environment : None
            shell level : 0
       user config file : /home/rarts/.condarc
 populated config files : /usr/share/conda/condarc.d/defaults.yaml
          conda version : 4.13.0
    conda-build version : not installed
         python version : 3.11.3.final.0
       virtual packages : __cuda=12.1=0
                          __linux=6.2.15=0
                          __glibc=2.36=0
                          __unix=0=0
                          __archspec=1=x86_64
       base environment : /usr  (read only)
      conda av data dir : /usr/etc/conda
  conda av metadata url : None
           channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /var/cache/conda/pkgs
                          /home/rarts/.conda/pkgs
       envs directories : /home/rarts/.conda/envs
                          /usr/envs
               platform : linux-64
             user-agent : conda/4.13.0 requests/2.28.1 CPython/3.11.3 Linux/6.2.15-200.fc37.x86_64 fedora/37 glibc/2.36
                UID:GID : 1000:1000
             netrc file : None
           offline mode : False

We could possibly extend this with project information.

Incremental lockfile updates

When the lock file is out of date it is completely regenerated. We should change this behavior so that the contents of the previous lock file are passed in as SolverTask::locked_packages. This makes sure package versions remain stable unless keeping them would result in an unsolvable situation.
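
Something along these lines; apart from locked_packages (named above), the surrounding SolverTask field names are written from memory of rattler_solve and should be treated as an assumption:

// Hedged sketch, not verified against the current rattler_solve API.
let task = SolverTask {
    specs: manifest_specs,                       // matchspecs from the project manifest
    available_packages: &fetched_repodata,       // freshly fetched repodata
    locked_packages: previous_lock_file_records, // records parsed from the existing pixi.lock
    pinned_packages: Vec::new(),
    virtual_packages: detected_virtual_packages,
};
// Because the previous records are passed in as "locked", the solver keeps them
// unless doing so would make the problem unsolvable.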

Test the generated binaries

Check if the binaries that are created in CI actually work.

Basically download them and run them to see if they work!

`add` command: adding the same package twice but with a different version

I want to re-add a package but with a different version:

pax add python
✔ Added python 3.11.*

# Ah damn I wanted a different version
pax add python=3.8

Right now you would get an error saying:

➜ pax add python=3.11
error: could not determine any available versions for python on linux-64. Either the package could not be found or version constraints on other dependencies result in a conflict.

Caused by:
    unsolvable

Document installation from binaries

Thanks a lot for working on and open-sourcing pixi!

It would be nice if installation from the binaries (available in the GitHub release) was documented in the README, for people who are not familiar with Rust/Cargo.

👷‍♀ Add the ability to "build" a conda package

This probably requires some extra experimentation.

As a building block, we should add the ability to build a conda package from a project. If we can always construct a conda package from a pixi project we can enable source builds pretty easily by building any source package into a conda package first.

I envision a command called

pixi build

that (on success) constructs a new package archive containing the build output of the project. Building the project simply calls the "build" command/script (#11) and passes in an environment variable or replaces something in the "build" command/script (with minijinja or whatever). The responsibility of outputting the correct build files is left up to the user.

I don't know how to handle creating buildstrings, multiple outputs, binary/text patching, etc. Probably have to investigate that a little more.

Add the `install` command (shims)

We would like to be able to install conda packages on your system, like condax does. This could then function as a replacement for scoop/homebrew/npm.

pax install cowpy

This will create an environment called cowpy (or whatever) and add a binary: ~/.pax/bin/cowpy (the shim). When executed, the shim sources the environment and runs cowpy inside it.

After running the above command, this should just work:

cowpy

Note that this is not tied to a project or environment; it "installs" the cowpy command globally.
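
A hedged sketch of writing such a shim on unix-like systems; the ~/.pax layout and the "source the environment" behaviour come from the description above, while every name and path in the code is purely illustrative:

use std::fs;
use std::io::Write;
use std::os::unix::fs::PermissionsExt;
use std::path::Path;

// Illustrative sketch only: write a tiny shell script that puts the environment's
// bin directory on PATH and then execs the real binary.
fn write_shim(bin_dir: &Path, env_dir: &Path, exe: &str) -> std::io::Result<()> {
    let env = env_dir.display();
    let shim = format!(
        "#!/bin/sh\nexport PATH=\"{env}/bin:$PATH\"\nexec \"{env}/bin/{exe}\" \"$@\"\n"
    );
    let path = bin_dir.join(exe);
    let mut file = fs::File::create(&path)?;
    file.write_all(shim.as_bytes())?;
    // The shim needs the executable bit so it can be invoked like any other command.
    fs::set_permissions(&path, fs::Permissions::from_mode(0o755))?;
    Ok(())
}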
