prefix-dev / pixi
Package management made easy
Home Page: https://pixi.sh
License: BSD 3-Clause "New" or "Revised" License
There is a utility struct called Prefix in the current repository. It contains high-level code to interact with a conda environment on disk. It would be nice to move this to the rattler crate and add additional functionality to it (like executing a transaction on it).
https://github.com/prefix-dev/pax/blob/7231f7ccccdb39f7e75dc3059aa6fd3abdf402ee/src/prefix.rs#L7-L8
Not sure about this feature request, maybe it's stupid, just an idea.
Would be nice to have something like an add-command that you can use when you're happy with something you prototyped with run:
px run super dooper
# OK this is cool, let's add it to the commands
px add-command mycmd super dooper
px run mycmd
Allow users to define virtual packages in the project configuration.
Currently, run only supports a single argument. We need to capture extra arguments and pass them to the invocation. This requires proper quoting in the activation scripts!
This doesn't work:
pax run python --version
This does:
pax run python
See: https://docs.rs/clap/latest/clap/struct.Arg.html#method.trailing_var_arg
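The quoting issue can be sketched with a small helper (a hypothetical function, not pixi's actual code): extra arguments would be shell-quoted before being embedded in an activation script.

```rust
// Hypothetical helper: single-quote an argument for POSIX shells so that
// something like `pax run python --version "a b"` survives being embedded
// in an activation script. Not the real pixi implementation.
fn quote_arg(arg: &str) -> String {
    if !arg.is_empty() && arg.chars().all(|c| c.is_ascii_alphanumeric() || "-_./=".contains(c)) {
        arg.to_string() // safe characters need no quoting
    } else {
        // wrap in single quotes, escaping embedded single quotes as '\''
        format!("'{}'", arg.replace('\'', r"'\''"))
    }
}

fn main() {
    let args = ["python", "--version", "a b", "it's"];
    let line: Vec<String> = args.iter().map(|a| quote_arg(a)).collect();
    // prints: python --version 'a b' 'it'\''s'
    println!("{}", line.join(" "));
}
```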
We need a place where people can find documentation on how to use the tool.
Examples:
Add a command to remove a globally installed package. Basically the opposite of install.
We could create a simple Windows installer for pixi. That way people who are less comfortable with Cargo or the command line (students?) could very easily install the tool.
We could use https://github.com/volks73/cargo-wix for this.
When I'm in fish and use pixi completion, does it make sense to have an implicit -s fish?
Add a shell command to spawn a subshell with an activated conda environment.
Why
To avoid some of the overhead introduced by consecutive run invocations, and to not have to type pixi run every time.
Inspiration
The tool contains code to check if a LockedDependency matches a NamelessMatchSpec; this should be moved to rattler itself.
Just like with npm, add a section in a project manifest that lists several shortcut commands. E.g.
[project]
name = "myproject"
[commands]
test = "pytest ."
Invoking the command pax run test will execute pytest . in the environment.
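Looking up such a shortcut could work roughly like this (a toy sketch with a plain HashMap; real pixi would parse the TOML and run the command inside the activated environment):

```rust
use std::collections::HashMap;

// Toy command table, as it would be parsed from the [commands] section.
// Splitting on whitespace is a simplification; real quoting rules would
// need proper shell-style parsing.
fn resolve_command(commands: &HashMap<&str, &str>, name: &str) -> Option<Vec<String>> {
    commands
        .get(name)
        .map(|line| line.split_whitespace().map(String::from).collect())
}

fn main() {
    let mut commands = HashMap::new();
    commands.insert("test", "pytest .");

    // `pax run test` would look up "test" and execute `pytest .`.
    println!("{:?}", resolve_command(&commands, "test"));
    println!("{:?}", resolve_command(&commands, "lint"));
}
```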
Add a command to update one or more dependencies in the pixi.lock file without updating the project file. If the lock file is missing, one will be created.
Would be nice if we can pull certain packages from certain channels using the conda-canary/label/dev::conda syntax.
Perhaps a better syntax for us would be:
conda = { version = "*", channel = "conda-canary/label/dev" }
This requires that we parse dependencies a little differently.
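As a minimal sketch of the channel::package shorthand (a toy parser for illustration only; rattler's actual MatchSpec handling is far more involved):

```rust
// Toy parser for the `channel::package` shorthand, e.g.
// "conda-canary/label/dev::conda" -> channel + package name.
// This only illustrates where the channel would be split off.
fn split_channel(spec: &str) -> (Option<&str>, &str) {
    match spec.split_once("::") {
        Some((channel, name)) => (Some(channel), name),
        None => (None, spec),
    }
}

fn main() {
    println!("{:?}", split_channel("conda-canary/label/dev::conda"));
    println!("{:?}", split_channel("python"));
}
```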
Add a login or auth command to be able to use the prefix.dev private channels.
When I do pixi add python pytest it installs python and pytest.
When I don't need pytest anymore, remove it from the pixi.toml, and then run pixi to update the env, it doesn't get removed.
I expect it to find the dependencies that no longer have a root dependency and delete those from the env.
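The pruning described above could work roughly like this (a toy reachability sketch with made-up package data, not pixi's actual environment model): packages still reachable from the manifest's root dependencies are kept, everything else is removed.

```rust
use std::collections::{HashMap, HashSet};

// Toy dependency graph: package -> its direct dependencies.
// Everything reachable from the manifest's roots should be kept.
fn reachable(roots: &[&str], deps: &HashMap<&str, Vec<&str>>) -> HashSet<String> {
    let mut seen: HashSet<String> = HashSet::new();
    let mut stack: Vec<&str> = roots.to_vec();
    while let Some(pkg) = stack.pop() {
        if seen.insert(pkg.to_string()) {
            if let Some(children) = deps.get(pkg) {
                stack.extend(children.iter().copied());
            }
        }
    }
    seen
}

fn main() {
    let mut deps: HashMap<&str, Vec<&str>> = HashMap::new();
    deps.insert("python", vec!["openssl"]);
    deps.insert("pytest", vec!["pluggy", "python"]);

    let installed = ["python", "openssl", "pytest", "pluggy"];
    // `pytest` was removed from pixi.toml; only `python` remains a root.
    let keep = reachable(&["python"], &deps);
    let prune: Vec<&str> = installed.iter().copied().filter(|p| !keep.contains(*p)).collect();
    println!("{:?}", prune); // pytest and pluggy are no longer reachable
}
```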
Add a command to get metadata information from the project in a machine-readable format.
The extractable data should include the following:
We could either format the data simply as JSON, or we could use Go templates to make it really customizable (there is a crate for that: https://github.com/fiji-flo/gtmpl-rust). Go templates are used by, for instance, docker or kubernetes.
Why?
To make it easy for other tools to ingest information about the project. For instance, one might want to extract the version of the project to use it during a build step in CMake. Or one might want to extract the information in a CI stage.
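For the JSON option, a minimal hand-rolled sketch could look like this (field names are invented for illustration; a real implementation would serialize the actual project manifest, e.g. with serde):

```rust
// Hand-rolled JSON output for a hypothetical machine-readable info command.
// Field names here are invented; a real implementation would serialize
// the actual project manifest (e.g. with serde_json).
fn project_json(name: &str, version: &str, channels: &[&str]) -> String {
    let chans: Vec<String> = channels.iter().map(|c| format!("\"{}\"", c)).collect();
    format!(
        "{{\"name\":\"{}\",\"version\":\"{}\",\"channels\":[{}]}}",
        name,
        version,
        chans.join(",")
    )
}

fn main() {
    // prints: {"name":"myproject","version":"0.1.0","channels":["conda-forge"]}
    println!("{}", project_json("myproject", "0.1.0", &["conda-forge"]));
}
```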
Inspiration
Currently running:
pax add boa
pax add python
results in an unsolvable state. This is because when adding python we try to add the highest version, which is incompatible with boa.
It would be nice if there were environment variables exposed about the configuration that you are running in. For instance:
$PIXI_PACKAGE_VERSION
$PIXI_PACKAGE_NAME
$PIXI_PACKAGE_CHANNELS
$PIXI_PACKAGE_ROOT_DIR
Virtual packages are currently not supported at all. This is a problem when you want to work with GPU-based packages.
Ideally, the tool provides a baseline set of virtual packages per platform that are used as the virtual packages for the solver. The user should be able to override these by adding them to the project manifest. Something along the lines of:
[project]
name = "myproject"
[virtual_packages]
__cuda = "12.1"
The tool also needs to check whether the user's system actually meets the minimum requirements specified in the project manifest, and should probably error out if that is not the case.
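That compatibility check can be sketched like this (a naive dotted-version comparison with invented names; real code would use rattler's virtual package detection):

```rust
// Naive dotted-version comparison for a toy compatibility check.
// Real code would use rattler's version types and virtual package detection.
fn version_parts(v: &str) -> Vec<u64> {
    v.split('.').filter_map(|p| p.parse().ok()).collect()
}

// Does the system-provided virtual package satisfy the manifest's minimum?
// Vec<u64> compares lexicographically, which matches dotted versions here.
fn satisfies_minimum(system: &str, required: &str) -> bool {
    version_parts(system) >= version_parts(required)
}

fn main() {
    // System reports __cuda=12.1; manifest requires at least 11.8: fine.
    assert!(satisfies_minimum("12.1", "11.8"));
    // A system with __cuda=11.2 fails a __cuda >= 12.1 requirement: error out.
    assert!(!satisfies_minimum("11.2", "12.1"));
    println!("ok");
}
```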
Currently, when checking if the lockfile is up to date, we simply check whether any package in the lockfile matches the matchspec of a package in the project manifest. This is incorrect; the check should be done for each supported platform individually.
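The per-platform check could look roughly like this (a toy model matching on names only; the real check would match full matchspecs against locked records):

```rust
use std::collections::HashMap;

// Toy model: the lockfile stores, per platform, the locked package names.
// A lockfile is only up to date if *every* platform satisfies *every*
// manifest dependency (names only here; real code matches full specs).
fn lockfile_up_to_date(
    manifest_deps: &[&str],
    platforms: &[&str],
    locked: &HashMap<&str, Vec<&str>>,
) -> bool {
    platforms.iter().all(|platform| {
        let pkgs = locked.get(platform).map(|v| v.as_slice()).unwrap_or(&[]);
        manifest_deps.iter().all(|dep| pkgs.contains(dep))
    })
}

fn main() {
    let mut locked: HashMap<&str, Vec<&str>> = HashMap::new();
    locked.insert("linux-64", vec!["python", "pytest"]);
    locked.insert("osx-arm64", vec!["python"]); // pytest missing here!

    // Checking only linux-64 would wrongly report the lockfile as fine;
    // checking all platforms catches the missing osx-arm64 entry.
    let ok = lockfile_up_to_date(&["python", "pytest"], &["linux-64", "osx-arm64"], &locked);
    println!("up to date: {}", ok);
}
```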
When running a command of an installed package, multiple arguments are treated as a single argument on Windows.
Besides "dependencies" we should also add "dev-dependencies". These packages are also installed in the environment but they are not added to the conda package as dependencies.
One tricky thing with dev-dependencies is that they can add additional constraints on the normal dependencies. This could cause a lower version of a dependency to be installed during development, whereas when the package is installed as a conda package we would get a higher version. This could cause all sorts of "works-on-my-machine" issues.
We could choose to first resolve the normal dependencies and pin them, followed by resolving the dev-dependencies. If a dev-dependency would require a lower version of a package, the resolution would fail. Also not ideal, but it might be more transparent than the alternative.
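The two-phase idea can be sketched with toy version ranges (integers stand in for versions; there is no real solver here):

```rust
// Toy constraint: an inclusive version range. Integers stand in for versions.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Range { lo: u32, hi: u32 }

impl Range {
    fn intersect(self, other: Range) -> Option<Range> {
        let lo = self.lo.max(other.lo);
        let hi = self.hi.min(other.hi);
        (lo <= hi).then_some(Range { lo, hi })
    }
    // "Solve" by picking the highest version in the range, then pin it.
    fn pick_highest(self) -> u32 { self.hi }
}

fn main() {
    // Phase 1: resolve the normal dependency and pin it.
    let normal = Range { lo: 3, hi: 11 };   // e.g. python >=3,<=11
    let pinned = normal.pick_highest();     // pin python=11
    let pin = Range { lo: pinned, hi: pinned };

    // Phase 2: add the dev-dependency's extra constraint on python.
    let dev_constraint = Range { lo: 3, hi: 10 }; // dev tool needs python<=10
    match pin.intersect(dev_constraint) {
        Some(r) => println!("solved at {:?}", r),
        // The pin conflicts with the dev constraint: fail loudly rather
        // than silently lowering the pinned version.
        None => println!("dev-dependency conflicts with pinned version"),
    }
}
```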
Add a command to remove a dependency from the project file. Basically the opposite of the add command.
Hey,
using the pre-built binary worked perfectly, but running cargo install --git https://github.com/prefix-dev/pixi.git caused the following output:
Compiling pixi v0.0.2 (/Users/ytatar/.cargo/git/checkouts/pixi-2602b418e59cbe0c/3e10b5a)
error[E0308]: mismatched types
--> src/repodata.rs:129:9
|
127 | let result = fetch::fetch_repo_data(
| ---------------------- arguments to this function are incorrect
128 | channel.platform_url(platform),
129 | client,
| ^^^^^^ expected struct `rattler_networking::AuthenticatedClient`, found struct `reqwest::Client`
|
note: function defined here
--> /Users/ytatar/.cargo/git/checkouts/rattler-3fdeff0ba0214908/a697958/crates/rattler_repodata_gateway/src/fetch/mod.rs:249:14
|
249 | pub async fn fetch_repo_data(
| ^^^^^^^^^^^^^^^
error[E0308]: mismatched types
--> src/environment.rs:538:21
|
535 | .get_or_fetch_from_url(
| --------------------- arguments to this function are incorrect
...
538 | download_client.clone(),
| ^^^^^^^^^^^^^^^^^^^^^^^ expected struct `rattler_networking::AuthenticatedClient`, found struct `reqwest::Client`
|
note: associated function defined here
--> /Users/ytatar/.cargo/git/checkouts/rattler-3fdeff0ba0214908/a697958/crates/rattler/src/package_cache.rs:179:18
|
179 | pub async fn get_or_fetch_from_url(
| ^^^^^^^^^^^^^^^^^^^^^
For more information about this error, try `rustc --explain E0308`.
error: could not compile `pixi` due to 2 previous errors
error: failed to compile `pixi v0.0.2 (https://github.com/prefix-dev/pixi.git#3e10b5a5)`, intermediate artifacts can be found at `/var/folders/jb/1wt_13hj6hnfb8zry_121wcc0000gn/T/cargo-installAiuAH0`
Am I maybe misunderstanding something regarding the installation instructions?
Get a default set of virtual dependencies per platform, so that the solve can continue using them even if you're not on that platform.
Add local virtual dependencies to see if this system can install it.
First version: install using a pip command like micromamba does.
Add support for it in the .toml.
Add it to the lock-file.
Multi-platform packages will require platform-specific dependencies or configurations.
Cargo supports this through the [target] section.
[dependencies]
bla = "3.1.*"
windows_specific = { version = "3.1.*", only_windows = true }
or Cargo's approach:
[dependencies]
bla = "3.1.*"
[target.'sel(win)'.dependencies]
windows_specific = "3.1"
I like the latter the most because it's clearer that this is an exception that only applies in a certain scenario.
Cargo uses the cfg(..) syntax in its selectors. I think this is pretty cool because it allows you to do more complex things like selecting for two platforms or for specific architectures (arm vs x86).
We could also simplify things a little bit and only use platform names as selectors, e.g. osx-arm64 vs sel(osx). Cargo also supports specifying the target directly.
I would opt to go for the full selector experience because it provides a lot of flexibility.
I was thinking of a custom command to update the versions of all files in the project, but for that I would need the version from the pax.toml.
It would be nice if you could access values from the project file, and possibly some more things, in your commands.
e.g.
[project]
name = "pax"
version = "0.1.0"
description = "Add a short description here"
authors = ["Ruben Arts <[email protected]>"]
channels = ["conda-forge"]
platforms = ["linux-64", "win-64", "osx-64", "osx-arm64"]
[commands]
custom_command = "echo project_name: $PROJECT_NAME version: $PROJECT_VERSION desc: $PROJECT_DESCRIPTION"
and possibly more info about the config you give.
We need a proper README with licenses!
We should add the ability to take a build package (see #12 ) and upload that to a specific channel.
In the Python world it would be far from ideal to need both a pyproject.toml and a pixi.toml. Ideally, pixi would also be able to read a pyproject.toml file and use that as the project file.
Some care needs to be taken in the distinction between conda and pypi packages. Since the pyproject.toml originally specifies pypi packages, I think we would need to add another section for the conda packages.
[tool.pixi.dependencies]
some_conda_package = "*"
Currently, if the channels are updated in a project manifest (pax.toml), the lockfile is not regenerated. This is wrong.
We might want to rethink the conda-lock file format a little, because in itself it is not entirely enough to produce the same environment as using just conda. In fact, some crucial information required during installation is missing (spoiler: noarch).
Ideally, we would be able to recreate the RepoDataRecord struct for every package in the conda-lock file. This struct contains all the information you would expect in the conda-meta folder of an environment, and a bunch of these fields are required to properly install packages. However, the current conda-lock file format doesn't include all the fields present in either the RepoDataRecord or the PackageRecord (aka the thing that resides in repodata.json). Fields like license, arch, noarch, platform and subdir are missing. Although we can probably reconstruct some of this information from the subdir name, we can't reconstruct all of it. Especially the missing noarch might be a problem (is it a python noarch package or not, for instance).
See also this related issue: conda/conda-lock#363 (comment)
In general, reducing the total output of the tool should be something to strive for.
Currently, when you want to do something with the environment created by the tool you have to run pax sync. The sync command will update the lock file and the conda environment.
It has the following steps:
The sync command is a fragile thing: if you forget to run it you might run a command from an "old" environment, which might be hard to notice.
Ideally, we get rid of the command completely and execute the above steps when working with the environment. The following commands should be affected:
pax add
pax run
Special care has to be taken with the run command because we should strive to make its overhead as small as possible. The reason is that if you have programs like formatters/linters that themselves only run for a short period of time, adding a large overhead would kill productivity. We should strive to keep the overhead <100ms.
For a project to be added to homebrew-core, it must be over 2 months(?) old and have >= 75 stars (the threshold has changed a few times).
Until then I could build a recipe and host it on my own tap and migrate it when all requirements are met.
It would be nice to have the executables for the different systems directly available in the releases artifacts instead of the tar/zipballs of the binaries, similar to mamba-org/micromamba-releases.
This makes future automated installations (for example in CI) a bit easier.
As a user, I would like the tool to print the information it reads from my system, to make it easier to debug issues. This would also help future issue resolving, since users could paste the output into bug reports to make them easier to investigate.
conda info is a good example:
active environment : None
shell level : 0
user config file : /home/rarts/.condarc
populated config files : /usr/share/conda/condarc.d/defaults.yaml
conda version : 4.13.0
conda-build version : not installed
python version : 3.11.3.final.0
virtual packages : __cuda=12.1=0
__linux=6.2.15=0
__glibc=2.36=0
__unix=0=0
__archspec=1=x86_64
base environment : /usr (read only)
conda av data dir : /usr/etc/conda
conda av metadata url : None
channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /var/cache/conda/pkgs
/home/rarts/.conda/pkgs
envs directories : /home/rarts/.conda/envs
/usr/envs
platform : linux-64
user-agent : conda/4.13.0 requests/2.28.1 CPython/3.11.3 Linux/6.2.15-200.fc37.x86_64 fedora/37 glibc/2.36
UID:GID : 1000:1000
netrc file : None
offline mode : False
We could possibly extend this with project information.
When the lockfile is out of date, it is completely regenerated. We should change this behavior so that the content of the previous lockfile is passed in SolverTask::locked_packages. This will make sure that package versions remain stable unless this would result in an unsolvable situation.
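A toy illustration of preferring previously locked versions (integers as versions and a made-up pick function, not rattler's actual solver):

```rust
use std::collections::HashMap;

// Toy "solver" step: prefer the previously locked version if it still
// satisfies the requested range; otherwise fall back to the highest
// available version. Integers stand in for versions.
fn pick_version(name: &str, lo: u32, hi: u32, locked: &HashMap<&str, u32>) -> u32 {
    match locked.get(name) {
        Some(&v) if (lo..=hi).contains(&v) => v, // keep the locked version stable
        _ => hi,                                 // re-solve: take the highest
    }
}

fn main() {
    let mut locked = HashMap::new();
    locked.insert("python", 10);

    // Spec still allows the locked version: keep 10, don't jump to 11.
    println!("{}", pick_version("python", 3, 11, &locked));
    // Spec tightened past the locked version: fall back to the highest.
    println!("{}", pick_version("python", 11, 12, &locked));
}
```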
Check if the binaries that are created in CI actually work.
Basically download them and run them to see if they work!
I want to "re-add" a package but with a different version:
pax add python
✔ Added python 3.11.*
# Ah damn I wanted a different version
pax add python=3.8
Right now you would get an error saying:
➜ pax add python=3.11
error: could not determine any available versions for python on linux-64. Either the package could not be found or version constraints on other dependencies result in a conflict.
Caused by:
unsolvable
Thanks a lot for working on and open-sourcing pixi!
It would be nice if the installation from binaries (that are available in the GitHub release) was documented in the README, for people that are not familiar with Rust/Cargo.
This probably requires some extra experimentation.
As a building block, we should add the ability to build a conda package from a project. If we can always construct a conda package from a pixi project we can enable source builds pretty easily by building any source package into a conda package first.
I envision a command called pixi build that (on success) constructs a new package archive containing the build output of the project. Building the project simply calls the "build" command/script (#11) and passes in an environment variable or replaces something in the "build" command/script (with minijinja or whatever). The responsibility of outputting the correct build files is left up to the user.
I don't know how to handle creating buildstrings, multiple outputs, binary/text patching, etc. Probably have to investigate that a little more.
We would like to be able to install conda packages on your system like condax does. This could then function as a replacement for scoop/homebrew/npm.
pax install cowpy
will create an environment called cowpy (or whatever) and add a binary: ~/.pax/bin/cowpy (the shim). The shim file, when executed, will source the environment and execute cowpy in there.
After running the above command, this should just work:
cowpy
Note that this is not tied to a project or environment. It "installs" the cowpy command globally.
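Generating such a shim could look roughly like this (the paths and script layout are invented for illustration; a real implementation would point at the environment the tool actually created):

```rust
// Build the contents of a POSIX shim script for a globally installed
// command. The activation-script path and layout are invented here;
// a real implementation would target the environment's real files.
fn shim_script(env_activation: &str, exe: &str) -> String {
    format!(
        "#!/bin/sh\n\
         # Auto-generated shim: activate the environment, then exec the tool.\n\
         . \"{env}\"\n\
         exec \"{exe}\" \"$@\"\n",
        env = env_activation,
        exe = exe
    )
}

fn main() {
    // This string would be written to ~/.pax/bin/cowpy and made executable.
    let script = shim_script("/home/user/.pax/envs/cowpy/activate.sh", "cowpy");
    print!("{}", script);
}
```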