sarugaku / passa
Resolver implementation and toolset for generating and interacting with Pipfile and Pipfile.lock.
License: ISC License
I think this is the straightforward thing to do?
Similar to pypa/pipenv#2329. Currently we only detect the return value of WheelBuilder._build_one, but this masks the underlying error.
#40 might be the better approach to build wheels anyway.
Traceback (most recent call last):
File "runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "runpy.py", line 85, in _run_code
exec(code, run_globals)
File "passa/__main__.py", line 6, in <module>
main()
File "passa/cli/__init__.py", line 47, in main
result = f(options)
File "passa/cli/_base.py", line 73, in main
return type(self).__dict__["parsed_main"](options)
File "passa/cli/lock.py", line 13, in main
locker = BasicLocker(project)
File "passa/internals/lockers.py", line 79, in __init__
project.pipfile, "packages",
File "passa/internals/lockers.py", line 28, in _get_requirements
for name, package in model.get(section_name, {}).items()
File "passa/internals/lockers.py", line 26, in <dictcomp>
return {identify_requirment(r): r for r in (
File "passa/internals/lockers.py", line 28, in <genexpr>
for name, package in model.get(section_name, {}).items()
File "requirementslib/models/requirements.py", line 856, in from_pipfile
r = FileRequirement.from_pipfile(name, pipfile)
File "requirementslib/models/requirements.py", line 402, in from_pipfile
return cls(**arg_dict)
File "<attrs generated init 297509fcbd532345fa7483f0babc819f4f2b9a5a>", line 27, in __init__
File "attr/validators.py", line 106, in __call__
self.validator(inst, attr, value)
File "requirementslib/models/utils.py", line 157, in validate_path
raise ValueError("Invalid path {0!r}", format(value))
ValueError: ('Invalid path {0!r}', './../../lib/python')
Currently editables are installed by running setup.py develop, which means the project needs to be setuptools-based. I was looking into how flit sets up an editable installation and realised this can be replaced by a *.dist-info directory and a .pth file. It could be interesting to look into.
A problem with alternative build systems is that there needs to be a way to understand what exactly needs to be installed. setuptools has setup.py, but other systems can define things differently. This would probably require a spec change in either Pipfile (to specify what backend to use for an editable), or an extension to PEP 518 (so a build system can define how its editables should be installed). It would be a good topic to ponder in the long run.
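The .pth idea above can be sketched roughly. The helper name and paths here are hypothetical, and a real installer would also write the *.dist-info metadata next to the .pth file:

```python
import os
import tempfile

def write_editable_pth(site_packages, project_name, source_dir):
    """Sketch: make `source_dir` importable by dropping a .pth file
    into `site_packages`, the way a .pth-based editable install would.
    Each line in a .pth file is appended to sys.path at startup."""
    pth_path = os.path.join(site_packages, project_name + ".pth")
    with open(pth_path, "w") as f:
        f.write(source_dir + "\n")
    return pth_path

# Usage sketch: point "site-packages" at a temp dir for demonstration.
site = tempfile.mkdtemp()
pth = write_editable_pth(site, "mypackage", "/src/mypackage")
with open(pth) as f:
    print(f.read().strip())  # /src/mypackage
```

This only covers the import-path half; the interesting part of the flit approach is that the .pth file plus metadata together make the package look installed without running setup.py at all.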
$ passa add apache-airflow
...
Traceback (most recent call last):
File "/home/hawk/.pyenv/versions/3.7.0/bin/passa", line 11, in <module>
load_entry_point('passa', 'console_scripts', 'passa')()
File "/home/hawk/git/passa/src/passa/cli/__init__.py", line 47, in main
result = f(options)
File "/home/hawk/git/passa/src/passa/cli/add.py", line 95, in main
return super(Command, self).main(options)
File "/home/hawk/git/passa/src/passa/cli/_base.py", line 73, in main
return type(self).__dict__["parsed_main"](options)
File "/home/hawk/git/passa/src/passa/cli/add.py", line 47, in main
lockfile_diff = project.difference_lockfile(prev_lockfile)
File "/home/hawk/git/passa/src/passa/internals/projects.py", line 222, in difference_lockfile
that = lockfile[section_name]._data
TypeError: 'NoneType' object is not subscriptable
Retroactive tracking for the fix in 73f0f95.
[packages]
jinja2 = { git = 'https://github.com/pallets/jinja' }
flask = '*'
Jinja2 would have only one candidate (the VCS checkout), but this cannot satisfy Jinja2 (>=2.10) (specified by Flask). I’m not sure how this should be handled… maybe the checkout needs to be read for the package’s version. But how should we store it… just using Requirement.specifiers?
Experimenting with requirementslib, various markers disappeared. apipkg, for example, got the following from Pipenv:
python_version >= '2.7' and python_version != '3.1.*' and python_version != '3.2.*' and python_version != '3.3.*' and python_version != '3.0.*'
Did it get them from requires_python? Because not all python_version markers are lost. pathlib2, for example, keeps them fine. (Passa even gets more because it correctly merges markers!)
:(
When combining a parent metaset with a child one, the & operator should be used; when combining metasets derived from different parents, the | operator should be used.
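As a rough illustration of that rule (hypothetical helper names, combining markers as plain strings rather than real metaset objects):

```python
def combine_parent_child(parent_marker, child_marker):
    # A dependency applies only when both its own marker and its
    # parent's marker hold, so the two are joined with "and" (the
    # & direction described above).
    if not parent_marker:
        return child_marker
    if not child_marker:
        return parent_marker
    return "({0}) and ({1})".format(parent_marker, child_marker)

def combine_across_parents(markers):
    # The same package reached through different parents applies when
    # *any* path's marker holds, so paths are joined with "or" (the
    # | direction described above).
    markers = [m for m in markers if m]
    if not markers:
        return ""
    return " or ".join("({0})".format(m) for m in markers)

# A package required by one parent on Windows and by another on Python 2:
via_a = combine_parent_child("os_name == 'nt'", None)
via_b = combine_parent_child(None, "python_version < '3'")
print(combine_across_parents([via_a, via_b]))
# (os_name == 'nt') or (python_version < '3')
```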
I really want to mention this. Not sure what is causing it, or how best to improve it, but this problem needs to be tracked.
The phenomenon is that the resolver finishes relatively quickly, but then it takes forever to output the lock file :/
This fails if locked on 2.7:
black ; python_version >= '3.6'
I’m not completely sure what the best approach is. Should passa just always ignore requires-python when resolving? Should it still respect it, but choose the lowest possible version when resolution is otherwise impossible?
Erroring out is not an option because e.g. pip-shims specifies modutil; python_version >= '3.7', and locking would fail on everything below that if a project depends on it. But pip-shims is intended to be used on those lower versions.
Currently git+ requirements will all be put together. Not very straightforward.
The output format could also use some improvements. Those VCS requirements, for example, could have a comment in front of them with the package name. This would be much easier for human eyes to parse than having to find the #egg= suffix.
Also see pypa/pipenv#2818. (The template would need to be configurable as well.)
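A sketch of what such comment-prefixed output could look like (hypothetical helper and data shape; a real implementation would render through the configurable template):

```python
def format_requirement(name, entry):
    """Sketch: render a lock entry as a requirements line; prefix VCS
    lines with a comment naming the package, so readers need not hunt
    for the #egg= suffix."""
    if "git" in entry:
        line = "git+{0}#egg={1}".format(entry["git"], name)
        return "# {0}\n{1}".format(name, line)
    return "{0}{1}".format(name, entry.get("version", ""))

print(format_requirement("jinja2", {"git": "https://github.com/pallets/jinja"}))
# # jinja2
# git+https://github.com/pallets/jinja#egg=jinja2
print(format_requirement("flask", {"version": "==1.0.2"}))
# flask==1.0.2
```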
There is a lot of duplicate code in there right now. I think it is possible to refactor, but how?
I feel like Distlib has a lot of pieces available to do most of what pip does. It might be a good idea to release a version with the pip internals all replaced by Distlib, and work our way back to feature parity with pip. This would be better than the other way around because we can rely on user reports to find out what is missing, instead of receiving complaints about things that “no longer” work.
pypa/pipenv#2826. I think Passa currently also locks editable VCS to a specific ref.
Really though, non-local editable is just very confusing to me, to the point I suspect the whole design is just bad.
This is probably not a problem of Passa, but only a TODO when we eventually integrate Passa into Pipenv.
The problem is that Pipenv passes the line directly to pip. So, say you have
"foo": {
"editable": true,
"markers": "os_name == 'nt'",
"path": "."
}
This becomes -e .; os_name == 'nt', and Pipenv would run pip like this:
pip install -e ".; os_name == 'nt'"
The solution is probably to drop the markers completely from the line. They are not needed since Pipenv already evaluates them before running the pip command anyway.
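A minimal sketch of dropping the marker (hypothetical helper; a naive split on the first semicolon, where a real implementation would parse the requirement properly):

```python
def strip_markers(line):
    """Drop an environment marker from a requirement line before
    handing it to pip. The markers are redundant at this point because
    Pipenv has already evaluated them before invoking pip."""
    # PEP 508 separates the marker from the requirement with a semicolon.
    requirement, _, _marker = line.partition(";")
    return requirement.strip()

print(strip_markers("-e .; os_name == 'nt'"))  # -e .
print(strip_markers("requests>=2.19"))         # requests>=2.19
```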
TODO for self.
https://github.com/sarugaku/passa/blob/master/Pipfile.lock#L29
But requirementslib has an unconditional dependency to attrs. This is not right.
I think it is not very difficult now that subcommands are broken into their own modules. Just some logic abstraction to switch between ArgumentParser and subparsers.add_parser.
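A sketch of that abstraction (hypothetical class and argument names): one command class that can attach itself either to a fresh ArgumentParser when run standalone, or to a subparsers object when mounted under a top-level CLI:

```python
import argparse

class Command:
    """Sketch of a subcommand that can be built either standalone
    (its own ArgumentParser) or as a subparser of a parent CLI."""
    name = "lock"

    def add_arguments(self, parser):
        # Shared argument definitions live in one place.
        parser.add_argument("--project", default=".")

    def build_parser(self, subparsers=None):
        if subparsers is None:
            parser = argparse.ArgumentParser(prog=self.name)
        else:
            parser = subparsers.add_parser(self.name)
        self.add_arguments(parser)
        return parser

# Standalone invocation:
cmd = Command()
opts = cmd.build_parser().parse_args(["--project", "/tmp/proj"])
print(opts.project)  # /tmp/proj

# Mounted as a subcommand of a top-level parser:
root = argparse.ArgumentParser(prog="passa")
subs = root.add_subparsers(dest="command")
cmd.build_parser(subs)
opts = root.parse_args(["lock", "--project", "/tmp/proj"])
print(opts.command, opts.project)  # lock /tmp/proj
```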
https://travis-ci.com/sarugaku/passa/jobs/158215360
Something’s wrong with pip-shims, I assume?
🤦‍♂️
https://distlib.readthedocs.io/en/latest/reference.html#DependencyGraph
Gotta look into that.
pypa/pipenv#2834. Currently Passa would require a Pipfile at all times. This might be the wrong assumption.
Per @techalchemy 's comment on this (now locked) issue: pypa/pipenv#966 (comment)
To those who have been waiting patiently for a fix, please do try the resolver mentioned above— we are eager to see if it meets your needs. It implements proper backtracking and resolution and shouldn’t handle this upgrade strategy
I would love to help test this out, but it's not clear to me from the docs here how to test the workflow where I only want to upgrade dependencies (and their dependencies) that are necessary to the current operation (e.g. upgrading a top-level dependency). Does passa also support --selective-upgrade or --keep-outdated, or is there some other way of achieving similar behavior?
hey @uranusjr, looks like the passa cli is erroring out:
$ passa --help
Traceback (most recent call last):
File "/Users/johria/.pyenv/versions/3.6.5/bin/passa", line 7, in <module>
from passa.cli import cli
ImportError: cannot import name 'cli'
@uranusjr reopening pypa/pipenv#2800 (comment) here.
would the passa lock command be able to do what I want? and if not, would you be open to supporting it?
thanks!
cc @techalchemy
Run passa lock on the add_test_frost branch.
Traceback (most recent call last):
File "C:\Users\frostming\.virtualenvs\passa-ZPGQzv6O\Scripts\passa-script.py", line 11, in <module>
load_entry_point('passa', 'console_scripts', 'passa')()
File "d:\workspace\passa\src\passa\cli\__init__.py", line 47, in main
result = f(options)
File "d:\workspace\passa\src\passa\cli\lock.py", line 14, in run
return lock(project=options.project)
File "d:\workspace\passa\src\passa\actions\lock.py", line 12, in lock
success = lock(locker)
File "d:\workspace\passa\src\passa\operations\lock.py", line 13, in lock
locker.lock()
File "d:\workspace\passa\src\passa\models\lockers.py", line 158, in lock
provider.collected_requires_pythons,
File "d:\workspace\passa\src\passa\models\metadata.py", line 210, in set_metadata
dependencies, pythons, copy.deepcopy(traces),
File "d:\workspace\passa\src\passa\models\metadata.py", line 161, in _calculate_metasets_mapping
dependencies, pythons, key, trace, all_metasets,
File "d:\workspace\passa\src\passa\models\metadata.py", line 140, in _build_metasets
python = pythons[key]
KeyError: 'plette[validation]'
Say A requires B[x], C requires B[y], and D requires B (no extra); there is no guarantee which one will be present in the lock file. Some consolidation logic is required to merge the extras.
(I found this because Passa depends on Plette[validation], but RequirementsLib depends on Plette without specifying the extra, and now the Pipfile.lock no longer contains the extra flag.)
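A sketch of that consolidation step (hypothetical helper and data shapes): merge entries by base package name and take the union of all requested extras:

```python
def consolidate_extras(requirements):
    """Sketch: merge entries like B[x], B[y] and B into one entry per
    base name, taking the union of all requested extras, so no extra
    is lost depending on which entry happens to win."""
    merged = {}
    for name, extras in requirements:
        base = name.split("[", 1)[0]
        merged.setdefault(base, set()).update(extras)
    return merged

reqs = [
    ("plette", {"validation"}),  # wanted by Passa, with the extra
    ("plette", set()),           # wanted by RequirementsLib, no extra
]
print(consolidate_extras(reqs))  # {'plette': {'validation'}}
```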
Currently they are not treated specially, and are locked like any other packages. I don’t think this is the correct behaviour? They should be explicitly excluded from Pipfile.lock, right?
Also, what should we do if they are added to Pipfile? I can see three possibilities:
add
Currently we treat foo and foo[extra] as distinct entries. If foo depends on nothing, and foo[extra] depends on bar<1 in v1 but bar>=1 in v2, an input of
foo
foo[extra]
bar<1
would result in something like
foo==2
foo[extra]==1
bar==0.9
This is only theoretical at the moment, since Pipenv does not allow this kind of input, but I want to fix it so we can allow things like
[packages]
mypackage = {editable = true, path = "."}
[dev-packages]
mypackage = {editable = true, path = ".", extras = ["tests"]}
in the future.
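Until the entries are unified in the resolver, the inconsistency can at least be detected. A sketch (hypothetical helper and pin format) that flags base packages pinned to different versions across their extras variants:

```python
def find_extras_conflicts(pins):
    """Sketch: flag cases where foo and foo[extra] were resolved to
    different versions, which a correct resolver should never emit."""
    by_base = {}
    for name, version in pins.items():
        base = name.split("[", 1)[0]
        by_base.setdefault(base, set()).add(version)
    return {base for base, versions in by_base.items() if len(versions) > 1}

pins = {"foo": "2", "foo[extra]": "1", "bar": "0.9"}
print(find_extras_conflicts(pins))  # {'foo'}
```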
Add a verbose reporter based on the current reporter code. Some cleanup is needed to put the correct things in stdout and stderr.
Not sure if this is already done. pypa/pipenv#3026.