pypa / packaging
Core utilities for Python packages
Home Page: https://packaging.pypa.io/
License: Other
To reproduce, do the following:
virtualenv venv && venv/bin/pip install pyparsing==2.0.1 && venv/bin/pip install packaging==16.3 && venv/bin/python -c "import packaging.requirements"
you'll get:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "venv/local/lib/python2.7/site-packages/packaging/requirements.py", line 59, in <module>
MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
TypeError: __call__() takes exactly 2 arguments (1 given)
In pyparsing it was fixed here: https://sourceforge.net/p/pyparsing/code/256/ (v2.0.2)
As in the title, ==0.1.* doesn't match the version 0.1+upstream.2. I discovered this as a result of specifying a requirement ~=0.1.0 and getting 0.1 instead of 0.1+upstream.2. I can produce a reproducing example if necessary.
It appears from this job and probably others that the introduction of Setuptools 34 is breaking the tox setup for packaging. The traceback implicates pypa/setuptools#951 and pypa/pip#4264 as the cause.
Thoughts about something like Version.bump_major() -> Version?
It would be nice to keep PyPI releases and git tags in sync :)
The PyPI page looks strange:
https://pypi.python.org/pypi/packaging
The reST markup does not get rendered.
Suppose you have a version and a specifier, both as strings. The version is not PEP 440-compatible, but the specifier is. Creating version/specifier objects for each results in the version never meeting specifier:
>>> from packaging import version, specifiers
>>> v = version.parse('1.2.3.goofy')
>>> v
<LegacyVersion('1.2.3.goofy')>
>>> spec = specifiers.SpecifierSet('>=1.2')
>>> spec
<SpecifierSet('>=1.2')>
>>> v in spec
False
Presumably, this is because LegacyVersions "will always sort as less than a Version instance."
While some people can just switch to PEP 440-compatible versions, not everyone can. In particular, I'm using this to parse non-Python versions for some tools. I could write my own code for this, but since I already need the packaging package for other stuff, I figured I should just use what already exists.
For my particular issue, I can think of the following solutions:
- Add a legacy=False argument to SpecifierSet that will force all the specifiers it creates to be LegacySpecifiers. That way, if you know you need to handle legacy versions, you can enable it.
- Make Versions and LegacyVersions more compatible so that LegacyVersions don't always count as older.
Thoughts?
Related to pypa/pip#5800. No matter whether the repository fetcher is added or not, I feel it would be very beneficial if there is a well-tested way to parse entries in a simple API page. Something like
>>> Distribution.from_filename('zope.interface-4.5.0.tar.gz', name='zope.interface')
SourceDistribution(name='zope.interface', version='4.5.0', filename='zope.interface-4.5.0.tar.gz')
>>> Distribution.from_filename('zope.interface-4.5.0-cp36-cp36m-win_amd64.whl', name='zope.interface')
WheelDistribution(name='zope.interface', version='4.5.0', filename='zope.interface-4.5.0-cp36-cp36m-win_amd64.whl')
I'm trying to help fix the issue of broken package descriptions on PyPI (e.g. pypa/setuptools#1390).
Following a suggestion of @di in pypa/setuptools#1390 (comment), I would do the following:
- Add a validation module to packaging that provides a class with a validate() method and an error list attribute (an interface that can be easily used in Web frameworks like Pyramid and Django).
- Raise InvalidMetadataError if the specs are not met (instead of continuing to silently generate a broken PKG-INFO file, as today).
Does that sound like something that makes sense? Should I proceed and open a PR for a validation module?
Can the list of Categories at https://pypi.python.org/pypi/packaging/ be updated to also include 3.5 and 3.6, assuming the package supports them?
Hello,
I would like to get the equivalent of:
import pip
[(i.key, i.version) for i in pip.get_installed_distributions()]
with the packaging package.
Is it possible? Can anyone point me to an example?
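packaging itself only models versions, specifiers, and markers; it has no API for inspecting the installed environment. On Python 3.8+ the standard library can produce the same pairs; a sketch using importlib.metadata (not part of packaging):

```python
from importlib import metadata

# Collect (name, version) pairs for every distribution visible on sys.path,
# analogous to pip.get_installed_distributions().
installed = sorted(
    (dist.metadata["Name"] or "", dist.version or "")
    for dist in metadata.distributions()
)
print(installed[:5])
```

On older Pythons, pkg_resources.working_set from setuptools serves the same purpose.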
From setuptools issue.
Currently, if someone has a legacy specifier anywhere in their dependency chain, it will raise an InvalidSpecifier exception. This is fine if you want strict adherence to PEP 440; however, it doesn't help much if you have existing legacy specifiers anywhere.
The question is, how should we handle this?
We could just allow anything in the specifier fields and let the fact that LegacyVersion and Version are now directly comparable to each other handle things. The problem with this is that something like >=1.1.5.jaraco.20140924 would essentially mean "any PEP 440 compatible version, and then anything >=1.1.5.jaraco.20140924 using the old style scheme".
Another solution is to add a LegacySpecifier which implements the old style of behavior. Projects like setuptools would then do a similar thing as is done now with versions, where they first attempt to use the new behavior and, if it fails to parse, fall back to the old behavior. The problem with this one is that it's going to end up adding mixed specifier syntax, where you might have some dependencies using the PEP 440 syntax and some dependencies using the old style. It also has issues with combining multiple specifiers: what do we do if two projects have a dependency on foo>=1.0, which will be a Specifier instance, and a dependency on foo>=1.0.wat, which would be a LegacySpecifier instance?
I have a package (github.com/MetPX/Sarracenia) that has fewer features on Windows than it does on Linux, so it has fewer dependencies on Windows than on Linux. I needed platform-specific dependencies, and Googling led me to the following:
install_requires=[
    ...
    "psutil",
] + (["xattr"] if not sys.platform.startswith("win") else [])
which when I install with:
git clone https://github.com/MetPX/sarracenia
cd sarracenia
pip install -e .
does the right thing, in that it installs the xattr package on Linux but not on Windows.
But if I upload the package to PyPI and then use pip install on Windows, it tries to install xattr.
So... whose bug is that?
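The conditional above is evaluated once, on the machine that builds the sdist, which is why the published package behaves differently. The usual fix is a PEP 508 environment marker, which pip evaluates on the installing machine. A sketch using packaging's own Marker evaluation (the requirement string is illustrative, not taken from the project's actual setup.py):

```python
from packaging.requirements import Requirement

# A marker-based dependency: evaluated at install time on the target machine,
# so it survives being published to PyPI.
req = Requirement('xattr; sys_platform != "win32"')

# Evaluate the marker against hypothetical environments:
assert req.marker.evaluate({"sys_platform": "linux"}) is True    # installed
assert req.marker.evaluate({"sys_platform": "win32"}) is False   # skipped
```

In setup.py this would be spelled install_requires=["psutil", 'xattr; sys_platform != "win32"'], assuming a setuptools and pip recent enough to understand markers in install_requires.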
It'd be nice if there was an easy way to get a "normalized" version, e.g.:
>>> from packaging.version import parse
>>> parse('4.1') == parse('4.1.0')
True
>>> parse('4.1').normalized_version
'4.1.0'
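In the meantime, a small helper can build such a normalized form from Version.release; normalized below is a hypothetical helper, not part of packaging's API, and it ignores epoch and pre/post/dev suffixes:

```python
from packaging.version import Version

def normalized(version, width=3):
    # Pad the release segment with zeros so "4.1" and "4.1.0" render the same.
    release = Version(version).release
    padded = release + (0,) * (width - len(release))
    return ".".join(str(part) for part in padded)

assert normalized("4.1") == "4.1.0"
assert normalized("4.1.0") == "4.1.0"
assert Version("4.1") == Version("4.1.0")   # equality already holds today
```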
I'm not sure what the expected support for unicode in version strings is; at the moment, though, the result is an unexpected error.
I found this while looking at readthedocs/readthedocs.org#2506:
In [32]: Version(u'1.2.3+941_Добавить_на_с')
---------------------------------------------------------------------------
UnicodeEncodeError Traceback (most recent call last)
<ipython-input-32-febb99ecd5ec> in <module>()
----> 1 Version(u'1.2.3+941_Добавить_на_с')
/tmp/venv/lib/python2.7/site-packages/packaging/version.pyc in __init__(self, version)
200 match = self._regex.search(version)
201 if not match:
--> 202 raise InvalidVersion("Invalid version: '{0}'".format(version))
203
204 # Store the parsed out pieces of the version
UnicodeEncodeError: 'ascii' codec can't encode characters in position 10-17: ordinal not in range(128)
Even if unicode characters in versions aren't expected to be supported, it would be great to error about that directly, rather than emitting a UnicodeEncodeError.
Version info:
Running pip install -i https://devpi.net/hpk/dev/+simple/ devpi-server==2.2.0dev4
sometimes results in:
Could not find a version that satisfies the requirement devpi-common<2.1,>=2.0.6.dev0 (from devpi-server==2.2.0dev4->-r requirements.txt (line 2)) (from versions: 1.2, 2.0.0, 2.0.1, 2.0.2, 2.0.3, 2.0.4, 2.0.5, 2.0.6.dev2)
The issue appears to be in packaging, as
python -c "from packaging.specifiers import SpecifierSet; print(list(SpecifierSet('<2.1,>=2.0.6.dev0').filter(['1.2', '2.0.0', '2.0.1', '2.0.2', '2.0.3', '2.0.4', '2.0.5', '2.0.6.dev2'])))"
will return different results on different, adjacent calls.
I keep writing the same code over and over, to parse a filename and return information on the content - distribution name, version and file type, specifically. Would such a routine be a suitable addition to the packaging project?
I'd propose to support wheels, eggs, sdists and bdist_wininst installers, as those are the types package handling tools tend to need to deal with (and apart from wininst, the filename formats are either straightforward or documented).
If this seems like a good idea, I'll submit a PR.
dev-requirements needs pytest installed for this to work.
.. code-block:: console
$ # Create a virtualenv and activate it
$ pip install --requirement dev-requirements.txt
$ pip install --editable .
You are now ready to run the tests and build the documentation.
Running tests
~~~~~~~~~~~~~
packaging unit tests are found in the ``tests/`` directory and are
designed to be run using `pytest`_. `pytest`_ will discover the tests
automatically, so all you have to do is:
.. code-block:: console
$ py.test
...
62746 passed in 220.43 seconds
I'm trying to install the mysqlclient package in a virtualenv on a fresh Ubuntu 16, and when I run:
pip install --cache-dir /tmp --no-binary mysqlclient mysqlclient
it fails with the traceback:
Traceback (most recent call last):
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/basecommand.py", line 215, in main
status = self.run(options, args)
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/commands/install.py", line 335, in run
wb.build(autobuilding=True)
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/wheel.py", line 749, in build
self.requirement_set.prepare_files(self.finder)
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/req/req_set.py", line 380, in prepare_files
ignore_dependencies=self.ignore_dependencies))
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/req/req_set.py", line 634, in _prepare_file
abstract_dist.prep_for_dist()
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/req/req_set.py", line 129, in prep_for_dist
self.req_to_install.run_egg_info()
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/req/req_install.py", line 412, in run_egg_info
self.setup_py, self.name,
File "/myproject/build/.env/local/lib/python2.7/site-packages/pip/req/req_install.py", line 387, in setup_py
import setuptools # noqa
File "/myproject/build/.env/local/lib/python2.7/site-packages/setuptools/__init__.py", line 12, in <module>
import setuptools.version
File "/myproject/build/.env/local/lib/python2.7/site-packages/setuptools/version.py", line 1, in <module>
import pkg_resources
File "/myproject/build/.env/local/lib/python2.7/site-packages/pkg_resources/__init__.py", line 72, in <module>
import packaging.requirements
File "/myproject/build/.env/local/lib/python2.7/site-packages/packaging/requirements.py", line 59, in <module>
MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
TypeError: __call__() takes exactly 2 arguments (1 given)
This SO answer suggests it's a problem in packaging's requirements.py, specifically on this line, and recommends that:
MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
should instead be changed to:
MARKER_EXPR = originalTextFor(MARKER_EXPR(""))("marker")
When Setuptools adopted Packaging 16.x for the marker implementation, it lost support for the deprecated platform.python_implementation
marker, which prevents packages using that deprecated marker from installing under Setuptools 20.2+.
@s-t-e-v-e-n-k Can you develop a patch to support this legacy expectation?
import sys
import threading

from packaging.requirements import Requirement

N_THREADS = 32


def target():
    Requirement("x[]")


def main(argv):
    threads = []
    for i in range(N_THREADS):
        threads.append(threading.Thread(target=target))
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("Ok")


if __name__ == "__main__":
    main(sys.argv)
This fails every once in a while. Put in a loop in your terminal for best results.
We currently support platform.python_implementation and platform_python_implementation. setuptools' previous fork of packaging did support python_implementation until it was replaced by platform_python_implementation in pypa/setuptools@3bd5118#diff-0c04a5cfcbd2a4bdd1121db108b79d3d.
python_implementation existed in setuptools until that commit and the death of MarkerEvaluation; see https://github.com/pypa/setuptools/blob/f8b1293c408bbb652bec3f2ae6e5b4f33f3ca55e/pkg_resources/__init__.py#L1389 for the full (?) list of what was supported there. It also briefly existed in PEP 508 until February (https://hg.python.org/peps/rev/655a101719a5 replaced it with platform_python_implementation).
The one thing we're missing from what setuptools historically supported is python_implementation; we have everything else. This adds more duplication, but its removal broke some packages (e.g., some versions of html5lib).
Using TensorFlow backend.
2018-11-27 19:19:40.819599: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Now we build the Actor model
/home/lkx/下载/DRL/ActorNetwork.py:65: UserWarning: Update your Dense call to the Keras 2 API: Dense(91, name="a_h0", activation=<function ..., kernel_initializer=<function ...)
h0 = Dense(self.HIDDEN1_UNITS, activation=self.h_acti, init=glorot_normal, name='a_h0')(S)
Traceback (most recent call last):
File "ddpg.py", line 216, in
playGame(DDPG_config, train_indicator=1)
File "ddpg.py", line 75, in playGame
actor = ActorNetwork(sess, state_dim, action_dim, DDPG_config)
File "/home/lkx/下载/DRL/ActorNetwork.py", line 41, in __init__
self.model, self.weights, self.state = self.create_actor_network(state_size, action_size)
File "/home/lkx/下载/DRL/ActorNetwork.py", line 65, in create_actor_network
h0 = Dense(self.HIDDEN1_UNITS, activation=self.h_acti, init=glorot_normal, name='a_h0')(S)
File "/home/lkx/.local/lib/python3.5/site-packages/keras/engine/base_layer.py", line 431, in __call__
self.build(unpack_singleton(input_shapes))
File "/home/lkx/.local/lib/python3.5/site-packages/keras/layers/core.py", line 866, in build
constraint=self.kernel_constraint)
File "/home/lkx/.local/lib/python3.5/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "/home/lkx/.local/lib/python3.5/site-packages/keras/engine/base_layer.py", line 252, in add_weight
constraint=constraint)
File "/home/lkx/.local/lib/python3.5/site-packages/keras/backend/tensorflow_backend.py", line 402, in variable
v = tf.Variable(value, dtype=tf.as_dtype(dtype), name=name)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variables.py", line 183, in __call__
return cls._variable_v1_call(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variables.py", line 146, in _variable_v1_call
aggregation=aggregation)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variables.py", line 125, in
previous_getter = lambda **kwargs: default_variable_creator(None, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variable_scope.py", line 2444, in default_variable_creator
expected_shape=expected_shape, import_scope=import_scope)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variables.py", line 187, in __call__
return super(VariableMetaclass, cls).call(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variables.py", line 1329, in __init__
constraint=constraint)
File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/variables.py", line 1437, in _init_from_args
initial_value(), name="initial_value", dtype=dtype)
TypeError: __call__() missing 1 required positional argument: 'shape'
I'm just running the DRL code from http://knowledgedefinednetworking.org/; what should I do?
Consider the following snippet:
# We can determine if we're going to allow pre-releases by looking to
# see if any of the underlying items supports them. If none of them do
# and this item is a pre-release then we do not allow it and we can
# short circuit that here.
# Note: This means that 1.0.dev1 would not be contained in something
# like >=1.0.devabc however it would be in >=1.0.debabc,>0.0.dev0
if not prereleases and item.is_prerelease:
    return False
The actual code does not comply with the stated purpose in the comment above.
SpecifierSet(">=1.0.devabc,>0.0.dev0", False).contains(Version("1.0.dev1")) == False
What should be happening is:
SpecifierSet(">=1.0", False).contains(Version("1.1.dev1")) == False
SpecifierSet(">=1.0", True).contains(Version("1.1.dev1")) == True
SpecifierSet(">=1.0,>0.0.dev0", False).contains(Version("1.1.dev1")) == True
SpecifierSet(">=1.0,>0.0.dev0", True).contains(Version("1.1.dev1")) == True
VCS links like git+https://github.com/foo/bar, git+https://github.com/foo/bar#egg=baz and git+https://github.com/foo/bar#egg=baz-1.0 are not supported. Similar to #120.
They are supported by pip and setuptools.
>>> packaging.requirements.Requirement('git+https://github.com/foo/bar')
Traceback (most recent call last):
File "packaging/requirements.py", line 97, in __init__
requirement_string[e.loc:e.loc + 8]))
packaging.requirements.InvalidRequirement: Invalid requirement, parse error at "'+https:/'"
While discussing how to implement wheel installation support into setuptools, @dholth mentioned that the wheel package had its own implementation of PEP 425 parsing that is different from the one in pip.
In order not to duplicate another piece of code, wouldn't it be better to lift pep425tags into packaging, which is vendored in both pip and setuptools?
And since the main consumers of these tags are functions that parse wheel filenames, what do you think about lifting the pip.wheel:Wheel class as well? (It would be better named WheelFilename, since pretty much all it does is parse information from a potential wheel filename to help decide whether it should be downloaded/installed or not.)
Using packaging 16.5 I try to find the lowest version from a given SpecifierSet:
$ python
Python 2.7.10 (default, May 24 2015, 14:46:10) [GCC] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from packaging.specifiers import SpecifierSet
>>> s = SpecifierSet("!=1.2.0,!=1.3b1,<1.3,>=1.1.2")
>>> min(s)
<Specifier('!=1.2.0')>
I expected the minimum version to be 1.1.2.
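What happens here is that iterating a SpecifierSet yields its individual Specifier objects, so min() compares specifiers rather than versions. A sketch of one way to get the lowest matching version instead, using a made-up candidate list (a SpecifierSet alone has no well-defined minimum without candidates):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

s = SpecifierSet("!=1.2.0,!=1.3b1,<1.3,>=1.1.2")

# filter() keeps only the candidates that satisfy every specifier;
# using Version as the key makes min() compare versions, not strings.
candidates = ["1.1.0", "1.1.2", "1.2.0", "1.2.5", "1.3"]
lowest = min(s.filter(candidates), key=Version)
print(lowest)  # "1.1.2"
```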
Found in the release of setuptools 20.2.1, which vendors in packaging 16.4. It appears that packaging.requirements.Requirement does not comply with PEP-440's "Whitespace between a conditional operator and the following version identifier is optional, as is the whitespace around the commas.", per https://www.python.org/dev/peps/pep-0440/#version-specifiers. Instead, it fails when there is whitespace.
>>> from pkg_resources.extern.packaging.requirements import Requirement
>>> Requirement('Jinja2 >=2.7, <3')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/tmp/v/local/lib/python2.7/site-packages/pkg_resources/_vendor/packaging/requirements.py", line 94, in __init__
requirement_string[e.loc:e.loc + 8]))
pkg_resources._vendor.packaging.requirements.InvalidRequirement: Invalid requirement, parse error at "'<3'"
But it succeeds when the whitespace is removed
>>> Requirement('Jinja2 >=2.7,<3')
<Requirement('Jinja2<3,>=2.7')>
From irc today:
17:07 < dstufft> like python -m packaging.version sort 4.5 6.7 3.4 -> 3.4 4.5 6.7
[...]
17:10 < dstufft> probably something like python -m packaging.version sort --verify 3.4 4.5 6.7 might be reasonable
PEP 440 gives the example that ~= 1.4.5a4 should be equivalent to >= 1.4.5a4, == 1.4.*; however:
>>> from packaging.specifiers import SpecifierSet
>>> '1.4.6' in SpecifierSet('~=1.4.5a4', prereleases=True) # incorrect
False
>>> '1.4.6' in SpecifierSet('>=1.4.5a4, ==1.4.*', prereleases=True) # correct
True
It appears there has been a misinterpretation, treating ~=1.4.5a4 as >= 1.4.5a4, == 1.4.5.*. Compare this to PEP 440: "If a pre-release, post-release or developmental release is named in a compatible release clause as V.N.suffix, then the suffix is ignored when determining the required prefix match".
My environment:
(venv) $ pip freeze
appdirs==1.4.2
attrs==16.3.0
chicken-turtle-util==4.0.1
lxml==3.7.3
numpy==1.12.0
packaging==16.8
patool==1.12
pkginfo==1.4.1
py==1.4.32
pypandoc==1.3.3
pyparsing==2.2.0
pytest==3.0.6
pytest-capturelog==0.7
requests==2.13.0
six==1.10.0
(venv) $ python --version
Python 3.5.2
packaging.requirements.Requirement('django==1.6+azeze') is correctly parsed, but packaging.requirements.Requirement('django>=1.6+azeze') raises InvalidRequirement: Invalid requirement, parse error at "'+azeze'".
There might be a problem with packaging/packaging/markers.py, line 267 in c22fbd8.
The line is using the definition of the environment marker variable python_version according to PEP 508. But what if the Python version consists of more than 3 characters?
I originally posted this as a question on Stack Overflow: https://stackoverflow.com/questions/48119291/whats-with-the-definition-of-environment-marker-variable-python-version-in-pe Just as Martijn Pieters wrote there, something like ".".join(platform.python_version_tuple()[:2]) would fix the problem.
I guess the PEP would have to be changed first?
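The truncation being worried about can be illustrated directly (the version string "3.10.1" below is hypothetical, since no such release existed at the time):

```python
import platform

# The marker environment is built from a 3-character slice of the version
# string, which breaks once the minor version reaches two digits:
assert "2.7.6"[:3] == "2.7"                         # fine for single-digit minors
assert "3.10.1"[:3] == "3.1"                        # wrong major.minor
assert ".".join("3.10.1".split(".")[:2]) == "3.10"  # the suggested fix

# platform.python_version_tuple() provides the same components at runtime:
major, minor, _ = platform.python_version_tuple()
print(".".join((major, minor)))
```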
Similarly to what pyca/cryptography did, let's relicense as dual Apache License 2.0 and BSD 2 clause. This will allow folks with GPLv2 codebases to use packaging while still retaining the additional protections of Apache License 2.0. More information can be found in pyca/cryptography#1209. If you're one of the people who have contributed, please respond to this issue stating whether or not you give permission for your code to be relicensed as dual Apache License 2.0 and 2 clause BSD.
- Update the LICENSE file appropriately.
- Update the setup.py classifiers.
- Update __about__.py.
Hello, please excuse me barging in here; it was suggested in the issue I submitted in setuptools (pypa/setuptools#1049) that I should go upstream (i.e. here) with this.
First of all, I'm using Python 3.6.0, setuptools 36.0.0.
In my setup.py
I have the following:
setuptools.setup(
name='my-package', ...,
python_requires='3.5', ...,
...)
And when running python3.6 setup.py sdist (or another *dist* command) I get the following error:
error in my-package setup command: 'python_requires' must be a string
containing valid version specifiers; Invalid specifier: '3.5'
But 3.5 is valid according to https://www.python.org/dev/peps/pep-0345/#requires-python:
This field specifies the Python version(s) that the distribution is guaranteed to be compatible with.
Version numbers must be in the format specified in Version Specifiers .
Examples:
Requires-Python: 2.5
Requires-Python: >2.1
Requires-Python: >=2.3.4
Requires-Python: >=2.5,<2.7
When I specify >=3.5 or >3.5 or ==3.5 or <3.5 or <=3.5, setuptools doesn't complain. But it somehow can't tolerate a plain version number. I also checked some other version numbers, like 3 or 3.6, getting the same results.
I'm not sure if the problem is limited to the python_requires field, but that's the only place I've seen it so far.
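The mismatch can be reproduced with packaging directly: PEP 345 permits a bare version in Requires-Python, but a PEP 440 specifier (which setuptools uses to validate python_requires) requires a comparison operator:

```python
from packaging.specifiers import SpecifierSet, InvalidSpecifier

try:
    SpecifierSet("3.5")          # bare version: no operator, rejected
except InvalidSpecifier as exc:
    print("rejected:", exc)

print(SpecifierSet(">=3.5"))     # with an operator it parses fine
```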
We have built some automation around releases and testing that involve inspecting the version, to determine what type of release something is, and where to publish it (etc.).
Currently we are using a regex to parse pre PEP-440 versions. I was planning on using packaging
to replace this, but it appears that packaging doesn't expose this information in a structured way.
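For reference, newer releases of packaging do expose the parsed pieces as attributes on Version (this may postdate the release the automation was written against); a quick sketch on a made-up version string:

```python
from packaging.version import Version

# Structured access to the parsed components of a version:
v = Version("2.0.0rc1")
assert v.release == (2, 0, 0)        # numeric release segment
assert v.pre == ("rc", 1)            # pre-release letter and number
assert v.is_prerelease
assert not v.is_postrelease
assert v.base_version == "2.0.0"     # version with suffixes stripped
```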
Since pipenv relies on pythonfinder to find python2.7 and friends, and since pythonfinder relies on packaging's version number calculation stuff, I'm submitting this issue here.
Summary
If packaging can accept 1.0+ as a version number equal to 1.0, that will avoid a crash in pythonfinder, which will result in pipenv working for a colleague.
Steps to reproduce
Run pipenv install with a Python that reports its version as 2.7.15+, as on Ubuntu 18.10. You'll get:
File ".../pythonfinder/models/python.py", line 366, in parse
raise ValueError("Not a valid python version: %r" % version)
ValueError: Not a valid python version: <LegacyVersion('2.7.15+')>
Other details:
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 18.10
Release: 18.10
Codename: cosmic
$ /usr/bin/python2 --version
Python 2.7.15+
Proposed fix
Accept 2.7.15+ as equivalent to 2.7.15.
I'll work on a pull request now.
Hi,
According to https://www.python.org/dev/peps/pep-0440/#post-release-spelling, versions like '1.0-r4' should be normalized to '1.0.post4'. But running a simple test, this does not work:
$ python
Python 2.7.10 (default, May 24 2015, 14:46:10) [GCC] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import packaging
>>> packaging.__version__
'15.1'
>>> from packaging.version import parse
>>> v1 = parse('1.0.post4')
>>> v2 = parse('1.0.post4')
>>> v1 == v2
True
>>> v2 = parse('1.0.r4')
>>> v1 == v2
False
>>>
$ python3
Python 3.6.4 (default, Jan 6 2018, 02:24:08)
[GCC 7.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import packaging.requirements
>>> packaging.requirements.Requirement("pip @ file://localhost/localbuilds/pip-1.3.1-py33-none-any.whl")
<Requirement('pip@ file://localhost/localbuilds/pip-1.3.1-py33-none-any.whl')>
>>> packaging.requirements.Requirement("pip @ file:///localbuilds/pip-1.3.1-py33-none-any.whl")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/uken/.local/lib64/python3.6/site-packages/packaging/requirements.py", line 101, in __init__
raise InvalidRequirement("Invalid URL given")
packaging.requirements.InvalidRequirement: Invalid URL given
According to PEP 508 the following strings should all be parseable:
"pip @ file://localhost/localbuilds/pip-1.3.1-py33-none-any.whl"
"pip @ file:///localbuilds/pip-1.3.1-py33-none-any.whl"
"pip @ /localbuilds/pip-1.3.1-py33-none-any.whl"
"pip @ localbuilds/pip-1.3.1-py33-none-any.whl"
(I have fed them into the "test program" included in PEP 508, which uses Parsley for parsing. Note that there are several typos in the grammar in the document text.)
The following logic may be flawed:
packaging/packaging/requirements.py, lines 102 to 104 in d2ed39a
It's not clear why this would be valid:
>>> packaging.version.LegacyVersion('rest-hooks-1.0.4')
<LegacyVersion('rest-hooks-1.0.4')>
What is the purpose of the LegacyVersion class? Is it the equivalent of distutils.version.LooseVersion, something that basically does its best to parse anything? Is there really a plausible case for a version that doesn't start with a number?
This is why some of the recent PRs have build failures.
I noticed this in pip 8.1.2, but I think the issue is in here.
I have an "extras_require" in my setup.py like:
extras_require={':"win32" in sys_platform': ['pywin32==219','wmi==1.4.9'],}
When trying to install the package with pip, I get:
Invalid requirement: 'pywin32==219; win32 in "sys_platform"'
Something seems to be quoting the variable rather than the value. I'd expect:
'pywin32==219; "win32" in sys_platform'
Digging through the code, I see markers.py, line 140:
return '{0} {1} "{2}"'.format(*marker)
This would work for a marker using '==', but not one using 'in', where the order of the variable and value is typically switched. I think this is the problem. This works as expected in pip 8.1.1.
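The suspicion can be sketched in isolation (serialize below is a simplified stand-in for the quoted markers.py line, not the actual packaging source):

```python
def serialize(marker):
    # Always wraps the third element in quotes: correct for comparison
    # markers, but backwards for "in" markers, where the literal comes first.
    return '{0} {1} "{2}"'.format(*marker)

# '==' marker: the literal is on the right, so the output is a valid marker.
assert serialize(("sys_platform", "==", "win32")) == 'sys_platform == "win32"'

# 'in' marker: the literal is on the left, so the *variable* gets quoted.
assert serialize(("win32", "in", "sys_platform")) == 'win32 in "sys_platform"'
```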
In [1]: from packaging.requirements import Requirement
In [2]: Requirement('-e git://hello/world/git#egg=Hello')
---------------------------------------------------------------------------
InvalidRequirement Traceback (most recent call last)
<ipython-input-2-5b0607f7b9e2> in <module>()
----> 1 Requirement('-e git://hello/world/git#egg=Hello')
/home/sontek/venvs/tox/local/lib/python2.7/site-packages/packaging/requirements.pyc in __init__(self, requirement_string)
92 raise InvalidRequirement(
93 "Invalid requirement, parse error at \"{0!r}\"".format(
---> 94 requirement_string[e.loc:e.loc + 8]))
95
96 self.name = req.name
InvalidRequirement: Invalid requirement, parse error at "'-e git:/'"
Hi,
Request for explanation:
I recently discovered what I think looks like an inconsistency in the packaging package (as vendored by pip 9.0.1).
I've built for my work a wheel with this exact version: 3.0.32-rdo.1.0.0
When I use pip with pip install mypackage==3.0.32-rdo.1.0.0, it successfully identifies the correct wheel and installs it. (1)
But if I use pip with pip install mypackage~=3.0.32-rdo.1.0.0 (~= instead of ==), then I get an error from the packaging package (2):
Invalid requirement: 'mypackage~=3.0.32-rdo.1.0.0'
Traceback (most recent call last):
File "/home/gregory/foo/lib/python2.7/site-packages/pip/req/req_install.py", line 82, in __init__
req = Requirement(req)
File "/home/gregory/foo/lib/python2.7/site-packages/pip/_vendor/packaging/requirements.py", line 96, in __init__
requirement_string[e.loc:e.loc + 8]))
InvalidRequirement: Invalid requirement, parse error at "'do.1.0.0'"
(1) + (2): this is what I think looks like an inconsistency, or am I missing something?
I run pip install with --find-links; in it I have many versions of this mypackage, and some of them don't have the extra -rdo.x.y.z subversion part (or "tag", which is maybe a more appropriate term).
When I then install with pip install mypackage~=3.0.32 (or >=), my -rdo.x.y.z wheels aren't resolved/installed, only a "normal" version (==3.0.32) without this subversion or version tag.
Is that normal?
I'm not sure using such a version subpart is very common, nor really officially supported, actually. Do you have any good pointer on that?
Thanks for any info :)
packaging's tests fail with the release candidate build of Python 3.5.2:
=================================== FAILURES ===================================
_________________ TestDefaultEnvironment.test_matches_expected _________________
self = <tests.test_markers.TestDefaultEnvironment object at 0x7ff51c8e1278>
@pytest.mark.skipif(not hasattr(sys, 'implementation'),
                    reason='sys.implementation does not exist')
def test_matches_expected(self):
    environment = default_environment()
    iver = "{0.major}.{0.minor}.{0.micro}".format(
        sys.implementation.version
    )
    if sys.implementation.version.releaselevel != "final":
        iver = "{0}{1}[0]{2}".format(
            iver,
            sys.implementation.version.releaselevel,
            sys.implementation.version.serial,
        )
    assert environment == {
        "implementation_name": sys.implementation.name,
        "implementation_version": iver,
        "os_name": os.name,
        "platform_machine": platform.machine(),
        "platform_release": platform.release(),
        "platform_system": platform.system(),
        "platform_version": platform.version(),
        "python_full_version": platform.python_version(),
        "platform_python_implementation": platform.python_implementation(),
        "python_version": platform.python_version()[:3],
        "sys_platform": sys.platform,
    }
E assert {'implementat...'x86_64', ...} == {'implementati...'x86_64', ...}
E Omitting 10 identical items, use -v to show
E Differing items:
E {'implementation_version': '3.5.2c1'} != {'implementation_version': '3.5.2candidate[0]1'}
E Use -v to get the full diff
tests/test_markers.py:134: AssertionError
======== 1 failed, 28376 passed, 2 skipped, 1 xfailed in 41.70 seconds =========
This is more a question than an issue.
Would you accept a PR to add a new function which can create a valid version for RPM packages from a given version? Same could be done for DEB versions.
This is something that would be very useful when packaging Python software and dealing with alpha/beta/rc/post releases.
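As a sketch of what such a function might look like (the mapping below follows a common distro convention, e.g. tilde suffixes for pre-releases so they sort before the final release; it is an assumption for illustration, not any official packaging API):

```python
from packaging.version import Version

def rpm_version(version):
    # RPM sorts "~" before the empty string, so "1.2.0~rc1" < "1.2.0",
    # matching PEP 440 ordering of pre-releases. "^" (rpm >= 4.14) sorts
    # after, which suits post-releases.
    v = Version(version)
    base = ".".join(str(part) for part in v.release)
    if v.pre is not None:
        return "{0}~{1}{2}".format(base, v.pre[0], v.pre[1])
    if v.post is not None:
        return "{0}^post{1}".format(base, v.post)
    return base

print(rpm_version("1.2.0rc1"))   # "1.2.0~rc1"
print(rpm_version("1.0.post4"))  # "1.0^post4"
```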
Probably a small markup problem.
https://pypi.python.org/pypi/packaging/14.5
PyPI should be more lenient. I have a PR to do this (https://bitbucket.org/pypa/pypi/pull-request/60/fix-bb-214-make-rst-rendering-not-fail-for/diff).
The following was unexpected for me, and smells like a bug:
import sys
from packaging.markers import Marker
print(sys.version_info)
print(Marker('python_version < "2.7.6"').evaluate())
Outputting:
sys.version_info(major=2, minor=7, micro=10, releaselevel='final', serial=0)
True
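A plausible explanation, assuming the marker environment derives python_version by slicing platform.python_version() down to major.minor: on CPython 2.7.10 the marker never sees the micro version, so the comparison is really "2.7" against "2.7.6":

```python
from packaging.version import Version

# The marker environment's python_version is just "2.7" (major.minor),
# so the marker effectively evaluates:
assert Version("2.7") < Version("2.7.6")            # True: marker matches
# whereas the full interpreter version would not match:
assert not (Version("2.7.10") < Version("2.7.6"))
```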
Currently, is_prerelease allows you to differentiate between a final release and a pre-release. However, if your project uses both dev and rc releases, there is no way using packaging to differentiate between them.
We need to resort to something similar to:
suffix = version.public[len(version.base_version):]
components = version.base_version.split('.') + [suffix]
if suffix == '' or suffix.startswith('rc'):
    ...
A new attribute that allows access to the release suffix would be helpful since this is already being parsed by packaging. Also making the components available would be useful for example:
'1.2.3rc9' - > ['1', '2', '3', 'rc9']
'1.2.3-rc9' -> ['1', '2', '3', 'rc9']
'1.2.3' -> ['1', '2', '3']
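With newer packaging releases, this split can be built from public Version attributes (which may postdate the version this issue was filed against); components below is a hypothetical helper, not part of packaging:

```python
from packaging.version import Version

def components(version):
    # Release digits plus any pre-release suffix, matching the examples above.
    v = Version(version)
    parts = [str(part) for part in v.release]
    if v.pre is not None:
        parts.append("{0}{1}".format(v.pre[0], v.pre[1]))
    return parts

assert components("1.2.3rc9") == ["1", "2", "3", "rc9"]
assert components("1.2.3-rc9") == ["1", "2", "3", "rc9"]   # normalized form
assert components("1.2.3") == ["1", "2", "3"]
```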
Hello,
It's been almost a day that I've been trying to run my module regina.py, but I can't; it keeps giving me errors like this:
import regina.py
Traceback (most recent call last):
File "", line 1, in
File "/Applications/PyCharm CE.app/Contents/helpers/pydev/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, _args, *_kwargs)
File "/Users/Sokhna/Documents/mgp-test-i11vf/regina.py", line 13, in
import diane
File "/Applications/PyCharm CE.app/Contents/helpers/pydev/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, _args, *_kwargs)
File "/Users/Sokhna/Documents/mgp-test-i11vf/diane.py", line 19, in
import nayru
File "/Applications/PyCharm CE.app/Contents/helpers/pydev/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, _args, *_kwargs)
File "/Users/Sokhna/Documents/mgp-test-i11vf/nayru.py", line 1, in
import falcon
File "/Applications/PyCharm CE.app/Contents/helpers/pydev/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, _args, *_kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/falcon/init.py", line 32, in
from falcon.api import API, DEFAULT_MEDIA_TYPE # NOQA
File "/Applications/PyCharm CE.app/Contents/helpers/pydev/pydev_import_hook.py", line 21, in do_import
module = self._system_import(name, _args, *_kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/falcon/api.py", line 18, in
from falcon import api_helpers as helpers
ImportError: cannot import name api_helpers
I just started Python, and I'm disappointed because I don't understand anything.
Can anyone help me, please? :( :( :(