
productmd's Introduction

ProductMD


ProductMD is a Python library providing parsers for metadata related to composes and installation media.

Documentation

http://productmd.readthedocs.io/en/latest/

Building

Build requires

  • Six: Python 2 and 3 Compatibility Library
  • pip install six
  • Fedora: dnf install python-six python3-six

Build

To see all options run:

make help

Build with Tito

Read TITO.md for instructions.

Testing

Run from checkout dir:

make test

productmd's People

Contributors

adamwill, ausil, dmach, goosemania, hlin, jcpunk, jkonecny12, keszybz, ktdreyer, lkocman, lubomir, michalhaluza, onosek, pholica, supakeen, t-feng, tkdchen, tojaj, velezd


productmd's Issues

Run validations sooner

Right now the field-level validations on objects such as Image are performed just before the data is serialized. This can lead to a situation where an incorrect value is set but the problem is only discovered much later, making it almost impossible to figure out what went wrong.

It would be cool if we could run the validations immediately after setting the property.
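A minimal sketch of the idea, using a simplified stand-in class (the names and the set of formats here are illustrative, not productmd's actual API):

```python
class Image:
    """Simplified sketch: validate on assignment instead of at serialization."""

    # Illustrative subset; productmd keeps the real list in SUPPORTED_IMAGE_FORMATS.
    SUPPORTED_FORMATS = {"iso", "qcow2", "raw"}

    @property
    def format(self):
        return self._format

    @format.setter
    def format(self, value):
        # Fail immediately at the assignment site, not at dump() time.
        if value not in self.SUPPORTED_FORMATS:
            raise ValueError("unsupported image format: %r" % value)
        self._format = value
```

With this, `img.format = "floppy"` raises a ValueError right where the bad value is introduced, so the traceback points at the actual culprit.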

Add support for euleros

EulerOS is a commercial Linux distribution developed for enterprise application environments, so it would be good to add support for EulerOS.

New release needed

Could you please cut a release of productmd and send it out to Fedora and RHEL/EPEL at some point? I'd like to use the 'unique image identifier' stuff I wrote and you merged (thanks!) in fedfind to replace its own image identification approach, but can't until it's in an official release and the packages are updated. Thanks!

get_release_id(major_version=True) doesn't work well with layered products using X.Y.Z version scheme

Hello

I'm using compose.info.get_release_id(major_version=True) on a product/release with a three-digit version scheme, e.g. 99.0.0 in this example, and compose.info.get_release_id(major_version=True) is not behaving correctly.

I see that def get_major_version(version, remove=1) is parametrized, but it seems we always use the default remove value.

>>> compose = productmd.compose.Compose("/home/lkocman/Product-99.0.0-BASEPRODUCT-99")
>>> compose.info
<productmd.composeinfo.ComposeInfo object at 0x931e90>

>>> compose.info.get_release_id()
u'Product-99.0.0-BASEPRODUCT-99'

# I would expect Product-99-BASEPRODUCT-99 as 99 is the major version
>>> compose.info.get_release_id(major_version=True)
u'Product-99.0-BASEPRODUCT-99'
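An illustrative reimplementation of the trimming behavior (not the library's actual code) shows why the default gives "99.0" and how an explicit remove count would produce the expected "99" for an X.Y.Z scheme:

```python
def get_major_version(version, remove=1):
    # Sketch of the behavior described above: drop `remove` trailing
    # components from a dotted version string.
    parts = version.split(".")
    if len(parts) > remove:
        return ".".join(parts[:-remove])
    return version

# "99.0.0" with the default remove=1 keeps two components:
assert get_major_version("99.0.0") == "99.0"
# remove=2 yields the true major version:
assert get_major_version("99.0.0", remove=2) == "99"
```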

Please add Microsoft Hyper-V image type to supported formats.

$ diff -u images.py.orig images.py
--- images.py.orig	2018-05-04 23:56:33.992217758 -0500
+++ images.py	2018-05-04 23:56:43.975208248 -0500
@@ -45,7 +45,7 @@
 #: supported image formats, they match with file suffix
 SUPPORTED_IMAGE_FORMATS = ['iso', 'qcow', 'qcow2', 'raw', 'raw.xz', 'rhevm.ova',
                            'sda.raw', 'tar.gz', 'tar.xz', 'vagrant-libvirt.box', 'vagrant-virtualbox.box',
-                           'vdi', 'vmdk', 'vmx', 'vsphere.ova']
+                           'vdi', 'vhd', 'vmdk', 'vmx', 'vsphere.ova']
 
 
 class Images(productmd.common.MetadataBase):

`Compose()` cannot find composes in s3

When initializing productmd with the HTTPS URL of a compose in an s3 bucket, the Compose() constructor cannot find the compose metadata.

Steps to reproduce

  1. Generate a compose
  2. Upload the compose to an s3 bucket
  3. Initialize a Compose() object with that URL:
compose = productmd.compose.Compose('https://s3.example.com/mybucket/MYPRODUCT-1.0-YYYYMMDD.0')
# Access any metadata, e.g.:
compose.info

Actual results

RuntimeError: Failed to load metadata from https://s3.example.com/mybucket/MYPRODUCT-1.0-YYYYMMDD.0

Expected results

productmd loads the compose metadata from the correct location.

Additional Details

"directories" in s3 buckets are an illusion. All objects names in s3 are key names and "/" is just another character in the key name. One side effect is that s3 will return 404 errors for parent "directories", since those are objects that do not exist.

In s3 (AWS, Ceph RGW, etc):

GET https://s3.example.com/mybucket/sampledirectory/flower.jpg 200 OK
GET https://s3.example.com/mybucket/sampledirectory/ 404 NOT FOUND

Contrast this with Apache, with mod_autoindex enabled:

GET https://www.example.com/mybucket/sampledirectory/flower.jpg 200 OK
GET https://www.example.com/mybucket/sampledirectory/ 200 OK

The Compose() constructor calls _file_exists() on the MYPRODUCT-1.0-YYYYMMDD.0/compose/metadata directory. This is fine for local filesystems and Apache with mod_autoindex, but it fails for s3. I recommend that we search for an exact filename instead of the directory. compose/metadata/composeinfo.json is a well-known one that has been present for many years, and it works in my testing.
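The proposed check could look roughly like this; the path is the well-known compose/metadata/composeinfo.json file mentioned above, and the function names are mine, not productmd's:

```python
import urllib.error
import urllib.request


def metadata_probe_url(base_url):
    # Probe a well-known file rather than the metadata "directory": object
    # stores like s3 return 404 for bare prefixes, but the file itself exists.
    return base_url.rstrip("/") + "/compose/metadata/composeinfo.json"


def compose_exists(base_url):
    # Sketch of the suggested existence check; a HEAD request avoids
    # downloading the body.
    req = urllib.request.Request(metadata_probe_url(base_url), method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return 200 <= resp.status < 300
    except urllib.error.HTTPError:
        return False
```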

Unified ISO flag in metadata

What would be the best way to track a unified ISO in the metadata? Before productmd 1.0, there was an extra property for each image. I'm not sure that is the best way forward, as for most use-cases it would be set to false.

add call from_json()/from_compose()

As of now there is no way to process composeinfo from JSON or from Pungi.
This is a request to add such a call. Quite a lot of tools rely on processing composeinfo from Pungi, and this is currently a big blocker.

Parse of opensuse treeinfo fails.

When parsing the openSUSE treeinfo [0], productmd raises the exception [1]:
Platform has images but is not referenced in platform list: x86_64, set()
In my opinion they (SUSE) use the format in a fairly simplified way, but it would be really helpful to be able to parse their treeinfo too.

[0] http://download.opensuse.org/distribution/leap/15.1/repo/oss/.treeinfo
[1]

raise ValueError("Platform has images but is not referenced in platform list: %s, %s"

Parsing compose IDs is (practically) impossible if `-` is allowed in product short name

I'm not sure if this is an issue or a cry for help...:)

Consider these compose IDs (one a real Fedora compose ID, one a made-up layered product compose ID from the test suite, since I don't know where to find a real RHEL layered product compose ID):

Fedora-Atomic-25-20170205.0 (product short name: Fedora-Atomic)
F-22-updates-BASE-3-updates-20160622.n.0 (product short name: F)

Now that I've read the code, I see all kinds of other subtleties are possible, which makes parsing compose IDs just super fun. But you could, I think, cope with many if not all of them by counting - characters. Maybe. I don't know. Life is awful.

However, as long as - characters are allowed in the product short name, all bets are off, because it just becomes entirely impossible to figure out which -s are being used as some kind of field separator by productmd, and which are just part of the product name (as in the first example).

I've been doing this to parse compose IDs in Fedoraland:

(name, version, _) = compose_id.rsplit('-', 2)
(date, typ, respin) = productmd.composeinfo.get_date_type_respin(compose_id)

The rsplit trick handles there being dashes in the shortname, and it relies on there being a known number of dashes after the shortname. But now that I look at the test cases and code upstream, it's clear this is rather specific to my use cases: it relies on the fact that we don't have layered products (or at least, if we do, I don't care about any of 'em), or any type_suffixes (at least ones with dashes in them? I don't really know what these are), and I certainly don't care about RHEL 5 composes. But clearly someone does care about all those cases, so putting a generic compose ID parser in productmd (which is what I was trying to do) appears to be really rather futile, or at the least, would be very complex and would have quite a bit of magic knowledge in it.
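The two example IDs above make the failure concrete:

```python
# The rsplit trick works when the number of fields after the short name
# is fixed:
name, version, date_type_respin = "Fedora-Atomic-25-20170205.0".rsplit("-", 2)
# name == "Fedora-Atomic", version == "25", date_type_respin == "20170205.0"

# For a layered-product ID the same split lands in the wrong place:
lname, lversion, lrest = "F-22-updates-BASE-3-updates-20160622.n.0".rsplit("-", 2)
# lname == "F-22-updates-BASE-3" -- the base-product fields leak into the "name"
```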

So...I don't know what to do. Would it be worthwhile making it a rule that product short names can't have dashes in them, and changing Fedora's configurations appropriately (to use FedoraAtomic, FedoraCloud and FedoraDocker as the short names, or something)? Or should I just give up on generically parsing compose IDs and rejig everything I have that does it, somehow (probably by requiring a round trip to the compose location or to PDC to fetch the full compose metadata, unfortunately)? Currently we have various things that get useful information just by parsing the compose ID because that's the only thing they get from a fedmsg they parse, and it's nice to be able to avoid a round trip. Doing this kind of rejigging for fedfind would also be rather tricky architecturally, but that's more or less my problem and not yours...

1.33: sphinx warnings

+ /usr/bin/python3 setup.py build_sphinx -b man
running build_sphinx
Running Sphinx v4.0.2
making output directory... done
WARNING: html_static_path entry '_static' does not exist
building [mo]: targets for 0 po files that are out of date
building [man]: all manpages
updating environment: [new config] 18 added, 0 changed, 0 removed
reading sources... [100%] treeinfo-1.1
/home/tkloczko/rpmbuild/BUILD/productmd-1.33/productmd/images.py:docstring of productmd.images:25: WARNING: Block quote ends without a blank line; unexpected unindent.
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
writing... python-productmd-terminology.7 { } python-productmd-compose.3 { } python-productmd-composeinfo.3 { } python-productmd-discinfo.3 { } python-productmd-images.3 { } python-productmd-rpms.3 { } python-productmd-treeinfo.3 { } python-productmd-composeinfo.5 { } python-productmd-discinfo.5 { } python-productmd-images.5 { } python-productmd-rpms.5 { } python-productmd-treeinfo.5 { } done
build succeeded, 2 warnings.

more options for BaseProduct.type

The latest in-development version of my Ceph 3.0 layered product is based on a not-yet-available version of RHEL (7.4).

I was hoping I could specify this in the compose metadata (setting base_product_type = "nightly" in my Pungi config), but it looks like we restrict BaseProduct.type to productmd.common.RELEASE_TYPES:

RELEASE_TYPES = [
    "fast",
    "ga",
    "updates",
    "eus",
    "aus",
    "els",
]

What would be the best way to indicate in the compose metadata that my layered product is based on a not-yet-released base product?

Bad error handling when loading existing image metadata

When the metadata contains, for example, an unknown type or format (say it was created with a newer version of the library), the error message says that the file is not valid JSON. That's not true and definitely not helpful.
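A sketch of the distinction being asked for, with hypothetical function names (not productmd's code): the JSON parse step and the field validation step should report different errors.

```python
import json


def load_images_json(path):
    # Sketch: separate "the file is not valid JSON" from "the JSON is valid
    # but contains values this library version does not understand".
    with open(path) as f:
        try:
            data = json.load(f)
        except ValueError as exc:  # json.JSONDecodeError on modern Pythons
            raise ValueError("%s: not valid JSON: %s" % (path, exc))
    # ... field-level validation would follow here, raising its own,
    # accurately worded errors for unknown types/formats ...
    return data
```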

`invalid literal for int() with base 10` for build_timestamp

Some .treeinfo files have build_timestamp in the format of a float rather than an int.
The code and comments say that both int and float numbers can be used for build_timestamp, but in one place there is an assumption that only an int is present. E.g. this breaks parsing of treeinfo provided by Oracle Linux repos.
See more details in a user report here and here.

Suggestion: support float type for build_timestamp across the code base.
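The fix could be as small as this (an illustrative helper, not the library's code): normalize either representation to an int at parse time.

```python
def parse_build_timestamp(value):
    # Sketch of the suggested fix: accept both "1499979415" and
    # "1499979415.26" (the float form seen in some .treeinfo files)
    # and normalize to an int.
    return int(float(value))
```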

RFE: support querying over HTTP

User story: As a developer, I can query a compose over HTTP and retrieve a list of all builds in the compose.

Currently I can instantiate a Compose object like so:

    compose = productmd.compose.Compose(path)
    ... (and then query the list of builds, etc)

The problem is that "path" must be a local or NFS path, like /mnt/redhat/path/to/my/compose.

Within Red Hat, some of our NFS shares are restricted across data centers, and the composes are only accessible via HTTP. It would be great to improve productmd to support querying over HTTP instead of NFS.
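A minimal sketch of the idea using only the standard library; it assumes the well-known compose/metadata/composeinfo.json layout and is not productmd's API:

```python
import json
import urllib.request


def load_composeinfo(base_url):
    # Sketch: read composeinfo.json over any urllib-supported scheme
    # (http(s), file, ...) instead of requiring a local or NFS path.
    url = base_url.rstrip("/") + "/compose/metadata/composeinfo.json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```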

ComposeInfo.get_variants() crash with "arch" keyword

Specifying the "arch" keyword arg to ComposeInfo.get_variants() results in an AttributeError.

Consider the following sample code:

import productmd.compose
c = productmd.compose.Compose('/mnt/redhat/nightly/latest-RHEL-7')

I can call get_variants() to discover the list of variants in this composeinfo:

print(c.info.get_variants())

But when I try to filter the list by arch:

print(c.info.get_variants(arch='x86_64'))

I get an AttributeError:

  File "/usr/lib/python2.7/site-packages/productmd/composeinfo.py", line 181, in get_variants
    return self.variants.get_variants(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/productmd/composeinfo.py", line 582, in get_variants
    if arch and arch not in self.arches + ["src"]:
AttributeError: 'Variants' object has no attribute 'arches'

Add a README

Cheers!

Can you add a README describing a bit about the project? How would you use it?

Importantly, how does one run the tests?

RFE: is it possible to restart making GitHub releases? 🤔

When a GitHub release is created, an email notification goes out to everyone who has set Watch → Releases on the repo in the web UI. A GitHub release can carry additional content (like a changelog) or additional assets such as release tarballs (by default it contains only the assets from the git tag), but all of those parts are optional. In the simplest variant a GitHub release can be empty, because the subject of the email already contains the git tag name.

I'm asking because my automation uses those email notifications to attempt preliminary automated upgrades of the packages I build, which saves some time on maintaining packaging procedures. Other people would probably also like to be informed instantly when a new version is released.

Documentation and examples for generating GitHub releases:
https://docs.github.com/en/repositories/releasing-projects-on-github/managing-releases-in-a-repository
https://cli.github.com/manual/gh_release_upload/
jbms/sphinx-immaterial#282
https://github.com/marketplace/actions/github-release
https://pgjones.dev/blog/trusted-plublishing-2023/
jbms/sphinx-immaterial#281 (comment)

Treeinfo cannot process a variant with a dash in its ID

  File "/usr/lib/python2.7/site-packages/productmd/common.py", line 201, in load
    self.deserialize(parser)
  File "/usr/lib/python2.7/site-packages/productmd/treeinfo.py", line 127, in deserialize
    self.variants.deserialize(parser)
  File "/usr/lib/python2.7/site-packages/productmd/treeinfo.py", line 379, in deserialize
    self.add(variant, variant_id=variant.uid)
  File "/usr/lib/python2.7/site-packages/productmd/composeinfo.py", line 527, in add
    variant.validate()
  File "/usr/lib/python2.7/site-packages/productmd/common.py", line 185, in validate
    method()
  File "/usr/lib/python2.7/site-packages/productmd/treeinfo.py", line 609, in _validate_id
    raise ValueError("Invalid character '-' in variant ID: %s" % self.id)

Relevant part of the file being loaded:

[variant-Bar-Tools]
id = Bar-Tools
name = Tools
packages = Packages
repository = .
type = variant
uid = Bar-Tools

productmd on PyPI cannot be installed with pip

I have setup.py with this snippet:

#!/usr/bin/env python
# -*- coding: utf-8 -*-

from setuptools import setup, find_packages

setup(
    install_requires=['pdc-client', 'productmd'],

when I install this with python setup.py develop I end up with an error:

Reading https://pypi.python.org/simple/productmd/
Best match: productmd 1.3
Downloading https://pypi.python.org/packages/d0/33/e4b0c10d3a1f3beaa164338774ead0f7ab6eaf662db4c4a8103f8cf6e515/productmd-1.3.tar.bz2#md5=ef826d4ca3d259294f12545fd5e17618
Processing productmd-1.3.tar.bz2
Writing /tmp/easy_install-BLQA0i/productmd-1.3/setup.cfg
Running productmd-1.3/setup.py -q bdist_egg --dist-dir /tmp/easy_install-BLQA0i/productmd-1.3/egg-dist-tmp-NeluU9
error: requirements.txt: No such file or directory

Apparently the current version of productmd (1.3) on PyPI references requirements.txt, which isn't present in the upstream archive:

# Manage dependencies in requirements.txt
with open('requirements.txt') as f:
    reqs = f.read().splitlines()

Please upload the current version of productmd there.

Note: with pip install productmd I don't see this issue and productmd is installed correctly.

Debuginfo ISOs can not be identified in metadata

If there is an ISO image with debuginfo packages and it's added to the image manifest, there is no way to distinguish it from a regular ISO with binary packages. The file name and path can be different, but both have type dvd and format iso.

Maybe having a different type would be a good solution? Maybe something like dvd-debuginfo?

find_composes prints traceback if compose directory cannot be read

    ci_list = find_composes()
  File "/usr/lib/python2.6/site-packages/distill3_utils/find_latest.py", line 58, in find_composes
    compose = productmd.compose.Compose(compose_path)
  File "/usr/lib/python2.6/site-packages/productmd/compose.py", line 70, in __init__
    for i in os.listdir(compose_path):
OSError: [Errno 13] Permission denied: '.../released/F-20/20-Beta-RC1'

where .../released/F-20/20-Beta-RC1 is unreadable by the user.
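One way to avoid the raw traceback is to filter out unreadable directories before constructing Compose objects. A self-contained sketch (find_composes itself lives in distill3_utils and is not shown here):

```python
import errno
import os


def readable_compose_dirs(paths):
    # Sketch: skip unreadable or missing compose directories up front,
    # instead of letting Compose() blow up with a raw OSError mid-scan.
    usable = []
    for path in paths:
        try:
            os.listdir(path)
        except OSError as exc:
            if exc.errno in (errno.EACCES, errno.ENOENT, errno.ENOTDIR):
                continue  # skip what we cannot read; optionally log it
            raise
        usable.append(path)
    return usable
```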

need to create .composeinfo file

Beaker only supports importing .composeinfo files for composes. They are not going to port to productmd, as they plan to use PDC as the datastore. We need to create .composeinfo files to ensure that composes are importable into Beaker.

Improve error reporting

When trying to load a compose with the productmd.compose.Compose class, an AttributeError is raised if some file is not found. Because of the lazy loading, there is no reliable way to even catch such exceptions.
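A sketch of the requested behavior using a toy stand-in class: keep the lazy loading, but translate the underlying failure into one well-defined, catchable exception with a useful message.

```python
class LazyMetadata:
    # Illustrative stand-in, not productmd's classes: the property stays
    # lazy, but any I/O failure surfaces as a single documented exception.
    def __init__(self, path):
        self._path = path
        self._info = None

    @property
    def info(self):
        if self._info is None:
            try:
                with open(self._path) as f:
                    self._info = f.read()
            except OSError as exc:
                raise RuntimeError(
                    "cannot load compose metadata from %s: %s" % (self._path, exc)
                ) from exc
        return self._info
```

Callers can then catch one exception type instead of guessing which AttributeError means "file missing".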

1.33: pytest is failing

+ /usr/bin/python3 -Bm pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
Using --randomly-seed=2664403714
rootdir: /home/tkloczko/rpmbuild/BUILD/productmd-1.33
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, cases-3.4.6, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, asyncio-0.15.1, toolbox-0.5, xprocess-0.17.1, flaky-3.7.0, aiohttp-0.3.0, checkdocs-2.7.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, Faker-8.4.0, cov-2.12.1, randomly-3.8.0, hypothesis-6.13.12
collected 92 items

tests/test_modules.py ..                                                                                                                                             [  2%]
tests/test_images.py ...........                                                                                                                                     [ 14%]
tests/test_compose.py ....                                                                                                                                           [ 18%]
tests/test_extra_files.py ........                                                                                                                                   [ 27%]
tests/test_rpms.py ..                                                                                                                                                [ 29%]
tests/test_common.py ...                                                                                                                                             [ 32%]
. F                                                                                                                                                                  [ 34%]
tests/test_common.py .........                                                                                                                                       [ 43%]
tests/test_discinfo.py ........                                                                                                                                      [ 52%]
tests/test_header.py ...                                                                                                                                             [ 56%]
tests/test_composeinfo.py ...............................                                                                                                            [ 90%]
tests/test_treeinfo.py .........                                                                                                                                     [100%]

================================================================================= FAILURES =================================================================================
_______________________________________________________________________________ test session _______________________________________________________________________________

cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f0abc4e0040>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>       lambda: ihook(item=item, **kwds), when=when, reraise=reraise
    )

/usr/lib/python3.8/site-packages/_pytest/runner.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_HookCaller 'pytest_runtest_call'>, args = (), kwargs = {'item': <CheckdocsItem project>}, notincall = set()

    def __call__(self, *args, **kwargs):
        if args:
            raise TypeError("hook calling supports only keyword arguments")
        assert not self.is_historic()
        if self.spec and self.spec.argnames:
            notincall = (
                set(self.spec.argnames) - set(["__multicall__"]) - set(kwargs.keys())
            )
            if notincall:
                warnings.warn(
                    "Argument(s) {} which are declared in the hookspec "
                    "can not be found in this hook call".format(tuple(notincall)),
                    stacklevel=2,
                )
>       return self._hookexec(self, self.get_hookimpls(), kwargs)

/usr/lib/python3.8/site-packages/pluggy/hooks.py:286:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_pytest.config.PytestPluginManager object at 0x7f0c55eb1130>, hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/usr/lib/python3.8/site-packages/_pytest/runner...pper name='/dev/null' mode='r' encoding='UTF-8'>> _state='suspended' _in_suspended=False> _capture_fixture=None>>, ...]
kwargs = {'item': <CheckdocsItem project>}

    def _hookexec(self, hook, methods, kwargs):
        # called from all hookcaller instances.
        # enable_tracing will set its own wrapping function at self._inner_hookexec
>       return self._inner_hookexec(hook, methods, kwargs)

/usr/lib/python3.8/site-packages/pluggy/manager.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook = <_HookCaller 'pytest_runtest_call'>
methods = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/usr/lib/python3.8/site-packages/_pytest/runner...pper name='/dev/null' mode='r' encoding='UTF-8'>> _state='suspended' _in_suspended=False> _capture_fixture=None>>, ...]
kwargs = {'item': <CheckdocsItem project>}

>   self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
        methods,
        kwargs,
        firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
    )

/usr/lib/python3.8/site-packages/pluggy/manager.py:84:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/usr/lib/python3.8/site-packages/_pytest/runner...pper name='/dev/null' mode='r' encoding='UTF-8'>> _state='suspended' _in_suspended=False> _capture_fixture=None>>, ...]
caller_kwargs = {'item': <CheckdocsItem project>}, firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).

        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,)
                                )

                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
                        res = hook_impl.function(*args)
                        if res is not None:
                            results.append(res)
                            if firstresult:  # halt further impl calls
                                break
            except BaseException:
                excinfo = sys.exc_info()
        finally:
            if firstresult:  # first result hooks return a single value
                outcome = _Result(results[0] if results else None, excinfo)
            else:
                outcome = _Result(results, excinfo)

            # run all wrapper post-yield blocks
            for gen in reversed(teardowns):
                try:
                    gen.send(outcome)
                    _raise_wrapfail(gen, "has second yield")
                except StopIteration:
                    pass

>           return outcome.get_result()

/usr/lib/python3.8/site-packages/pluggy/callers.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <pluggy.callers._Result object at 0x7f0abc3fd310>

    def get_result(self):
        """Get the result(s) for this hook call.

        If the hook was marked as a ``firstresult`` only a single value
        will be returned otherwise a list of results.
        """
        __tracebackhide__ = True
        if self._excinfo is None:
            return self._result
        else:
            ex = self._excinfo
            if _py3:
>               raise ex[1].with_traceback(ex[2])

/usr/lib/python3.8/site-packages/pluggy/callers.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

hook_impls = [<HookImpl plugin_name='runner', plugin=<module '_pytest.runner' from '/usr/lib/python3.8/site-packages/_pytest/runner...pper name='/dev/null' mode='r' encoding='UTF-8'>> _state='suspended' _in_suspended=False> _capture_fixture=None>>, ...]
caller_kwargs = {'item': <CheckdocsItem project>}, firstresult = False

    def _multicall(hook_impls, caller_kwargs, firstresult=False):
        """Execute a call into multiple python functions/methods and return the
        result(s).

        ``caller_kwargs`` comes from _HookCaller.__call__().
        """
        __tracebackhide__ = True
        results = []
        excinfo = None
        try:  # run impl and wrapper setup functions in a loop
            teardowns = []
            try:
                for hook_impl in reversed(hook_impls):
                    try:
                        args = [caller_kwargs[argname] for argname in hook_impl.argnames]
                    except KeyError:
                        for argname in hook_impl.argnames:
                            if argname not in caller_kwargs:
                                raise HookCallError(
                                    "hook call must provide argument %r" % (argname,)
                                )

                    if hook_impl.hookwrapper:
                        try:
                            gen = hook_impl.function(*args)
                            next(gen)  # first yield
                            teardowns.append(gen)
                        except StopIteration:
                            _raise_wrapfail(gen, "did not yield")
                    else:
>                       res = hook_impl.function(*args)

/usr/lib/python3.8/site-packages/pluggy/callers.py:187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

item = <CheckdocsItem project>

    def pytest_runtest_call(item: Item) -> None:
        _update_current_test_var(item, "call")
        try:
            del sys.last_type
            del sys.last_value
            del sys.last_traceback
        except AttributeError:
            pass
        try:
            item.runtest()
        except Exception as e:
            # Store trace info to allow postmortem debugging
            sys.last_type = type(e)
            sys.last_value = e
            assert e.__traceback__ is not None
            # Skip *this* frame
            sys.last_traceback = e.__traceback__.tb_next
>           raise e

/usr/lib/python3.8/site-packages/_pytest/runner.py:170:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

item = <CheckdocsItem project>

    def pytest_runtest_call(item: Item) -> None:
        _update_current_test_var(item, "call")
        try:
            del sys.last_type
            del sys.last_value
            del sys.last_traceback
        except AttributeError:
            pass
        try:
>           item.runtest()

/usr/lib/python3.8/site-packages/_pytest/runner.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <CheckdocsItem project>

    def runtest(self):
>       desc = self.get_long_description()

/usr/lib/python3.8/site-packages/pytest_checkdocs/__init__.py:29:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <CheckdocsItem project>

    def get_long_description(self):
>       return Description.from_md(ensure_clean(pep517.meta.load('.').metadata))

/usr/lib/python3.8/site-packages/pytest_checkdocs/__init__.py:60:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

root = '.'

    def load(root):
        """
        Given a source directory (root) of a package,
        return an importlib.metadata.Distribution object
        with metadata build from that package.
        """
        root = os.path.expanduser(root)
        system = compat_system(root)
        builder = functools.partial(build, source_dir=root, system=system)
>       path = Path(build_as_zip(builder))

/usr/lib/python3.8/site-packages/pep517/meta.py:71:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

builder = functools.partial(<function build at 0x7f0abdddfee0>, source_dir='.', system={'build-backend': 'setuptools.build_meta:__legacy__', 'requires': ['setuptools', 'wheel']})

    def build_as_zip(builder=build):
        with tempdir() as out_dir:
>           builder(dest=out_dir)

/usr/lib/python3.8/site-packages/pep517/meta.py:58:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_dir = '.', dest = '/tmp/tmp442eb7cw', system = {'build-backend': 'setuptools.build_meta:__legacy__', 'requires': ['setuptools', 'wheel']}

    def build(source_dir='.', dest=None, system=None):
        system = system or load_system(source_dir)
        dest = os.path.join(source_dir, dest or 'dist')
        mkdir_p(dest)
        validate_system(system)
>       hooks = Pep517HookCaller(
            source_dir, system['build-backend'], system.get('backend-path')
        )

/usr/lib/python3.8/site-packages/pep517/meta.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <pep517.wrappers.Pep517HookCaller object at 0x7f0abc3fdee0>, source_dir = '.', build_backend = 'setuptools.build_meta:__legacy__', backend_path = None
runner = <function default_subprocess_runner at 0x7f0abdde08b0>, python_executable = None

    def __init__(
            self,
            source_dir,
            build_backend,
            backend_path=None,
            runner=None,
            python_executable=None,
    ):
        if runner is None:
            runner = default_subprocess_runner

>       self.source_dir = abspath(source_dir)

/usr/lib/python3.8/site-packages/pep517/wrappers.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

path = '.'

    def abspath(path):
        """Return an absolute path."""
        path = os.fspath(path)
        if not isabs(path):
            if isinstance(path, bytes):
                cwd = os.getcwdb()
            else:
>               cwd = os.getcwd()
E               FileNotFoundError: [Errno 2] No such file or directory

/usr/lib64/python3.8/posixpath.py:379: FileNotFoundError
========================================================================= short test summary info ==========================================================================
FAILED ::project - FileNotFoundError: [Errno 2] No such file or directory
======================================================================= 1 failed, 90 passed in 0.96s =======================================================================
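The failure above bottoms out in `os.getcwd()` raising `FileNotFoundError`, which happens when the test process's current working directory has been deleted out from under it. A minimal reproduction of that underlying condition (not of the pytest-checkdocs plugin itself):

```python
import os
import shutil
import tempfile

home = os.getcwd()
d = tempfile.mkdtemp()
os.chdir(d)
shutil.rmtree(d)              # the current working directory no longer exists

try:
    os.getcwd()
    result = "ok"
except FileNotFoundError:     # raised on Linux when cwd has been removed
    result = "cwd missing"
finally:
    os.chdir(home)            # restore a valid cwd

print(result)
```

Anything that later calls `os.path.abspath('.')`, as pep517 does, hits the same error.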

MetadataBase.loads performs validation, but MetadataBase.load appears not to do validation

    def load(self, f):
        """
        Load data from a file.

        :param f: file-like object or path to file
        :type f: file or str
        """
        with open_file_obj(f) as f:
            parser = self.parse_file(f)
            self.deserialize(parser)

    def loads(self, s):
        """
        Load data from a string.

        :param s: input data
        :type s: str
        """
        io = six.StringIO()
        io.write(s)
        io.seek(0)
        self.load(io)
        self.validate()

Why does MetadataBase.loads() call self.validate() but MetadataBase.load() does not?

I would kind of expect self.validate() to be called from MetadataBase.load() so that it would be inherited by MetadataBase.loads().
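The suggested fix can be sketched with a toy stand-in for MetadataBase (the real class lives in productmd; the names below are illustrative only): hoisting the `validate()` call into `load()` means every load path, including `loads()`, is validated.

```python
import io


class ToyMetadata:
    """Illustrative stand-in: load() validates, so loads() inherits it."""

    def __init__(self):
        self.data = None

    def deserialize(self, f):
        self.data = f.read()

    def validate(self):
        if not self.data:
            raise ValueError("no data loaded")

    def load(self, f):
        self.deserialize(f)
        self.validate()            # validation now happens for every load path

    def loads(self, s):
        self.load(io.StringIO(s))  # no separate validate() call needed


md = ToyMetadata()
md.loads("[header]")
print(md.data)
```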

pip install fails for v1.3 on Python 3

Description:
Running pip install productmd fails with the following error on Python 3:

    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-build-aap2fih6/productmd/setup.py", line 25, in <module>
        with open('requirements.txt') as f:
    FileNotFoundError: [Errno 2] No such file or directory: 'requirements.txt'

How to reproduce:

  1. Create and activate a virtualenv with Python 3:

     $ mkvirtualenv -p python3.5 pmd

  2. Install productmd with pip:

     $ pip install productmd
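The crash comes from setup.py reading requirements.txt unconditionally while the sdist did not ship that file. One defensive pattern is to guard the read; the actual fix may instead be to include the file via MANIFEST.in, so treat this helper as a sketch, not the project's implementation:

```python
import os


def read_requirements(path="requirements.txt"):
    """Return requirement lines, or an empty list when the file is absent
    (e.g. in an sdist that forgot to ship it)."""
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.startswith("#")]
```

setup.py can then call `install_requires=read_requirements()` without blowing up on a missing file.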

RFE: add module metadata

We should provide a module.json file in the compose metadata that has info on the modules included in a compose, similar to rpms.json and images.json.
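One possible shape, mirroring the header/payload split used by rpms.json and images.json — every field name and value below is hypothetical, not a settled schema:

```json
{
    "header": {
        "type": "productmd.modules",
        "version": "1.0"
    },
    "payload": {
        "compose": {"id": "Example-1-20180101.n.0"},
        "modules": {
            "Server": {
                "x86_64": {
                    "nodejs:8:20180101:abcdef12": {
                        "uid": "nodejs:8:20180101:abcdef12"
                    }
                }
            }
        }
    }
}
```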

Allow GA label

My product has only a single phase: GA. It doesn't make sense to have Alpha, Beta, ... RC labels for it.

Please add GA to the supported labels, without a number.

Documentation for treeinfo 1.2 is missing

CentOS DVD1 contains a .treeinfo with the following header:

    [header]
    type = productmd.treeinfo
    version = 1.2

I wanted to read some documentation explaining what each option means, but could only find it for version 1.1.
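The header quoted above is plain INI syntax, so while the 1.2 documentation is missing it can at least be inspected with the standard library; a minimal sketch that does not use productmd itself:

```python
import configparser

# The [header] section from the CentOS DVD1 .treeinfo quoted above.
TREEINFO_HEADER = """\
[header]
type = productmd.treeinfo
version = 1.2
"""

parser = configparser.ConfigParser()
parser.read_string(TREEINFO_HEADER)

# The version field tells a consumer which schema the rest of the file follows.
print(parser["header"]["version"])
```

productmd also ships a `productmd.treeinfo.TreeInfo` class for parsing full .treeinfo files, which is the proper tool once the format version is documented.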
