
project-config's Issues

Add base field Types

Currently project_config supports the specification of field types as part of Model declaration. Right now I am just using pure Python types during testing (and they aren't yet checked/enforced). Going forward, providing custom base types is preferred, in order to facilitate JSONSchema generation and to map types onto different storage mediums (JSON, YAML, XML, etc.).

  • Base types should provide mapping to JSON types
  • Custom types should also provide a mapping to JSON types, if one cannot be inferred.

e.g.

import project_config as pc
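# Setting and SettingGroupValidation are assumed to be pc.Model subclasses defined elsewhere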


class Extractor(pc.Model):
    __metadata__ = {"additionalProperties": True}
    name = pc.Field(str, required=True)
    inherit_from = pc.Field(str)
    pip_url = pc.Field(str)
    variant = pc.Field(str)
    namespace = pc.Field(str)
    config = pc.Field(str)
    label = pc.Field(str)
    logo_url = pc.Field(str)
    executable = pc.Field(str)
    settings = pc.Array(Setting)
    docs = pc.Field(str)
    settings_group_validation = pc.Array(SettingGroupValidation)
    commands = pc.Field(str)

would become:

import project_config as pc


class Extractor(pc.Model):
    __metadata__ = {"additionalProperties": True}
    name = pc.Field(pc.String, required=True)
    inherit_from = pc.Field(pc.String)
    pip_url = pc.Field(pc.String)
    variant = pc.Field(pc.String)
    namespace = pc.Field(pc.String)
    config = pc.Field(pc.String)
    label = pc.Field(pc.String)
    logo_url = pc.Field(pc.String)
    executable = pc.Field(pc.String)
    settings = pc.Array(Setting)
    docs = pc.Field(pc.String)
    settings_group_validation = pc.Array(SettingGroupValidation)
    commands = pc.Field(pc.String)
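
Below is a minimal sketch of what such base types could provide, assuming a hypothetical json_type attribute that is consumed during JSONSchema generation; the class names and layout are illustrative only, not part of the current API:

class BaseType:
    """Illustrative base type: each type carries its own JSON type mapping."""

    json_type = "string"   # JSON Schema type emitted during schema generation
    python_type = str      # native type used for validation

    @classmethod
    def to_json_schema(cls):
        return {"type": cls.json_type}


class String(BaseType):
    json_type = "string"
    python_type = str


class Integer(BaseType):
    json_type = "integer"
    python_type = int


class PipURL(String):
    """Custom type: its JSON type is inferred from the String parent."""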

Consider storage mechanisms

Much as SQLAlchemy maps Model subclasses directly to database tables, project_config will map Models to files on disk. My current idea is to define a __storage_root__ attribute on the project_config.Model class, and to provide a .commit() method on each concrete property that recurses up through its parents, in an arbitrary hierarchy, until the root is found. Root Models can be given a StorageHandler instance to actually process the .commit(). This assumes that, to commit a child property, the whole document must be committed, as is the case with YAML. In future it may be possible to commit just a child property on its own.

e.g.

import project_config as pc

class Extractor(pc.Model):
    __metadata__ = {"additionalProperties": True}
    name = pc.Field(str, required=True)
    inherit_from = pc.Field(str)
    pip_url = pc.Field(str)
    variant = pc.Field(str)
    namespace = pc.Field(str)
    config = pc.Field(str)
    label = pc.Field(str)
    logo_url = pc.Field(str)
    executable = pc.Field(str)
    settings = pc.Array(Setting)
    docs = pc.Field(str)
    settings_group_validation = pc.Array(SettingGroupValidation)
    commands = pc.Field(str)


class Plugins(pc.Model):
    __metadata__ = {"additionalProperties": False}
    extractors = pc.Array(Extractor)


class ProjectFile(pc.Model):
    __storage_root__ = True
    __metadata__ = {
        "title": "A Project file.",
        "description": "Project is an Open Source project. Read more at https://github.com/kgpayne/project-config",
    }
    plugins = pc.Field(Plugins)


example_project_values = {
    "plugins": {
        "extractors": [
            {
                "name": "test-extractor",
                "inherit_from": "test-inherit-from",
                "pip_url": "test-pip-url",
            }
        ]
    }
}

project_file = ProjectFile.from_dict(values=example_project_values, storage_handler=pc.YAML())

project_file.commit()
# creates a file called `project_file.yml` in the current working dir (defaults set in the YAML storage handler)

extractor = project_file.plugins.extractors[0]
extractor.name = "new-extractor-name"
assert extractor._has_changes
extractor.commit()
# overwrites `project_file.yml` with updated extractor name
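
Below is a rough sketch of how the .commit() recursion described above could work, assuming hypothetical _parent bookkeeping on each property and a StorageHandler.write() method; all names here (including to_dict()) are placeholders rather than the existing API:

class StorageHandler:
    """Illustrative handler interface; pc.YAML() would serialise to a .yml file."""

    def write(self, document):
        raise NotImplementedError


class Property:
    """Illustrative base for concrete properties (Models, Arrays, Fields)."""

    _parent = None            # set when the property is attached to a parent
    _storage_handler = None   # only set on the Model marked __storage_root__

    def _find_root(self):
        # Recurse up the parent chain until the storage root is reached.
        node = self
        while node._parent is not None:
            node = node._parent
        return node

    def commit(self):
        root = self._find_root()
        if root._storage_handler is None:
            raise RuntimeError("No StorageHandler configured on the root Model")
        # Committing any child re-serialises the whole document, as with YAML.
        root._storage_handler.write(root.to_dict())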

Add docs/description support to models and fields

In addition to field types, it would be useful (when generating JSONSchema and docs) to be able to annotate fields with a description and examples:

import project_config as pc

class Extractor(pc.Model):
    __metadata__ = {"additionalProperties": True}
    name = pc.Field(
        str, required=True,
        description="The name of the plugin.",
        examples=["tap-jsonl"]
    )
    inherit_from = pc.Field(
        str, description="An existing plugin to inherit from."
    )
    pip_url = pc.Field(
        str, description="Pip url to install extractor from.",
        examples=[
            "target-jsonl",
            "git+https://gitlab.com/meltano/tap-facebook.git",
            "wtforms==2.2.1 apache-airflow==1.10.2"
        ]
    )
    variant = pc.Field(str, description="The variant of the plugin.")

would produce the following JSONSchema snippet:

"name": {
    "type": "string",
    "description": "The name of the plugin.",
    "examples": [
        "tap-jsonl"
    ]
},
"inherit_from": {
    "type": "string",
    "description": "An existing plugin to inherit from."
},
"pip_url": {
    "type": "string",
    "description": "The pip hosted package name or URL",
    "examples": [
        "target-jsonl",
        "git+https://gitlab.com/meltano/tap-facebook.git",
        "wtforms==2.2.1 apache-airflow==1.10.2"
    ]
},
"variant": {
    "type": "string",
    "description": "The variant of the plugin."
},
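
One way the schema generation could pick these annotations up (a sketch only; the field.type, description and examples attributes, and the type-to-string mapping, are assumptions):

def field_to_json_schema(field):
    """Build a JSONSchema property dict from a Field's annotations (sketch)."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    schema = {"type": type_map.get(field.type, "string")}
    if getattr(field, "description", None):
        schema["description"] = field.description
    if getattr(field, "examples", None):
        schema["examples"] = list(field.examples)
    return schema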

Add validation on instantiation and modification of properties

Currently project_config does not check or enforce the typing declared by Models, either when Property classes are created or when their attributes are modified. To ensure that passed values conform to the Model declarations, and that values do not violate Model schemas after modification, we should add type-checking/validation (recursively) at both points.

e.g.

import pytest

import project_config as pc


def test_required_field(example_project_dict):
    """Ideally this should raise on instantiation.
    TODO: add validation on creation
    """
    # remove required field
    example_project_dict["plugins"]["extractors"][0].pop("name")
    # check if error is raised correctly
    with pytest.raises(pc.RequiredFieldError):
        pf = ProjectFile.from_dict(example_project_dict)
        # raises on access, because of lazy-loading
        pf.plugins.extractors[0].name
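
Below is a sketch of where such checks could hook in, using a descriptor that validates on every assignment; the descriptor and its attribute names are illustrative, and only RequiredFieldError mirrors the test above:

class RequiredFieldError(Exception):
    """Raised when a required field is missing or set to None."""


class TypedField:
    """Illustrative descriptor that validates values on assignment."""

    def __init__(self, field_type, required=False):
        self.field_type = field_type
        self.required = required
        self.name = None

    def __set_name__(self, owner, name):
        self.name = name

    def __set__(self, instance, value):
        if value is None:
            if self.required:
                raise RequiredFieldError(f"'{self.name}' is required")
        elif not isinstance(value, self.field_type):
            raise TypeError(
                f"'{self.name}' expects {self.field_type.__name__}, "
                f"got {type(value).__name__}"
            )
        instance.__dict__[self.name] = value

If from_dict assigns each incoming value through the same descriptors, a missing required field raises eagerly on instantiation rather than on first (lazy) access.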
