
isogeo-api-py-minsdk's People

Contributors

dependabot-preview[bot], guts, pyup-bot, simonsampere

isogeo-api-py-minsdk's Issues

Unable to create a condition without an associated license

To reproduce:

# create object locally
condition = Condition(
    description="{} - {}".format(get_test_marker(), self.discriminator),
)

# add it to a metadata
condition_created = self.isogeo.metadata.conditions.create(
    metadata=self.fixture_metadata, condition=condition
)

Support metadata search in a group context

Metadata search is accessible from 2 contexts:

  • as authenticated client (application or user) in a global mode: https://api.isogeo.com/resources/search?
  • as authenticated client (application or user) within a group context: https://api.isogeo.com/groups/{group_uuid}/resources/search?

This second case is reserved for applications using user authentication: authorization code, implicit grant, legacy...

Until now, the SDK has only allowed a global search (case 1). The goal is to handle search in a group context as well.
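As a first approximation, a group-scoped search only changes the route. Here is a minimal sketch using requests directly against the two documented routes; the function and parameter names are illustrative, not the SDK's final API:

import requests

API_BASE = "https://api.isogeo.com"

def search_resources(token: str, group_uuid: str = None, **params) -> dict:
    """Search metadata globally, or within a group when group_uuid is given."""
    if group_uuid:
        url = "{}/groups/{}/resources/search".format(API_BASE, group_uuid)  # case 2
    else:
        url = "{}/resources/search".format(API_BASE)  # case 1
    response = requests.get(
        url,
        headers={"Authorization": "Bearer {}".format(token)},
        params=params,
    )
    response.raise_for_status()
    return response.json()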

Downloading uploaded data - Clean up filenames

Problem

When uploading data to Isogeo (see doc), the user is fairly free in how the link and the file are named. As a result, special characters may appear in:

  • the link title
  • the filename

This causes problems when saving the files to disk, and the problem varies with the operating system.

Solution

Add an option to the method handling the download of uploaded data to strip special characters from filenames.

It will be enabled by default.
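A minimal sketch of the kind of sanitizer the option could rely on; the function name and the character whitelist are assumptions, not the SDK's actual implementation:

import re
import unicodedata

def sanitize_filename(filename: str) -> str:
    """Strip accents and replace characters that are unsafe on common filesystems."""
    # decompose accented characters, then drop the combining marks
    normalized = unicodedata.normalize("NFKD", filename)
    ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
    # replace anything outside a conservative whitelist with an underscore
    return re.sub(r"[^A-Za-z0-9._ -]", "_", ascii_only).strip()

# sanitize_filename("données téléversées:2020?.csv") -> "donnees televersees_2020_.csv"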

Unable to update a metadata if the passed object doesn't have certain attributes

To reproduce:

from isogeo_pysdk import Isogeo, Metadata


[..authentication steps..]

# get the metadata to update
metadata_to_update = isogeo.metadata.get(metadata_id=the_uuid_of_the_metadata_to_update)

# simulate a newly empty object
newly_metadata = Metadata(_id=metadata_to_update._id)

# just edit the title
newly_metadata.title = "OMG I've been updated!"

# try to update
isogeo.metadata.update(newly_metadata)
>>> 500: Internal Server Error

Let's add a warning and document that.
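Until the warning exists, the safe pattern is to edit the object returned by metadata.get() rather than a freshly built one. A minimal sketch of the kind of guard the SDK could add before sending the update; the required attribute list below is an assumption, not the API's documented contract:

import logging

logger = logging.getLogger(__name__)

REQUIRED_FOR_UPDATE = ("_id", "type", "title")  # assumption: minimal fields the API expects

def warn_if_incomplete(metadata) -> None:
    """Log a warning when an object to update is missing attributes the API expects."""
    missing = [attr for attr in REQUIRED_FOR_UPDATE if not getattr(metadata, attr, None)]
    if missing:
        logger.warning(
            "Metadata %s is missing attributes %s; the API may answer 500. "
            "Prefer updating the object returned by metadata.get().",
            getattr(metadata, "_id", "<no id>"),
            missing,
        )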

Help to handle invalid attributes

The API uses some characters that are invalid in Python attribute names, such as hyphens (-). That's why some lines are dedicated to preventing issues (example).

Implement a prettier way to handle these invalid attributes in the following models (see the sketch after the list):

  • Catalog
  • Metadata
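A minimal sketch of one possible approach: a single mapping applied to the raw API payload before instantiating the model, instead of per-model ad hoc fixes. The mapping content below is illustrative:

# keys the API may return that are not valid Python identifiers
ATTR_MAP = {
    "coordinate-system": "coordinateSystem",  # hyphens are invalid in identifiers
}

def clean_attributes(raw: dict) -> dict:
    """Return a copy of an API payload with keys safe to use as Python attributes."""
    return {ATTR_MAP.get(key, key.replace("-", "_")): value for key, value in raw.items()}

# usage: Metadata(**clean_attributes(api_payload)) or Catalog(**clean_attributes(api_payload))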

Unable to create a feature-attribute if certain object attributes are missing

To reproduce:

from isogeo_pysdk import Isogeo, FeatureAttribute, Metadata


[..authentication steps..]

# get the metadata to update
metadata_to_update = isogeo.metadata.get(metadata_id=the_uuid_of_the_metadata_to_update)

# create local object
local_obj = FeatureAttribute(
    name="ATTRIBUTE_TECH_NAME",
    description="Field description"
)

# try to create
isogeo.metadata.attributes.create(local_obj)
>>> 500: Internal Server Error

Let's add a warning and document that.

The metadata.exists method does not work

The URL is built incorrectly:

# currently
https://api.qa.isogeo.com/resources/?_lang=fr123456789101112131415
# should be
https://api.qa.isogeo.com/resources/123456789101112131415?_lang=fr
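A minimal sketch of a builder that keeps the resource UUID in the path and the language in the query string; it is written with requests for illustration and is not the SDK's actual code:

import requests

def metadata_exists(base_url: str, token: str, metadata_id: str, lang: str = "fr") -> bool:
    """Check that a metadata record exists: the UUID goes in the path, _lang in the query string."""
    url = "{}/resources/{}".format(base_url, metadata_id)
    response = requests.get(
        url,
        params={"_lang": lang},
        headers={"Authorization": "Bearer {}".format(token)},
    )
    return response.status_code == 200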

Tags can contain duplicated labels

An application accessing a share can encounter tags with the same label. This makes the tags parsing fail (#26) and the comboboxes do not reflect the full set of values.

Sample:

"contact:819c49300b9b4921a527629874b49122:93f847d99b01499bacde089f08359314": "SIG Brest m\u00e9tropole",
"contact:819c49300b9b4921a527629874b49122:9e61a3a3479e49f789c2e275bf396c8e": "SIG Brest m\u00e9tropole",

IsogeoChecker - Error raised with check_api_response

Using IsogeoChecker.check_api_response(), when the API response status_code is 401, the following error occurs.

  File "C:\Users\SimonSAMPERE\Documents\GitHub\migrations-toolbelt\.venv\lib\site-packages\isogeo_pysdk\api\routes_keyword.py", line 520, in tagging
    req_check = checker.check_api_response(req_keyword_associate)
  File "C:\Users\SimonSAMPERE\Documents\GitHub\migrations-toolbelt\.venv\lib\site-packages\isogeo_pysdk\checker.py", line 127, in check_api_response
    response.json().get("error"),
  File "C:\Users\SimonSAMPERE\Documents\GitHub\migrations-toolbelt\.venv\lib\site-packages\requests\models.py", line 897, in json
    return complexjson.loads(self.text, **kwargs)
  File "C:\Users\SimonSAMPERE\AppData\Local\Programs\Python\Python37-32\lib\json\__init__.py", line 348, in loads
    return _default_decoder.decode(s)
  File "C:\Users\USERNAME\AppData\Local\Programs\Python\Python37-32\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Users\SimonSAMPERE\AppData\Local\Programs\Python\Python37-32\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

This issue was found while running a migration script on more than 100 metadata records. It is related to this one.
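A minimal sketch of a more defensive check, guarding the response.json() call so a non-JSON 401 body no longer raises; this is a possible fix, not the SDK's current code:

import logging
import requests

logger = logging.getLogger(__name__)

def check_api_response(response: requests.Response) -> bool:
    """Return True on a 2xx response, log the error otherwise, without assuming a JSON body."""
    if response.ok:
        return True
    try:
        error = response.json().get("error")
    except ValueError:  # covers json.JSONDecodeError on empty or HTML bodies
        error = response.text[:200]
    logger.error("API returned %s %s: %s", response.status_code, response.reason, error)
    return False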

Add the capability to build a URL for CSW GetRecords

The method that builds the CSW GetRecord URL returns a GetRecordById request:

"csw_getrec": {
"args": ("md_uuid_urn", "share_id", "share_token"),
"url": "https://services.api.isogeo.com/ows/s/"
"{share_id}/{share_token}?service=CSW"
"&version=2.0.2&request=GetRecordById"
"&id={md_uuid_urn}&elementsetname=full"
"&outputSchema=http://www.isotc211.org/2005/gmd",
},

It would be nice to add a helper to build a GetRecords URL:

http://api.isogeo.com/services/ows/s/{share_id}/c/{cat_id}/{token}?service=CSW&version=2.0.2&request=GetRecords&ResultType=results&ElementSetName=brief&maxRecords=20&OutputFormat=application%2Fxml&OutputSchema=http%3A%2F%2Fwww.opengis.net%2Fcat%2Fcsw%2F2.0.2&namespace=xmlns(csw=http%3A%2F%2Fwww.opengis.net%2Fcat%2Fcsw%2F2.0.2)&TypeNames=csw%3ARecord&startPosition=1
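A minimal sketch of such a helper, built from the example URL above; the defaults mirror that example and are assumptions, not fixed requirements:

from urllib.parse import urlencode

def csw_getrecords_url(share_id: str, cat_id: str, token: str,
                       max_records: int = 20, start_position: int = 1) -> str:
    """Build a CSW 2.0.2 GetRecords URL for a given share, catalog and token."""
    base = "http://api.isogeo.com/services/ows/s/{}/c/{}/{}".format(share_id, cat_id, token)
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "ResultType": "results",
        "ElementSetName": "brief",
        "maxRecords": max_records,
        "OutputFormat": "application/xml",
        "OutputSchema": "http://www.opengis.net/cat/csw/2.0.2",
        "namespace": "xmlns(csw=http://www.opengis.net/cat/csw/2.0.2)",
        "TypeNames": "csw:Record",
        "startPosition": start_position,
    }
    return "{}?{}".format(base, urlencode(params))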

Add share code to query parameters

Context

When a search is sent to Isogeo API, the response contains the query parameters. For example, filtering on the tests keyword:

{
    "tags": {...},
    "envelope": null,
    "query": {
        "_tags": [
            "keyword:isogeo:tests"
        ],
        "_terms": []
    },
    "results": [...],
    "offset": 0,
    "limit": 20,
    "total": 43
}

Problem

But this response does not contain the share UUID used to filter (see: API doc [FR]).

To do

When option augment (see: package doc - in source code) is passed to the search method, the Python package should add the share ID to the returned query dict.
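A minimal sketch of that post-processing, assuming the caller already knows which share it queried; the _shares key name is an assumption for illustration:

def augment_query_with_share(search_response: dict, share_id: str) -> dict:
    """Add the share UUID to the 'query' section of a search response."""
    query = search_response.setdefault("query", {})
    query["_shares"] = [share_id]
    return search_response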

Add a method to compare 2 metadata

SPEC IN PROGRESS

Goal

Build an object to compare 2 metadata records from Isogeo.
It calculates a similarity score based on the passed options and criteria.

Input

It takes 2 metadata:

  • source_metadata
  • target_metadata

Criteria

TO DEFINE

Options

Ignoring some fields

It allows ignoring some fields (abstract, path...).

Applying a transformation

It allows applying a transformation to a specific field.

Example:

{"path": str.lower()}

Output

It produces a spreadsheet with this structure:

  1. source_metadata_uuid
  2. source_metadata_title
  3. source_metadata_hash
  4. target_metadata_uuid
  5. target_metadata_title
  6. target_metadata_hash
  7. similarity_score
  8. comment
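A minimal skeleton of the comparator, with a naive field-by-field score; since the criteria are still to be defined, the scoring below is purely illustrative:

from dataclasses import dataclass, field

@dataclass
class MetadataComparator:
    ignored_fields: set = field(default_factory=set)      # e.g. {"abstract", "path"}
    transformations: dict = field(default_factory=dict)   # e.g. {"path": str.lower}

    def similarity(self, source: dict, target: dict) -> float:
        """Share of compared fields whose (optionally transformed) values are equal."""
        fields = (set(source) | set(target)) - self.ignored_fields
        if not fields:
            return 1.0
        matches = 0
        for name in fields:
            transform = self.transformations.get(name)
            src_value, tgt_value = source.get(name), target.get(name)
            if transform is not None:
                src_value = transform(src_value) if src_value is not None else None
                tgt_value = transform(tgt_value) if tgt_value is not None else None
            if src_value == tgt_value:
                matches += 1
        return matches / len(fields)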

tags_as_dict option should apply to returned query

Context

Derived from #30.

When a search is sent to Isogeo API, the response contains the query parameters. For example, filtering on the tests keyword:

{
    "tags": {...},
    "envelope": null,
    "query": {
        "_tags": [
            "keyword:isogeo:tests"
        ],
        "_terms": []
    },
    "results": [...],
    "offset": 0,
    "limit": 20,
    "total": 43
}

Idea

When option tags_as_dict (see: in source code) is passed to the search method, the Python package should transform query/_tags into a dict.
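A minimal sketch of the transformation on the query side; the output shape, grouping tag values under their prefix, is an assumption for illustration:

def tags_list_to_dict(tags_list: list) -> dict:
    """Turn ['keyword:isogeo:tests', ...] into {'keyword:isogeo': ['tests'], ...}."""
    tags_dict = {}
    for tag in tags_list:
        prefix, _, value = tag.rpartition(":")
        tags_dict.setdefault(prefix, []).append(value)
    return tags_dict

# search_response["query"]["_tags"] = tags_list_to_dict(search_response["query"]["_tags"])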

Metadata update - Implement bulk method in API for batch editing

Context

Editing requests are too inefficient and destabilize the platform, even when they are sent asynchronously.

In the API source code, we can see there is a bulk method connected to the POST /resources route:

https://github.com/isogeo/isogeo-api/blob/7fabd862d95dc117ea59cb2b6848c0c1df261c87/Api.V10/HttpRouteCollectionExtensions.cs#L161

Description of the bulk feature in the API

Source code

https://github.com/isogeo/isogeo-api/blob/master/Api.V10/Business/Bulk/Bulk.cs

https://github.com/isogeo/isogeo-api/blob/master/Api.V10/Business/Bulk/BulkAction.cs
https://github.com/isogeo/isogeo-api/blob/master/Api.V10/Business/Bulk/BulkTarget.cs
https://github.com/isogeo/isogeo-api/blob/master/Api.V10/Business/Bulk/BulkQuery.cs

Functional

The best way to see bulk in action is to use the batch tag/edit button in app.isogeo.com.


Implementation

Using the model of Prepared requests:

  • add related enums:
    • Actions
    • Targets
    • Ignored Reasons
  • add related models:
    • Bulk Request
    • Bulk Report
  • add a dedicated module called Bulk with:
    • a method prepare which can be called to add actions to perform later
    • a method send to launch the final request with the consolidated data

Script example

# get catalogs and keywords to tag metadatas with
cata = isogeo.catalog.get(
    workgroup_id={group_uuid},
    catalog_id={catalog_uuid},
)

kwd = isogeo.keyword.get(keyword_id={keyword_uuid})

# prepare JSON
data = [
    {
        "action": "add",
        "target": "catalogs",
        "query": {
            "ids": [
                "{metadata_X_uuid}",
                "{metadata_Y_uuid}",
                "{metadata_Z_uuid}",
            ],
        },
        "model": [cata.to_dict()],
    },
    {
        "action": "add",
        "target": "keywords",
        "query": {
            "ids": [
                "{metadata_X_uuid}",
                "{metadata_A_uuid}",
                "{metadata_G_uuid}",
            ],
        },
        "model": [kwd.to_dict()],
    },
]

# send
req = isogeo.post(
    url="https://api.qa.isogeo.com/resources/",
    json=data,
    headers=isogeo.header,
    proxies=isogeo.proxies,
    verify=isogeo.ssl,
    timeout=isogeo.timeout,
)

# print(req.status_code)

# -- END -------
isogeo.close()  # close session


FeatureAttributes - Missing fields in the model

length, scale and precision are three attribute description fields created by the Scan FME, but they do not appear in app.

These three fields do not appear in the API documentation, so they are not handled by the FeatureAttribute class of the Python SDK.

The following Python error was reported while using the duplicator module of migration-toolbelt to duplicate a metadata record describing a vector dataset, some of whose attributes were described by the length, scale or precision fields.

Traceback (most recent call last):
  File "c:/$USERNAME/Documents/GitHub/migrations-toolbelt/scripts/jura/duplicate_script_jura.py", line 178, in <module>
    copymark_title=False
  File "c:\$USERNAME\documents\github\migrations-toolbelt\isogeo_migrations_toolbelt\duplicate\duplicator.py", line 534, in duplicate_into_other_group
    metadata_source=self.metadata_source, metadata_dest=md_dst
  File "C:\$USERNAME\Documents\GitHub\migrations-toolbelt\.venv\lib\site-packages\isogeo_pysdk\api\routes_feature_attributes.py", line 379, in import_from_dataset
    attribute = FeatureAttribute(**attribute)
TypeError: __init__() got an unexpected keyword argument 'length'

When importing the attributes of the source record into the newly created destination record, a request is made for the attributes of the source layer. For each attribute returned by the API, the FeatureAttribute class is instantiated. When the attribute object returned by the API has a key other than _id, alias, dataType, description, language or name, the error above is raised during instantiation.
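A minimal sketch of a more tolerant constructor: it accepts length, scale and precision and ignores any other unknown key instead of raising TypeError. The attribute list comes from the description above; the logging behaviour is an assumption:

import logging

logger = logging.getLogger(__name__)

class FeatureAttribute:
    """Sketch: known attributes are set on the instance, unknown keys are logged and dropped."""

    ATTR_TYPES = ("_id", "alias", "dataType", "description", "language", "name",
                  "length", "scale", "precision")

    def __init__(self, **kwargs):
        for key in self.ATTR_TYPES:
            setattr(self, key, kwargs.pop(key, None))
        if kwargs:
            logger.debug("Ignored unexpected attributes: %s", list(kwargs))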

Resources:

  • UUID of a sample record for length: 6c4b948c72c248bc95964201da654176
  • UUID of a sample record for scale and precision: 0003fba1559945879f0205f8c652afa8
  • associated log file: migration_jura.log
