ging / fiware-draco

The Draco Generic Enabler is an alternative data persistence mechanism for managing the history of context. It is based on Apache NiFi and is a dataflow system based on the concepts of flow-based programming. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic and also offers an intuitive graphical interface

Home Page: https://fiware-draco.readthedocs.io/en/latest/

License: Apache License 2.0

Java 96.60% Shell 2.58% Dockerfile 0.58% HTML 0.23%
fiware fiware-draco nifi processor ngsi data-flow flowfile

fiware-draco's Introduction

FIWARE Draco



What is Draco?

This project is part of FIWARE, within the Core Context Management chapter.

Draco is an easy-to-use, powerful, and reliable system to process and distribute data. Internally, Draco is based on Apache NiFi, a dataflow system built on the concepts of flow-based programming. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic, and was built to automate the flow of data between systems. While the term 'dataflow' is used in a variety of contexts, we use it here to mean the automated and managed flow of information between systems.

📚 Documentation 🎓 Academy 🐳 Docker Hub 🎯 Roadmap

Terminology

In order to talk about Draco, there are a few key terms that readers should be familiar with. We will explain those NiFi-specific terms here, at a high level.

FlowFile: Each piece of "User Data" (i.e., data that the user brings into NiFi for processing and distribution) is referred to as a FlowFile. A FlowFile is made up of two parts: Attributes and Content. The Content is the User Data itself. Attributes are key-value pairs that are associated with the User Data.
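As an illustrative sketch (this is a conceptual model, not NiFi's actual Java API), a FlowFile can be thought of as an opaque content payload paired with an attribute map:

```python
# Illustrative model only -- not NiFi's actual Java API.
from dataclasses import dataclass, field

@dataclass
class FlowFile:
    content: bytes                                   # the User Data itself
    attributes: dict = field(default_factory=dict)   # key-value metadata

ff = FlowFile(
    content=b'{"id": "urn:ngsi-ld:Device:tractor001", "type": "Tractor"}',
    attributes={"fiware-service": "openiot", "mime.type": "application/json"},
)
assert ff.attributes["fiware-service"] == "openiot"
```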

Processor: The Processor is the NiFi component that is responsible for creating, sending, receiving, transforming, routing, splitting, merging, and processing FlowFiles. It is the most important building block available to NiFi users to build their dataflows.

Why use Draco?

Draco is designed to run a specific set of processors and templates for persisting context data to multiple sinks.

The current stable release is able to persist the following sources of data in the following third-party storages:

  • NGSI-like context data in:
    • MySQL, the well-known relational database manager.
    • MongoDB, the NoSQL document-oriented database.
    • PostgreSQL, the well-known relational database manager.
    • Cassandra, the distributed wide-column database.
    • Carto, for geospatial data.
    • HDFS, the Hadoop distributed file system.
    • DynamoDB, Amazon's cloud key-value database.
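For row-like persistence, each notified entity is flattened into one record per notification. A minimal sketch of that mapping (illustrative only, not Draco's actual code; the recvTime/entityId/entityType column pattern matches the SQL that Draco generates):

```python
# Hypothetical flattening of one NGSI-v2 entity into a row-like record.
# Column names mirror the recvTime/entityId/entityType pattern visible in
# Draco's generated SQL, but this is not Draco's actual implementation.
from datetime import datetime, timezone

def entity_to_row(entity, recv_time=None):
    row = {
        "recvTime": recv_time or datetime.now(timezone.utc).isoformat(),
        "entityId": entity["id"],
        "entityType": entity["type"],
    }
    for name, attr in entity.items():
        if name not in ("id", "type"):
            row[name] = attr.get("value")   # one column per attribute
    return row

row = entity_to_row({
    "id": "urn:ngsi-ld:Product:0102",
    "type": "Product",
    "price": {"type": "Integer", "value": 99},
})
assert row["entityId"] == "urn:ngsi-ld:Product:0102" and row["price"] == 99
```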

Draco's place in the FIWARE architecture

Draco plays the role of a connector between the Orion Context Broker (an NGSI source of data) and many external and FIWARE storage systems, such as MySQL and MongoDB.

FIWARE architecture

How to Deploy?

The easiest way to deploy Draco is to run the container available on Docker Hub.

Start a container for this image by typing in a terminal:

 $ docker run --name draco -p 8443:8443 -p 5050:5050 -d ging/fiware-draco

However, if you want a custom installation, please refer to the Installation and Administration Guide at readthedocs.org.

Usage: Overview

The best way to start with Draco is to follow the Quick Start Guide found at readthedocs.org, which provides a good documentation summary.

Nevertheless, the Installation and Administration Guide, also found at readthedocs.org, covers more advanced topics.

The Processors Catalogue completes the available documentation for Draco.

Training courses

Academy Courses

Some lessons on Draco Fundamentals will be offered soon in the FIWARE Academy.

Examples

Several examples are provided to facilitate getting started with this GE. They are hosted in the official documentation at Read the Docs.

Testing

In order to test the code:

$ mvn clean test -Dtest=Test* cobertura:cobertura coveralls:report -Padd-dependencies-for-IDEA

Quality Assurance

This project is part of FIWARE and has been rated as follows:

  • Version Tested: TBD
  • Documentation: TBD
  • Responsiveness: TBD
  • FIWARE Testing: TBD

Roadmap

The list of features planned for the subsequent release is available in the ROADMAP file.

Maintainers

@anmunoz.

Licensing

Draco: except as otherwise noted, this software is licensed under the Apache License, Version 2.0.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Reporting issues and contact information

For any questions you may have, please contact the Draco Core Team.

fiware-draco's People

Contributors

anmunoz, codacy-badger, dependabot[bot], jason-fox, javicond3, josevirseda, pooja1pathak, snyk-bot, sonsoleslp, veronicapp

Stargazers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

Watchers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

fiware-draco's Issues

Draco insert data to PostgreSQL for LD

Following the tutorial https://github.com/FIWARE/tutorials.Historic-Context-NIFI/tree/NGSI-LD, I cannot store data in PostgreSQL.
I added the template for the tutorial (POSTGRESQL-TUTORIAL), entered the password, and changed the NGSI Version to ld. I then subscribed to context changes; orion-ld is running.
The following log is from PostgreSQL:

The files belonging to this database system will be owned by user "postgres".
This user must also own the server process.
The database cluster will be initialized with locale "en_US.utf8".
The default database encoding has accordingly been set to "UTF8".
The default text search configuration will be set to "english".
Data page checksums are disabled.
fixing permissions on existing directory /var/lib/postgresql/data ... ok
creating subdirectories ... ok
selecting dynamic shared memory implementation ... posix
selecting default max_connections ... 100
selecting default shared_buffers ... 128MB
selecting default time zone ... Etc/UTC
creating configuration files ... ok
running bootstrap script ... ok
performing post-bootstrap initialization ... ok
syncing data to disk ... ok
Success. You can now start the database server using:
    pg_ctl -D /var/lib/postgresql/data -l logfile start
initdb: warning: enabling "trust" authentication for local connections
You can change this by editing pg_hba.conf or using the option -A, or
--auth-local and --auth-host, the next time you run initdb.
waiting for server to start....2021-07-19 16:51:41.389 UTC [48] LOG:  starting PostgreSQL 13.3 (Debian 13.3-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
2021-07-19 16:51:41.394 UTC [48] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
2021-07-19 16:51:41.409 UTC [49] LOG:  database system was shut down at 2021-07-19 16:51:40 UTC
2021-07-19 16:51:41.416 UTC [48] LOG:  database system is ready to accept connections
 done
server started
/usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
2021-07-19 16:51:41.573 UTC [48] LOG:  received fast shutdown request
waiting for server to shut down....2021-07-19 16:51:41.582 UTC [48] LOG:  aborting any active transactions
2021-07-19 16:51:41.583 UTC [48] LOG:  background worker "logical replication launcher" (PID 55) exited with exit code 1
2021-07-19 16:51:41.584 UTC [50] LOG:  shutting down
2021-07-19 16:51:41.675 UTC [48] LOG:  database system is shut down
 done
server stopped
PostgreSQL init process complete; ready for start up.
2021-07-19 16:51:41.804 UTC [1] LOG:  starting PostgreSQL 13.3 (Debian 13.3-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
2021-07-19 16:51:41.804 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
2021-07-19 16:51:41.804 UTC [1] LOG:  listening on IPv6 address "::", port 5432
2021-07-19 16:51:41.815 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
2021-07-19 16:51:41.833 UTC [67] LOG:  database system was shut down at 2021-07-19 16:51:41 UTC
2021-07-19 16:51:41.844 UTC [1] LOG:  database system is ready to accept connections
2021-07-19 16:56:30.298 UTC [80] ERROR:  column "recvtime" of relation "urn_ngsi_ld_device_tractor003" already exists
2021-07-19 16:56:30.298 UTC [80] STATEMENT:  Alter table openiot.urn_ngsi_ld_Device_tractor003 ADD COLUMN recvtime text, ADD COLUMN entityid text, ADD COLUMN entitytype text, ADD COLUMN controllingasset text, ADD COLUMN supportedprotocol text, ADD COLUMN supportedprotocol_observedat text, ADD COLUMN statusdescription text, ADD COLUMN statusdescription_observedat text, ADD COLUMN temperature text, ADD COLUMN temperature_unitcode text, ADD COLUMN description text, ADD COLUMN description_observedat text, ADD COLUMN controlledproperty text, ADD COLUMN controlledproperty_observedat text, ADD COLUMN location text, ADD COLUMN category text, ADD COLUMN category_observedat text, ADD COLUMN status text, ADD COLUMN status_observedat text
2021-07-19 16:56:30.307 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.307 UTC [80] STATEMENT:  create schema if not exists openiot
2021-07-19 16:56:30.312 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.312 UTC [80] STATEMENT:  create schema if not exists openiot
2021-07-19 16:56:30.315 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.315 UTC [80] STATEMENT:  create schema if not exists openiot
2021-07-19 16:56:30.318 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.318 UTC [80] STATEMENT:  Insert into openiot.urn_ngsi_ld_Device_tractor003 (recvTime,entityId,entityType,controllingAsset,supportedProtocol,supportedProtocol_observedAt,statusDescription,statusDescription_observedAt,temperature,temperature_unitCode,description,description_observedAt,controlledProperty,controlledProperty_observedAt,location,category,category_observedAt,status,status_observedAt) values ('07/19/2021 16:56:29','urn:ngsi-ld:Device:tractor003','Tractor','urn:ngsi-ld:Building:tower003','ul20','2021-07-19T16:56:29.000Z','IDLE','2021-07-19T16:56:29.000Z','{"@value":null,"@type":"Intangible"}','CEL','Temperature Sensor','2021-07-19T16:56:29.000Z','temperature','2021-07-19T16:56:29.000Z','{"coordinates":[13.3598,52.5165],"type":"Point"}','sensor','2021-07-19T16:56:29.000Z','0','2021-07-19T16:56:29.000Z')
2021-07-19 16:56:30.327 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.327 UTC [80] STATEMENT:  Insert into openiot.urn_ngsi_ld_Device_tractor002 (recvTime,entityId,entityType,controllingAsset,supportedProtocol,supportedProtocol_observedAt,statusDescription,statusDescription_observedAt,temperature,temperature_unitCode,description,description_observedAt,controlledProperty,controlledProperty_observedAt,location,category,category_observedAt,status,status_observedAt) values ('07/19/2021 16:56:29','urn:ngsi-ld:Device:tractor002','Tractor','urn:ngsi-ld:Building:barn002','ul20','2021-07-19T16:56:29.000Z','IDLE','2021-07-19T16:56:29.000Z','{"@value":null,"@type":"Intangible"}','CEL','Temperature Sensor','2021-07-19T16:56:29.000Z','temperature','2021-07-19T16:56:29.000Z','{"coordinates":[13.3698,52.5163],"type":"Point"}','sensor','2021-07-19T16:56:29.000Z','0','2021-07-19T16:56:29.000Z')
2021-07-19 16:56:30.328 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.328 UTC [80] STATEMENT:  Insert into openiot.urn_ngsi_ld_Device_tractor004 (recvTime,entityId,entityType,controllingAsset,supportedProtocol,supportedProtocol_observedAt,statusDescription,statusDescription_observedAt,temperature,temperature_unitCode,description,description_observedAt,controlledProperty,controlledProperty_observedAt,location,category,category_observedAt,status,status_observedAt) values ('07/19/2021 16:56:29','urn:ngsi-ld:Device:tractor004','Tractor','urn:ngsi-ld:Building:farm002','ul20','2021-07-19T16:56:29.000Z','IDLE','2021-07-19T16:56:29.000Z','{"@value":null,"@type":"Intangible"}','CEL','Temperature Sensor','2021-07-19T16:56:29.000Z','temperature','2021-07-19T16:56:29.000Z','{"coordinates":[13.3127,52.4893],"type":"Point"}','sensor','2021-07-19T16:56:29.000Z','0','2021-07-19T16:56:29.000Z')
2021-07-19 16:56:30.329 UTC [80] ERROR:  current transaction is aborted, commands ignored until end of transaction block
2021-07-19 16:56:30.329 UTC [80] STATEMENT:  Insert into openiot.urn_ngsi_ld_Device_tractor001 (recvTime,entityId,entityType,controllingAsset,supportedProtocol,supportedProtocol_observedAt,statusDescription,statusDescription_observedAt,temperature,temperature_unitCode,description,description_observedAt,controlledProperty,controlledProperty_observedAt,location,category,category_observedAt,status,status_observedAt) values ('07/19/2021 16:56:29','urn:ngsi-ld:Device:tractor001','Tractor','urn:ngsi-ld:Building:farm001','ul20','2021-07-19T16:56:29.000Z','IDLE','2021-07-19T16:56:29.000Z','{"@value":null,"@type":"Intangible"}','CEL','Temperature Sensor','2021-07-19T16:56:29.000Z','temperature','2021-07-19T16:56:29.000Z','{"coordinates":[13.3505,52.5144],"type":"Point"}','sensor','2021-07-19T16:56:29.000Z','0','2021-07-19T16:56:29.000Z')

It seems that for some reason Draco did not create the schema to store the data.
Running \dn with postgresql-client returns:

  List of schemas
  Name  |  Owner   
--------+----------
 public | postgres
(1 row)

curl -X GET 'http://localhost:9090/nifi-api/system-diagnostics'

{"systemDiagnostics":{"aggregateSnapshot":{"totalNonHeap":"216 MB","totalNonHeapBytes":226492416,"usedNonHeap":"203.49 MB","usedNonHeapBytes":213375160,"freeNonHeap":"12.51 MB","freeNonHeapBytes":13117256,"maxNonHeap":"-1 bytes","maxNonHeapBytes":-1,"totalHeap":"497.5 MB","totalHeapBytes":521666560,"usedHeap":"143.18 MB","usedHeapBytes":150139712,"freeHeap":"354.32 MB","freeHeapBytes":371526848,"maxHeap":"497.5 MB","maxHeapBytes":521666560,"heapUtilization":"29.0%","availableProcessors":8,"processorLoadAverage":1.2744140625,"totalThreads":103,"daemonThreads":28,"uptime":"01:03:34.973","flowFileRepositoryStorageUsage":{"freeSpace":"92.44 GB","totalSpace":"221.04 GB","usedSpace":"128.6 GB","freeSpaceBytes":99257556992,"totalSpaceBytes":237338017792,"usedSpaceBytes":138080460800,"utilization":"58.0%"},"contentRepositoryStorageUsage":[{"identifier":"default","freeSpace":"92.44 GB","totalSpace":"221.04 GB","usedSpace":"128.6 GB","freeSpaceBytes":99257556992,"totalSpaceBytes":237338017792,"usedSpaceBytes":138080460800,"utilization":"58.0%"}],"provenanceRepositoryStorageUsage":[{"identifier":"default","freeSpace":"92.44 GB","totalSpace":"221.04 GB","usedSpace":"128.6 GB","freeSpaceBytes":99257556992,"totalSpaceBytes":237338017792,"usedSpaceBytes":138080460800,"utilization":"58.0%"}],"garbageCollection":[{"name":"PS Scavenge","collectionCount":77,"collectionTime":"00:00:01.033","collectionMillis":1033},{"name":"PS MarkSweep","collectionCount":4,"collectionTime":"00:00:00.438","collectionMillis":438}],"statsLastRefreshed":"06:33:21 GMT","versionInfo":{"niFiVersion":"1.13.0","javaVendor":"Oracle Corporation","javaVersion":"1.8.0_191","osName":"Linux","osVersion":"5.10.0-7-amd64","osArchitecture":"amd64","buildTag":"nifi-1.13.0-RC4","buildRevision":"3bc6a12","buildBranch":"UNKNOWN","buildTimestamp":"02/10/2021 19:15:44 GMT"}}}}```

DRACO_VERSION=1.3.6
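As a side note, the table names in the log above hint at how entity ids are encoded into SQL identifiers: urn:ngsi-ld:Device:tractor003 appears as urn_ngsi_ld_device_tractor003 (PostgreSQL additionally folds unquoted identifiers to lower case, which is why the ERROR lines show the all-lowercase form). A hypothetical sketch of such an encoding, not Draco's actual code:

```python
import re

# Hypothetical sketch of the identifier encoding suggested by the log above:
# non-alphanumeric characters become underscores, and the result is folded
# to lower case as PostgreSQL does with unquoted identifiers.
def encode_identifier(name: str) -> str:
    return re.sub(r"[^0-9a-zA-Z]", "_", name).lower()

assert encode_identifier("urn:ngsi-ld:Device:tractor003") == "urn_ngsi_ld_device_tractor003"
```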

[SHOULD] Every GE should have tutorial information

At the moment Draco has no information in the Academy and does not meet the minimum Training-section requirements for a full GE. This is a placeholder to ensure a simple Draco hello-world gets added to the Step-by-Step tutorials at some point.

  • Presence of a tutorial is a SHOULD requirement; it will be upgraded in the future.

An incubated GE should be working towards fulfilling these requirements.

FIWARE Draco - Subscription works only for the first payload. Only the first payload is saved in the historical MongoDB

I am trying to save historical context data in MongoDB, but without success. Only the first payload sent to Draco is saved to MongoDB; Mongo does not react to attribute updates. Versions used for the test: Orion-LD 0.8.0 and Orion 3.4, Mongo 4.4, Draco 1.3.6. I also tested with Mongo 3.4 and the behavior is the same.
When I tested the processors for the MySQL database there were no problems.

Could you please help me fix this problem?

Below are the steps I performed:

Create a Draco subscription:

curl --location --request POST 'http://localhost:1026/v2/subscriptions' \
--header 'Fiware-Service: test' \
--header 'Fiware-ServicePath: /openiot' \
--header 'Content-Type: application/json' \
--data-raw '{
  "description": "Notify Draco of all context changes",
  "subject": {
    "entities": [
      {
        "idPattern": ".*"
      }
    ]
  },
  "notification": {
    "http": {
      "url": "http://10.0.0.5:5050/v2/notify"
    }
  },
  "throttling": 0
}'

Create an entity:

curl --location --request POST 'http://localhost:1026/v2/entities' \
--header 'Fiware-Service: test' \
--header 'Fiware-ServicePath: /openiot' \
--header 'Content-Type: application/json' \
--data-raw ' {
      "id":"urn:ngsi-ld:Product:0102", "type":"Product",
      "name":{"type":"Text", "value":"Lemonade"},
      "size":{"type":"Text", "value": "S"},
      "price":{"type":"Integer", "value": 99}
}'

Overwrite the value of an attribute value:

curl --location --request PUT 'http://localhost:1026/v2/entities/urn:ngsi-ld:Product:0102/attrs' \
--header 'Fiware-Service: test' \
--header 'Fiware-ServicePath: /openiot' \
--header 'Content-Type: application/json' \
--data-raw '{
    "price":{"type":"Integer", "value": 110}
}'

LISTEN_HTTP processor: [screenshot 1]

NGSITOMONGO processor: [screenshot 2]

Template: [screenshot 3]

MongoDB: [screenshot mongo_db]

Support for Dynamic Collection Prefix Names in NGSIToMongo

I would like to request support for dynamic collection prefix names in NGSIToMongo. Currently, NGSIToMongo requires a static collection prefix name to be defined in the configuration and does not support expression language. However, I need the ability to generate collection names dynamically based on attribute values from the incoming context data.

Is it possible for you to include support for dynamic collection prefix names using expression language in NGSIToMongo in the next few weeks? If not, could you please provide instructions on how I can add this functionality to NGSIToMongo, or how to set up a custom configuration? Any help is appreciated.

Thank you in advance.
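For reference, the requested behavior amounts to resolving a collection-name pattern against FlowFile attributes, similar in spirit to NiFi Expression Language. A hypothetical sketch (pattern syntax and attribute names here are illustrative, not an existing Draco feature):

```python
import re

# Hypothetical sketch of the requested feature: resolving a pattern such as
# "${fiware-service}_entities" against FlowFile attributes, similar in spirit
# to NiFi Expression Language. Not an existing Draco capability.
def resolve(pattern: str, attributes: dict) -> str:
    return re.sub(r"\$\{([^}]+)\}",
                  lambda m: attributes.get(m.group(1), ""), pattern)

prefix = resolve("${fiware-service}_entities", {"fiware-service": "openiot"})
assert prefix == "openiot_entities"
```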

Column-Like Storing

Greetings,

The Draco documentation over at readthedocs states that column-like storing is available for the NGSIToMysql processor. However, when running Draco via Docker in the latest version, the "Attribute Persistence" option "Column" is not available.
From the code you can see that the option "column" is not added to the allowable values.

Is this intentional? If this feature is not yet implemented, shouldn't that be stated in the documentation?

Kind regards

Provide ARM64 docker Images for Draco

Related to an open Apache NIFI ticket: https://issues.apache.org/jira/browse/NIFI-9177

Once the upstream issue has been resolved, Draco should also be configured to create a working arm64 image. I am currently getting the following error on startup on Apple Silicon (M1) when running the Docker image (tested with 2.1.0 and earlier versions):

2021-08-29 04:36:48,799 WARN [main] org.apache.nifi.web.server.JettyServer Failed to start web server... shutting down.
java.io.IOException: Function not implemented
        at sun.nio.fs.LinuxWatchService.<init>(LinuxWatchService.java:64)
        at sun.nio.fs.LinuxFileSystem.newWatchService(LinuxFileSystem.java:47)
        at org.apache.nifi.nar.NarAutoLoader.start(NarAutoLoader.java:68)
        at org.apache.nifi.web.server.JettyServer.start(JettyServer.java:1229)
        at org.apache.nifi.NiFi.<init>(NiFi.java:159)
        at org.apache.nifi.NiFi.<init>(NiFi.java:71)
        at org.apache.nifi.NiFi.main(NiFi.java:303)
2021-08-29 04:36:48,830 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server...

Create my own template

Hi Team,

I need to create my own template in Draco NiFi. For that I have made the changes below:

  1. Created a template file as nifi-ngsi-resources/docker/templates/text.xml
  2. Added a processor as nifi-ngsi-bundle/nifi-ngsi-processors/src/main/java/org/apache/nifi/processors/ngsi/test.java
  3. Added an entry for test in nifi-ngsi-bundle/nifi-ngsi-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor

Now, when I create a Docker image using the command sudo docker build -f ./Dockerfile -t draco . and run this image, a new template for test shows up in the /nifi GUI, but when I add it, an error message is shown in the GUI: org.apache.nifi.processors.ngsi.test is not known to this NiFi instance.

Please guide me if I am doing anything wrong or have missed anything needed to create and use my own template.

Draco NGSIToCassandra processor giving error

I am trying to persist data coming from the Orion Context Broker to Cassandra using the ging/fiware-draco Docker Hub image.

The moment I drag and drop the NGSIToCassandra processor, it throws an error as shown in the image. I tried older versions of the Docker image with no help. Older versions like 1.3.1 or 1.3.4 do not throw this error, but they have their own bugs. None works as stably as NGSIToPostgreSQL.

draco_error

My question is: is NGSIToCassandra used in production at all? Should I try the Cygnus -> Kafka -> Cassandra approach instead?
Any help is appreciated.

MongoDB only holds last update of the data

Hi,

I just started using FIWARE and I'm having trouble with the data I want to persist in MongoDB using Draco. I am using a Python script that creates my context data with a specific id and then updates certain attributes of this data. While I can't see any problem when I observe the JSON data from the script, the HTTP requests, and the Apache NiFi flow, MongoDB keeps only the last updated version of this data in the related collection. Because of this, I cannot perform time-series functions. I can see the "creDate" and "modDate" metadata in the attributes, but I need a different record for every update.

I've gone through all the docs but either couldn't find the solution or am missing something in the logic. How should I go about this issue?

Edit: Here is the related stackoverflow post with some details: https://stackoverflow.com/questions/71218728/how-to-create-different-record-for-every-update-in-fiware-draco

BR,
Y.
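The behavior described here is the difference between upserting on entity id (last write wins) and appending one record per notification (full history). A minimal sketch of the contrast, illustrative only:

```python
# Sketch of the two persistence behaviors: an upsert keyed on entity id keeps
# only the latest state, while appending one record per notification keeps the
# full history needed for time-series queries. Illustrative only.
store, history = {}, []

for temperature in (21, 22, 23):
    doc = {"entityId": "urn:ngsi-ld:Product:0102", "temperature": temperature}
    store[doc["entityId"]] = doc   # upsert: last write wins
    history.append(doc)            # append: one record per update

assert len(store) == 1 and store["urn:ngsi-ld:Product:0102"]["temperature"] == 23
assert len(history) == 3           # every update retained
```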

Draco data in MySQL garbled issue

{
  "type": "Room",
  "id": "DC_S1-D41",
  "temperature": {
    "value": "打麻将1",
    "type": "Number",
    "metadata": {}
  }
}

This is the entity in NGSI; I am listening to the value attribute.
When the table data enters MySQL, there is an issue of garbled data, like this:
image

What can I do to solve this issue?
Thank you; please tell me your Skype number and we can communicate about it.
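A common cause of garbled text like this is UTF-8 data passing through a connection or table that uses a single-byte charset; making sure MySQL's connection and table charset are UTF-8 (e.g. utf8mb4) usually fixes it. A quick demonstration of the mechanism, using latin-1 as the illustrative wrong charset:

```python
# Mechanism behind the garbled text: UTF-8 bytes interpreted through a
# single-byte charset (latin-1 here for illustration). Configuring MySQL's
# connection and table charset as UTF-8 (e.g. utf8mb4) avoids this.
original = "打麻将1"                                  # the value from the entity
mojibake = original.encode("utf-8").decode("latin-1")
assert mojibake != original                           # looks garbled
assert mojibake.encode("latin-1").decode("utf-8") == original  # recoverable
```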

Missing backends in comparison with Cygnus

I was comparing Fiware-Draco backends with Fiware-Cygnus backends and found that the following backends are missing:

  1. Arcgis
  2. Elasticsearch
  3. Kafka
  4. Orion

I would like to contribute to developing these backends.

Please provide your opinion so that I can start working on them.

Login via KeyRock returns error

Hi,

I am trying to log in via KeyRock in Draco. After configuring the app in KeyRock and adding the corresponding properties in Draco's config file, upon entering the Draco interface I am redirected to KeyRock and logged in. Then it redirects to Draco again and I get this error:

image

2022-06-23 06:42:32,651 ERROR [NiFi Web Server-79] o.apache.nifi.web.api.OIDCAccessResource Unable to exchange authorization for ID token: Connection refused (Connection refused)

Attached other files:

nifi.properties

nifi.security.user.oidc.discovery.url=http://localhost:3005/idm/applications/tutorial-dckr-site-0000-xpresswebapp/.well-known/openid-configuration
nifi.security.user.oidc.connect.timeout=5 secs
nifi.security.user.oidc.read.timeout=5 secs
nifi.security.user.oidc.client.id=tutorial-dckr-site-0000-xpresswebapp
nifi.security.user.oidc.client.secret=tutorial-dckr-site-0000-clientsecret
nifi.security.user.oidc.preferred.jwsalgorithm=
nifi.security.user.oidc.additional.scopes=
nifi.security.user.oidc.claim.identifying.user=
nifi.security.user.oidc.fallback.claims.identifying.user=
nifi.security.user.oidc.truststore.strategy=JDK

KeyRock application

image

KeyRock logs

image

What could I be doing wrong?

Thanks!

[SHOULD] Work towards fulfilling the requirements for a full GE

The following MUST requirements for full GE status are not currently satisfied:

Missing Badges:

  • README Badge - CI Build/Unit Test (Travis)

Missing items within the README

  • README - QA Section
  • README - Training Section
  • README - ToC
  • README - How to Deploy text present
  • README - How to Run Tests text present
  • README - Main API Walkthrough text present
  • README - Access to the advanced API and Documentation text present

Missing Release Item

  • Releases uses SemVer
    (note the existing FIWARE_7.5 tag satisfies alignment for FIWARE releases)

Missing Docker Items

  • DOCKER - SemVer Image Available
  • DOCKER - FIWARE Image Available

No CREDITS file

An incubated enabler should be working towards fulfilling these requirements.

SHOULD requirement from the TSC

NGSIToMongo 2.0.0 failed

I encountered the following error in NGSIToMongo 2.0.0 (Draco 2.0.0) when using the MONGO-TUTORIAL template.

draco-error

2022-01-22 23:32:16,079 ERROR [Timer-Driven Process Thread-9] o.a.nifi.processors.ngsi.NGSIToMongo NGSIToMongo[id=45007d62-fe1d-3e62-e8da-1a3ed5694475] Failed to process session due to java.lang.IllegalAccessError: tried to access class org.bson.BSON from class com.mongodb.util.JSONCallback; Processor Administratively Yielded for 1 sec: java.lang.IllegalAccessError: tried to access class org.bson.BSON from class com.mongodb.util.JSONCallback
java.lang.IllegalAccessError: tried to access class org.bson.BSON from class com.mongodb.util.JSONCallback
        at com.mongodb.util.JSONCallback.objectDone(JSONCallback.java:145)
        at org.bson.BasicBSONCallback.arrayDone(BasicBSONCallback.java:139)
        at com.mongodb.util.JSONParser.parseArray(JSON.java:629)
        at com.mongodb.util.JSONParser.parse(JSON.java:225)
        at com.mongodb.util.JSONParser.parse(JSON.java:157)
        at com.mongodb.util.JSON.parse(JSON.java:99)
        at com.mongodb.util.JSON.parse(JSON.java:80)
        at org.apache.nifi.processors.ngsi.ngsi.aggregators.MongoAggregator$RowAggregator.createDocWithMetadata(MongoAggregator.java:101)
        at org.apache.nifi.processors.ngsi.ngsi.aggregators.MongoAggregator$RowAggregator.aggregate(MongoAggregator.java:78)
        at org.apache.nifi.processors.ngsi.AbstractMongoProcessor.persistFlowFile(AbstractMongoProcessor.java:333)
        at org.apache.nifi.processors.ngsi.NGSIToMongo.onTrigger(NGSIToMongo.java:78)
        at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
        at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1272)
        at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:214)
        at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:103)
        at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

Reproduce

docker-compose.yml

version: "3.0"
services:
  orion:
    image: fiware/orion:3.4.0
    depends_on:
      - mongo-db
    ports:
      - 1026:1026
    command: -dbhost mongo-db -noCache

  draco:
    image: ging/fiware-draco:2.0.0
    environment:
      - NIFI_WEB_HTTP_PORT=9090
    ports:
      - 9090:9090

  mongo-db:
    image: mongo:4.4

Create a subscription

#!/bin/sh
set -eu
CB=192.168.0.1
curl -iX POST "http://${CB}:1026/v2/subscriptions" \
  -H 'Content-Type: application/json' \
  -H 'fiware-service: openiot' \
  -H 'fiware-servicepath: /' \
-d '{
  "subject": {
    "entities": [
      {
        "idPattern": ".*"
      }
    ]
  },
  "notification": {
    "http": {
      "url": "http://draco:5050/v2/notify"
    }
  }
}'

Update an entity

#!/bin/bash
set -eu
CB=192.168.0.1
for i in {0..9}
do
  curl -iX POST "http://${CB}:1026/v2/entities?options=keyValues,upsert" \
  -H 'Content-Type: application/json' \
  -H 'fiware-service: openiot' \
  -H 'fiware-servicepath: /' \
  -d "{\"id\": \"I\", \"temperature\":${RANDOM}}"
  sleep 1
done

Refused Access UI

When using Draco via Docker, access to the user interface was not allowed.
To fix this I had to add the following environment variable:

   - NIFI_WEB_HTTP_HOST=0.0.0.0

Regards
