frost-server's People

Contributors

abourdon, apm1467, clemenschuaccso, dependabot-preview[bot], dependabot[bot], ehj-52n, fabianwilms, henotu, hylkevds, image357, jinnerbichler, m4ci3k2, michaeltrip, mjacoby, mlechner, nlswntr, pbaumard, phertweck, rhzg, selimnairb, sirbubbls, solingen-digital, spezzi, t-hellmund, tobilarscheid, volcan01010

frost-server's Issues

Add change log for future releases

Version 1.1 was released last week. However, I don't see a change log documenting what has changed. One way to do this might be to handle releases using feature and develop branches: the change log can be updated on each feature branch, and feature branches are merged into develop when they are complete. Then, when develop is merged into master for a release, the change log will automatically contain all changes in the release (though it may need a final pass to group all changes under the version number of that release). Happy to help in any way I can.

MQTT - 2 Messages per 1 publish on subscriber side

If one publishes a single observation over MQTT to "v1.0/Datastreams(id)/Observations", I receive two messages (JSON objects) on the subscriber side.

publish on "v1.0/Datastreams(id)/Observations":

{
  "result": 23.57
}

Subscription on "v1.0/Datastreams(id)/Observations" yields:

  1. {
       "result": 23.57
     }

  2. {
       "phenomenonTime" : "2017-04-24T11:59:26.079Z",
       "resultTime" : null,
       "result" : 23.57,
       "[email protected]" : "http://xx.xxx.xxx.xxx/SensorThingsService/v1.0/Observations(25460)/Datastream",
       "[email protected]" : "http://xx.xxx.xxx.xxx/SensorThingsService/v1.0/Observations(25460)/FeatureOfInterest",
       "@iot.id" : 25460,
       "@iot.selfLink" : "http://xx.xxx.xxx.xxx/SensorThingsService/v1.0/Observations(25460)"
     }

So it seems that the message broker itself pushes the message to the subscriber and, at the same time, the SensorThingsServer sends a SensorThings API-compliant JSON object as well.

Following figure 4 on page 83 of the OGC specification, I would expect only one MQTT message. Is there a reason for this in your application?
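For reference, a minimal subscriber sketch that makes the duplication visible (an illustration only: it assumes the Python paho-mqtt client, the broker's default port 1883, and placeholder host and Datastream id values taken from this issue):

import paho.mqtt.client as mqtt

TOPIC = "v1.0/Datastreams(1)/Observations"  # placeholder Datastream id

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC, qos=0)

def on_message(client, userdata, msg):
    # With the behaviour described above, this callback fires twice per published observation.
    print(msg.topic, msg.payload.decode("utf-8"))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("xx.xxx.xxx.xxx", 1883)  # placeholder host from this issue
client.loop_forever()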

Best regards

Java version of the SensorThings API entity model

Hi,

I’ve just created a library that exposes a Java version of the SensorThings API entity model: https://github.com/StoreConnect/sensorthings-api-model

As you may expect, the idea is to provide a reusable library of the SensorThings API entity model for Java users.

It was mainly developed for SensorThings API clients, but I wonder whether you would be interested in including it in the FROST-Server. This way, the library could be the bridge between a SensorThings API server and a SensorThings API client.

Note this model tries to be as tiny, simple and reusable as possible. For now, it covers:

  • The default Sensing entities
  • The SensorThings MultiDatastream extension

In a future release, it will also cover the SensorThings Data Array extension.

Any remarks would be greatly appreciated :-) If you are interested, we could iterate together to fit your needs.

Regards,
Aurélien

Running on Wildfly

I would like to run the SensorThingsServer on Wildfly 10.
I saw that the configuration is partially in context.xml, which makes running the application Tomcat-specific.

New stable release

Hi,

Could it be possible to generate a new stable release that integrates the latest changes (#37, #38)?

This way we could use a stable Docker image (i.e., linked to this stable version). Indeed, there is only a snapshot version in the Docker repository.

Thanks,
Aurélien

MultiDatastream Issue and Query

Issue
While working on MultiDatastream, we noticed that we were not able to get the observations for the MultiDatastream.

http://apiserver:8080/SensorThingsService/v1.0/Observations(7)/MultiDatastream - working
http://apiserver:8080/SensorThingsService/v1.0/MultiDatastreams(3)/Observations - not working

Analysis:
The PathSqlBuilder class builds the "SQLQuery" for "MultiDatastream/Observations" using the "queryMultiDatastreams" method.

Here the SQLQuery uses "Datastreams" instead of "MultiDatastreams".

Fix:
To get the correct MultiDatastreams/Observations, build the SQLQuery innerJoin using "MultiDatastreams".

Query:
The MultiDatastream observation looks as follows:
["2017-04-12","21",["20","22"]]

1> Is there a way to get the 2nd value in the result array?
Observations(2815)/result[1]/$value

2> Is it possible to use the value with a comparison operator?
Observations(2815)?$filter = result[1] eq 21
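Until server-side indexing into the result array is clarified, a client-side workaround is possible (a sketch only, assuming Python with the requests library and the service URL used above):

import requests

BASE = "http://apiserver:8080/SensorThingsService/v1.0"

# Fetch the observation and index into its result array locally.
obs = requests.get(BASE + "/Observations(2815)").json()
second_value = obs["result"][1]  # second element of the MultiDatastream result
print(second_value)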

Elastic Search for SensorThingsServer?

Hi,

Since the amount of sensor data is growing fast, the question arises whether there are plans to implement other data storages (e.g. Elasticsearch) in the server, or whether you will stick with PostGIS?

best regards

Publish Docker image to the public Docker hub

Hi,

I would like to use the SensorThingsServer through its Docker image without being forced to build it from source, but by pulling it from an official SensorThings Docker repository (for instance, on the public Docker Hub).

Indeed, currently we have to:

  1. Clone project's sources
  2. Build project's sources
  3. Use the existing Dockerfile that copies the generated war file from 2.

Actually this workflow works fine for SensorThingsServer's developers but not for users. Indeed, users would like to use the image without being forced to build it.

Aurélien

Filter Observation by Time Interval

We are looking for support to filter Observations by a time interval (the phenomenonTimeStart and phenomenonTimeEnd times of an observation).

We need the records whose phenomenonTime lies within the given filter interval, but we do not get all of them.

Below is my url:
http://localhost:8080/SensorThingsService/v1.0/Datastreams(32)/Observations?$filter=phenomenontime ge 2017-08-14T02:45:00.000Z and phenomenontime lt 2017-09-30T01:10:00.000Z&$select=result,phenomenonTime

Result :

{
"@iot.count": 2,
"value": [
{
"phenomenonTime": "2017-08-15T03:00:00.000Z/2017-08-15T04:00:00.000Z",
"result": 62
},
{
"phenomenonTime": "2017-08-15T01:42:00.000Z/2017-08-15T02:55:00.000Z",
"result": 67
}
]
}

Original Data in Table: (screenshot omitted)

Below are the Observation records available for Datastream Id 32:
{
"phenomenonTime": "2017-09-30T01:00:00.000Z/2017-09-30T02:00:00.000Z",
"result": 58
},
{
"phenomenonTime": "2017-08-15T03:00:00.000Z/2017-08-15T04:00:00.000Z",
"result": 62
},
{
"phenomenonTime": "2017-08-15T01:42:00.000Z/2017-08-15T02:55:00.000Z",
"result": 67
},
{
"phenomenonTime": "2017-08-14T01:42:00.000Z/2017-08-14T02:55:00.000Z",
"result": 71
}
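Since these phenomenonTime values are intervals, a plain comparison against the property may not behave as hoped. Some SensorThings implementations provide temporal functions such as overlaps() for interval-valued times; whether the server version used here supports them is an assumption. A request built with the Python requests library (which also percent-encodes the query) would look roughly like this:

import requests

BASE = "http://localhost:8080/SensorThingsService/v1.0"
params = {
    # overlaps() is assumed to be available; adjust to the filter syntax your server supports
    "$filter": "overlaps(phenomenonTime, 2017-08-14T02:45:00.000Z/2017-09-30T01:10:00.000Z)",
    "$select": "result,phenomenonTime",
}
r = requests.get(BASE + "/Datastreams(32)/Observations", params=params)
print(r.json())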

Keycloak branch: UNKNOWN_CLIENT, client_id is null?

Hello,

First of all, thank you for providing a branch that supports keycloak out of the box. I'm pretty new to keycloak so please excuse my ignorance if this is a silly problem.

I've been trying to configure the application with my Keycloak server; however, I'm having trouble accessing any routes after successfully logging in, getting HTTP error 403 every time along with the message: Access to the specified resource has been forbidden.

I'm running the SensorThings Server in Tomcat 8.0 (with Keycloak 3.1.0 adapters), so I checked the logs to try and understand what went wrong. It seems to be generating the following message:
ERROR: {"error":"unauthorized_client","error_description":"UNKNOWN_CLIENT: Client was not identified by any client authenticator"}

I've created a realm named SensorThings and a client in this realm named SensorThingsService and I thought maybe I set the client id wrong in my keycloak.json, which looks like this:

{
  "realm" : "SensorThings",
  "resource" : "SensorThingsService",
  "auth-server-url" : "http://<My Keycloak Service>:8081/auth",
  "ssl-required" : "none",
  "use-resource-role-mappings" : false,
  "enable-cors" : true,
  "cors-max-age" : 1000,
  "cors-allowed-methods" : "POST, PUT, DELETE, GET",
  "bearer-only" : false,
  "enable-basic-auth" : false,
  "expose-token" : true,
  "connection-pool-size" : 20,
  "disable-trust-manager": false,
  "allow-any-hostname" : false,
  "token-minimum-time-to-live" : 10,
  "min-time-between-jwks-requests" : 10,
  "public-key-cache-ttl": 86400
}

Does everything look ok in my keycloak.json?
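(One detail that may be missing, assuming the SensorThingsService client is configured as a confidential client in the realm: the adapter then also has to present client credentials, which would mean adding a credentials block to keycloak.json, with the secret copied from the client's Credentials tab in the Keycloak admin console. A sketch, not a confirmed fix:)

"credentials" : {
  "secret" : "<client secret from the Keycloak admin console>"
}

(Alternatively, if the client is meant to be public, setting "public-client" : true on the client should avoid the missing-credentials path altogether.)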

I then checked the Keycloak server's log, and for some reason the client id seems to be arriving as null every time I make a request:

11:26:56,953 WARN  [org.keycloak.events] (default task-9) type=CODE_TO_TOKEN_ERROR, realmId=SensorThings, clientId=null, userId=null, ipAddress=<My IP address>, error=invalid_client_credentials, grant_type=authorization_code

I figure this is why Tomcat is telling me "Client was not identified". But what could be causing this? Again, please excuse me if this is a silly issue :)

My best regards

Unable to POST Things to the SensorThingsServer

Hi,
I followed the installation instructions and successfully built the war file and installed it in Apache Tomcat.
I can now access the API at http://localhost:8080/SensorThingsServer-1.0/ but I'm not able to insert data.

I'm using Postman to perform a POST to http://localhost:8080/SensorThingsServer-1.0/v1.0/Things with the following text body as application/json:

{
  "description": "Weather Station",
  "properties": {
    "name": "test a",
    "type": "abc"
  },
  "Locations": [
    {
      "encodingType": "application/vnd.geo+json",
      "description": "Weather Station",
      "location": {"coordinates":[-18.1002,65.6856],"type":"Point"}
    }
  ]
}

The Server responds with 403 Forbidden. What am I doing wrong?

Many thanks in advance,
Tobias

Location encodingType - GeoJSON "application/geo+json" media type support

With encoding type GeoJSON and the "application/geo+json" media type, the geometry value is not inserted into the GEOM column of the "LOCATIONS" table. The media type "application/vnd.geo+json" works fine, however.

I checked the RFC spec for GeoJSON (https://tools.ietf.org/html/rfc7946); it seems the media type "application/geo+json" is the newer one, while the vnd one is older and obsolete.

Example :
I have inserted "Location" entities using different GeoJSON media types.

Dolce&Gabbana - "application/geo+json"
Sephora - "application/vnd.geo+json"
Moncler - "application/vnd.geo+json"

The geometry value was not inserted for Dolce&Gabbana.


Why is the response empty in an AJAX POST?

I have a problem: when I POST a Things entity or other entities via jQuery AJAX, the result in the success function is empty.

$.ajax({
    url: "http://localhost:8099/SensorThingsService/v1.0/Things",
    type: "POST",
    data: json,
    contentType: "application/json; charset=utf-8",
    success: function(result) {
        alert(JSON.stringify(result));
    },
    error: function(response, status) {
        console.log(status);
    }
});

line 230 " response.setResult((T) entity); " in service.java class fill exactly true but in client response not showing anything. what is problem?

Reduction of provided classes - V1.1 Request

As many STA installations only provide simple measurements and/or do not utilize HistoricalLocations, it could make usage easier not to present these to the user. This could be a configuration option, with servers only presenting those classes that actually provide data.

geo distance between two locations using longitude and latitude in meters

Hi,

We are trying to get locations within 50 meters from a given 'Point'. I have two points from Google Maps and stored these two points in the STA LOCATION table. The distance between the two points is 101.23 meters.

Point 1 : Dolce&Gabbana - 103.831709,1.304389 (longitude, latitude)


Point 2 : Sephora - 103.832342,1.30373 (longitude, latitude)


STA LOCATION Table data: (screenshot omitted)

SRID projection: (screenshot omitted)

Issue:
I expect the API call below to return the location "Sephora", but it returns empty JSON.

Request :
http://localhost:8080/SensorThingsService/v1.0/Locations?$filter=geo.distance(location,geography'POINT(103.831709 1.304389)') gt 50

Response :
{
"@iot.count": 0,
"value": []
}

What is the unit of the geo.distance filter result?
Does the SensorThings API have any function to get the geo distance in meters?
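As a sanity check that is independent of the server, the great-circle distance in metres between the two points can be computed client-side (a sketch in Python using the haversine formula; it reproduces roughly the 101 m mentioned above):

from math import asin, cos, radians, sin, sqrt

def haversine_m(lon1, lat1, lon2, lat2):
    # Great-circle distance in metres on a sphere of radius 6371000 m.
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

print(haversine_m(103.831709, 1.304389, 103.832342, 1.30373))  # ~101 m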

dataArray to MultiDatastream

Hello,

I'm trying to insert Observations using dataArrays on a MultiDatastream that has 3 ObservedProperties. However, I'm not too sure how I should format my dataArray.

I was reading the standard format for Datastreams here: http://docs.opengeospatial.org/is/15-078r6/15-078r6.html#83 So I'm trying to follow the convention by building an object like this for MultiDatastreams:

[
  {
    "MultiDatastream": {
      "@iot.id": 2
    },
    "components": [
      "phenomenonTime",
      "result"
    ],
    "dataArray": [
      [
        "2018-04-04T10:20:00-0700",
        [6, 10.0, 64]
      ]
    ],
    "[email protected]": 1
  }
]

However, this gives me the error "Missing Datastream or MultiDatastream.".
The MultiDatastream is defined as follows:

{
    "name": "Wind",
    "description": "Wind is the perceptible natural movement of the air",
    "observationType": "http://www.opengis.net/def/observationType/OGC-OM/2.0/OM_ComplexObservation",
    "multiObservationDataTypes": [
        "http://www.opengis.net/def/observationType/OGC-OM/2.0/OM_Measurement",
        "http://www.opengis.net/def/observationType/OGC-OM/2.0/OM_Measurement",
        "http://www.opengis.net/def/observationType/OGC-OM/2.0/OM_Measurement"
    ],
    "unitOfMeasurements": [
        {
            "name": "deg",
            "symbol": "deg",
            "definition": "urn:ogc:def:uom:OGC:1.0:degree"
        },
        {
            "name": "m/s",
            "symbol": "m_s",
            "definition": "urn:ogc:def:uom:OGC::m_s"
        },
        {
            "name": "m/s",
            "symbol": "m_s",
            "definition": "urn:ogc:def:uom:OGC::m_s"
        }
    ]
}

It works fine for Datastreams, so I'm wondering if I'm formatting it correctly?

Thanks in advance.

Empty Response when POSTing new Entities

Hi,

when I POST a new entity (Thing/Location) it is created successfully. But the body of the server response is empty. Normally I would expect the response to contain the new entity, so I can get the @iot.id for further interaction with the new entity.

Example POST body:

{
  "name": "Temperature Monitoring System",
  "description": "Sensor system monitoring area temperature",
  "properties": {
    "Deployment Condition": "Deployed in a third floor balcony",
    "Case Used": "Radiation shield"
  }
}

Expected Response Body

{
  "@iot.id": 1130913,
  "@iot.selfLink": "http://scratchpad.sensorup.com/OGCSensorThings/v1.0/Things(1130913)",
  "description": "Sensor system monitoring area temperature",
  "name": "Temperature Monitoring System",
  "properties": {
    "Case Used": "Radiation shield",
    "Deployment Condition": "Deployed in a third floor balcony"
  },
  "[email protected]": "http://scratchpad.sensorup.com/OGCSensorThings/v1.0/Things(1130913)/Datastreams",
  "[email protected]": "http://scratchpad.sensorup.com/OGCSensorThings/v1.0/Things(1130913)/HistoricalLocations",
  "[email protected]": "http://scratchpad.sensorup.com/OGCSensorThings/v1.0/Things(1130913)/Locations"
}

Actual Response Body: (empty)

Is there a configuration option I missed? What am I doing wrong?
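(A possible interim workaround, assuming the server follows the specification and puts the selfLink of the created entity into the Location header of the 201 Created response: read the new id from that header instead of the body. A sketch with the Python requests library; the URL is a placeholder:)

import requests

thing = {
    "name": "Temperature Monitoring System",
    "description": "Sensor system monitoring area temperature",
}
r = requests.post("http://localhost:8080/SensorThingsService/v1.0/Things", json=thing)
print(r.status_code)              # expected: 201
print(r.headers.get("Location"))  # e.g. .../v1.0/Things(1130913), even when the body is empty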

Many thanks in advance,
Tobias

GIS Projection/SRID not specified for Locations and other tables

What is the unit of measurement in geo.distance filter result?

Ideally, the public.geometry_columns table should have an srid (a foreign key to the public.spatial_ref_sys table) specified for the SensorThings API tables "Locations", "DataStreams", "Multi_DataStreams" and "Features". However, the value is currently '0'. In the absence of an appropriate projection, the result is ambiguous.

Docker rebuild?

Hello,

I am just trying to play a bit with the Docker rebuild of the repo. I ran the command in the root folder following the instructions in the README:

mvn dockerfile:build -pl FROST-Server

[INFO] Scanning for projects...
[ERROR] [ERROR] Could not find the selected project in the reactor: FROST-Server @
[ERROR] Could not find the selected project in the reactor: FROST-Server -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MavenExecutionException

If I try the following command, the build does start, but it fails on the ADD instruction:

mvn dockerfile:build -pl FROST-Server.HTTP
.....
[INFO] ---> Using cache
[INFO] ---> 374ffbd77c57
[INFO] Step 6/6 : ADD target/${WAR_FILE} /usr/local/tomcat/webapps/FROST-Server.war
[INFO]
[ERROR] ADD failed: stat /var/lib/docker/tmp/docker-builder086415024/target/FROST-Server.HTTP-1.6-SNAPSHOT.war: no such file or directory

What am I missing here? Any help would be really appreciated. Thanks in advance
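(A guess, not a confirmed diagnosis: the ADD step copies target/FROST-Server.HTTP-<version>.war, so the war has to exist before the Docker build runs. Packaging the project first, e.g. like this, may be the missing step:)

mvn install
mvn dockerfile:build -pl FROST-Server.HTTP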

Multiple Locations

Hello,

I am trying to connect several Locations to one Thing through HTTP; however, the second Location keeps overwriting the first one, so the Thing ends up with only one Location.

Has anyone encountered the same problem or solved it?

Any help will be appreciated. Thanks a lot.

Set id manually

Dear SensorThings Team,

Is it possible to set the id of an object manually at its creation?
So, for example, can I create a Thing with "@iot.id": "Some unique identifier"?
Best Regards

Johannes Riesterer

Extended Properties - V1.1 Request

In order to provide fuller information, adding a properties element as in the Thing class would be most valuable. Through such an extension, we can even reach INSPIRE compliance :)
The properties extension would be required for the following classes:

  • Datastream
  • ObservedProperty

To my understanding, in Sensors the metadata element serves this purpose. What I'm not quite clear on is whether the feature element of the FeatureOfInterest or the location element of Location can also be utilized for such additional information.

Is it possible to get the average of observation results within a time interval?

I'm looking for a query option to aggregate the result values of multiple observations within a time interval. For example, I would like to get the average observation result value of all observations from a specific day.

I would think of a query looking something like this:
http://localhost:8080/SensorThingsServer-1.0/v1.0
/Datastreams(123)
/Observations
?$select=avg( result )
&$filter=during( phenomenonTime, 2017-10-25T15:00:00.000Z/2017-10-26T15:00:00.000Z )

Is it possible to do something similar to that?
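If server-side aggregation is not available, a client-side fallback is straightforward (a sketch assuming Python with the requests library; the during() filter is the one proposed above and its support is not confirmed here, and pagination is ignored for brevity):

import requests

BASE = "http://localhost:8080/SensorThingsServer-1.0/v1.0"
params = {
    "$filter": "during(phenomenonTime, 2017-10-25T15:00:00.000Z/2017-10-26T15:00:00.000Z)",
    "$select": "result",
    "$top": 1000,
}
observations = requests.get(BASE + "/Datastreams(123)/Observations", params=params).json()["value"]
results = [o["result"] for o in observations]
print(sum(results) / len(results) if results else None)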

Many thanks in advance,
Tobias

System time in Historical Locations

The entity 'HistoricalLocations' automatically stores the system time. According to requirement 8 of the standard, when a new 'Location' is created, a historical location shall be automatically created. This is great. However, it has some disadvantages:

  1. The time assigned to a historical location is the system time (or database time). This generates inconsistencies between the time a new location is registered in the API and the 'actual' time at which the change of location was reported to the API. This is especially important in networks with high latency.
  2. There are drawbacks in handling moving Things (sensors attached to objects that change location very fast: cars, airplanes, people, etc.). Again, the latency of the network may add errors between the location of a Thing and the time at which it reaches the API. More importantly, this means that the 'resultTime' in the Observation entity won't match the 'time' in HistoricalLocation. As a result, one can request the 'result' of an observation for time 'X', but one cannot be sure what the precise location of the Thing (producing the observation) was at that time, and vice-versa.

Any thoughts on how to overcome those disadvantages?
My take on this is that the entity model of the standard needs changes.

Docker image to support ARM-based devices

I am trying to install the FROST-Server on a Raspberry Pi running Raspbian and would like to know if there are any plans to also provide a Docker image for ARM.

Thanks

Do we have support to filter properties from Location GeoJSON with type 'Feature'?

Does STA have support for filtering properties of 'location' with type 'Feature'? Below is our sample Location data with type 'Feature':

Sample Location :
{
  "name": "TestLocation",
  "encodingType": "application/geo+json",
  "description": "TestLocation",
  "location": {
    "type": "Feature",
    "geometry": {
      "type": "Point",
      "coordinates": [103.83170798, 1.30438905]
    },
    "properties": {
      "priority": "3"
    }
  }
}

I am trying to filter Location by the property 'priority'.

Request :
http://localhost:8080/SensorThingsService/v1.0/Locations?$select=location&$filter=location/properties/priority eq '3'

Error Message :
Failed to parse because (Set loglevel to trace for stack): Entity property location does not have custom properties (properties).

Search alternative for QueryDSL

It seems QueryDSL is not actively maintained. The last release is over a year old, and the last commit is 11 months old.
While implementing a backend with UUID IDs, @selimnairb noticed that QueryDSL does not support UUIDs well yet. A replacement doesn't have to be nearly as complex as QueryDSL, since we don't need the JPA/JDO bits, and we can also do without the code-generation bits.

Therefore we should look for an alternative to QueryDSL. If anyone has suggestions, please share.

One option would be jOOQ: https://github.com/jOOQ/jOOQ

Connection via websockets with MQTT.js

Hi, I am trying to connect with the MQTT.js package via websockets.

My code:
var mqtt = require("mqtt"),
    options = {
        port: 9876,
        keepalive: 60,
        encoding: "utf8",
        protocol: "ws",
    },
    client = mqtt.connect("http://mySensorThingsIP", options);

With the protocol: "mqtt" and the port: 1883 I get a connection, but it does not work over websockets. What's my mistake?

I need the connection via websockets because I want to connect to the SensorThings-Server directly in the browser with the "browserMqtt" extension package for MQTT.js. This works only with a websocket connection and not with a direct connection via protocol "mqtt" and port 1883.

I tried the following possibilities:

var client = mqtt.connect("ws://mySensorThingsIP:9876/mqtt");
var client = mqtt.connect({host: "mySensorThingsIP", port : 9876, path : "/mqtt"});
var client = mqtt.connect({host: "mySensorThingsIP", port : 9876});

but I get this error message for all variants:
WebSocket connection to 'ws://mySensorThingsIP:9876/mqtt' failed: Error in connection establishment: net::ERR_CONNECTION_TIMED_OUT
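To check whether the websocket listener itself is reachable, independent of MQTT.js, another client can be used; a sketch assuming the Python paho-mqtt client, with the host, port 9876 and /mqtt path mirroring the attempts above (whether the server expects that path is not confirmed):

import paho.mqtt.client as mqtt

client = mqtt.Client(transport="websockets")
client.ws_set_options(path="/mqtt")       # the path is an assumption, mirroring the MQTT.js attempts
client.connect("mySensorThingsIP", 9876)  # placeholder host from this issue
client.subscribe("v1.0/Datastreams(1)/Observations")
client.loop_forever()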

Filter with traditional Chinese parameter doesn't work

I have some columns in traditional Chinese and I am trying to filter on them.
Original Data in Table: (screenshot omitted)

Below is my url:
http://localhost:8080/STA/v1.0/Datastreams?$filter=name eq '水位高度'
Result:
Invalid query: Query is not valid: Lexical error at line 1, column 18. Encountered: "\u6c34" (27700), after : "'"

It seems that traditional Chinese characters are encoded to Unicode escapes by Explorer.
How do I deal with it?
Any suggestion will be appreciated. Thanks a lot.

Wildcards not allowed

You mentioned that wildcards are not allowed when using MQTT. However, I thought that wildcards would be allowed following the ISO MQTT specification. Is this planned in future implementations?

Swagger.io API documentation

Do you have the API documented, e.g. with Swagger?
That would let me generate clients for the OGC SensorThings API and use your component very easily. I have been working on this, but my version is by no means complete.

Support building on Java 9

Hi,

I'm unable to build the 1.3 version (and the SNAPSHOT version as well). Indeed, there are test failures during the Maven installation:

(screenshot of the test failures omitted)

Steps to reproduce are the same as those explained in the README:

  1. Clone repository
  2. Run git submodule init
  3. Run git submodule update
  4. Run mvn [clean] install

Aurélien

Issue when getting Observations from the MQTT broker

Hi,
I posted a DataStream and several Observations via REST to the FROST-Server.

Then I tried to get these Observations via the MQTT broker.

The received JSON is as follows:
{ "phenomenonTime" : "2016-03-02T10:50:00.000Z", "resultTime" : "2016-03-02T10:50:00.000Z", "result" : "7.2", "[email protected]" : "http://localhost:8080/FROST-Server/v1.0/Observations('1f067586-604e-11e8-b7bf-c3302333d15c')/Datastream", "Datastream" : { "unitOfMeasurement" : { "name" : null, "symbol" : null, "definition" : null }, "@iot.id" : "t1edu.teco.wang/IBADENWR82/Datastreams/Air-Temperature-IBADENWR82" }, "[email protected]" : "http://localhost:8080/FROST-Server/v1.0/Observations('1f067586-604e-11e8-b7bf-c3302333d15c')/FeatureOfInterest", "FeatureOfInterest" : { "@iot.id" : "t1edu.teco.wang/IBADENWR82/FoI/IBADENWR82/48.441/9.257" }, "@iot.id" : "1f067586-604e-11e8-b7bf-c3302333d15c", "@iot.selfLink" : "http://localhost:8080/FROST-Server/v1.0/Observations('1f067586-604e-11e8-b7bf-c3302333d15c')"},

which does not seem correct:

  1. The unitOfMeasurement in the Datastream entity is not returned correctly. I checked this via REST; it is indeed saved in the FROST-Server, but it is not sent via MQTT.
  2. Maybe the FeatureOfInterest entity of an Observation should also be completely packed into the JSON, just like its 'Datastream'?
  3. Maybe the 'ids' and 'selfLinks' should not be returned? They make little sense in the message.

Thanks a lot in advance!

Cannot change PersistenceManager

In my context.xml I have

<Parameter override="false" name="persistence.persistenceManagerImplementationClass" value="de.fraunhofer.iosb.ilt.sta.persistence.postgres.stringid.PostgresPersistenceManagerString" description="The java class used for persistence (must implement PersistenceManaher interface)"/>

However, when running FROST-Server, e.g. in debug mode in Netbeans, it defaults to

de.fraunhofer.iosb.ilt.sta.persistence.postgres.longid.PostgresPersistenceManagerLong

This is probably due to the changes in Settings.java from this commit, where in line 44 we now have:

wrapper.setProperty("persistence.persistenceManagerImplementationClass", "de.fraunhofer.iosb.ilt.sta.persistence.postgres.longid.PostgresPersistenceManagerLong");

Is there a new way to set/override the PersistenceManager?

Receiving messages twice when more than one MQTT client is connected

Hi,

If I connect to the SensorThingsServer with more than one MQTT client (e.g. from two different computers), I receive a published message twice on each client. As soon as one client is disconnected and only one is left, I receive the expected single message.

I publish on "v1.0/Observations" using MQTT a JSON Object:

{
"result" : 21.6,
"Datastream":{"@iot.id":1}
}

and the clients are subscribed to "v1.0/Datastreams(1)/Observations"

Using QoS 0, I would expect the message to be sent only once (fire and forget). Or does the problem occur because the SensorThings API Server subscribes to topics with "topicFilter='#'" using QoS 1?

My log file looks like this:

2017-04-26 17:31:41,393 262 [localhost-startStop-1] INFO d.f.iosb.ilt.sta.ContextListener - Context initialised, loading settings.
2017-04-26 17:31:41,730 599 [localhost-startStop-1] INFO io.moquette.server.Server - Persistent store file: C:\instances\TC01\work\Catalina\localhost\SensorThingsService\moquette_store.mapdb
2017-04-26 17:31:41,750 619 [localhost-startStop-1] INFO i.m.s.p.MapDBPersistentStore - Starting with existing [C:\instances\TC01\work\Catalina\localhost\SensorThingsService\moquette_store.mapdb] db file
2017-04-26 17:31:42,010 879 [localhost-startStop-1] INFO i.m.s.i.ProtocolProcessorBootstrapper - Starting without ACL definition
2017-04-26 17:31:42,556 1425 [localhost-startStop-1] INFO i.m.server.netty.NettyAcceptor - Server binded host: xxx.xxx.xxx.xxx, port: 1883
2017-04-26 17:31:42,563 1432 [localhost-startStop-1] INFO i.m.server.netty.NettyAcceptor - Server binded host: xxx.xxx.xxx.xxx, port: 9876
2017-04-26 17:31:42,615 1484 [localhost-startStop-1] INFO d.f.i.i.s.m.m.MoquetteMqttServer - paho-client connecting to broker: tcp://xxx.xxx.xxx.xxx:1883
2017-04-26 17:31:42,726 1595 [nioEventLoopGroup-3-1] INFO messageLogger - C->B CONNECT client <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)>
2017-04-26 17:31:42,727 1596 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type CONNECT
2017-04-26 17:31:42,728 1597 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - CONNECT for client <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)>
2017-04-26 17:31:42,758 1627 [nioEventLoopGroup-3-1] INFO io.moquette.spi.ClientSession - cleaning old saved subscriptions for client <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)>
2017-04-26 17:31:42,767 1636 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - Connection established
2017-04-26 17:31:42,768 1637 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - CONNECT processed
2017-04-26 17:31:42,769 1638 [nioEventLoopGroup-3-1] INFO messageLogger - C->B SUBSCRIBE <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics [[qos=1, topicFilter='#']]
2017-04-26 17:31:42,769 1638 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type SUBSCRIBE
2017-04-26 17:31:42,770 1639 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - SUBSCRIBE client <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)>
2017-04-26 17:31:42,819 1688 [nioEventLoopGroup-3-1] INFO io.moquette.spi.ClientSession - <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> subscribed to the topic filter <#> with QoS 1 - LEAST_ONE
2017-04-26 17:31:42,832 1701 [localhost-startStop-1] INFO d.f.i.i.s.m.m.MoquetteMqttServer - paho-client connected to broker

Connect with first Client (receiving one message):

2017-04-26 17:32:30,750 49619 [nioEventLoopGroup-3-2] INFO messageLogger - C->B CONNECT client <MQTT_FX_Client>
2017-04-26 17:32:30,751 49620 [nioEventLoopGroup-3-2] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type CONNECT
2017-04-26 17:32:30,751 49620 [nioEventLoopGroup-3-2] INFO i.m.spi.impl.ProtocolProcessor - CONNECT for client <MQTT_FX_Client>
2017-04-26 17:32:30,752 49621 [nioEventLoopGroup-3-2] INFO io.moquette.spi.ClientSession - cleaning old saved subscriptions for client <MQTT_FX_Client>
2017-04-26 17:32:30,757 49626 [nioEventLoopGroup-3-2] INFO i.m.spi.impl.ProtocolProcessor - Connection established
2017-04-26 17:32:30,758 49627 [nioEventLoopGroup-3-2] INFO i.m.spi.impl.ProtocolProcessor - CONNECT processed
2017-04-26 17:32:32,071 50940 [nioEventLoopGroup-3-2] INFO messageLogger - C->B SUBSCRIBE <MQTT_FX_Client> to topics [[qos=0, topicFilter='v1.0/Datastreams(1)/Observations']]
2017-04-26 17:32:32,072 50941 [nioEventLoopGroup-3-2] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type SUBSCRIBE
2017-04-26 17:32:32,072 50941 [nioEventLoopGroup-3-2] INFO i.m.spi.impl.ProtocolProcessor - SUBSCRIBE client <MQTT_FX_Client>
2017-04-26 17:32:32,073 50942 [nioEventLoopGroup-3-2] INFO io.moquette.spi.ClientSession - <MQTT_FX_Client> subscribed to the topic filter <v1.0/Datastreams(1)/Observations> with QoS 0 - MOST_ONE
2017-04-26 17:32:38,803 57672 [nioEventLoopGroup-3-1] INFO messageLogger - C->B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:32:38,803 57672 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PUBLISH
2017-04-26 17:32:38,804 57673 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - PUB --PUBLISH--> SRV executePublish invoked with io.moquette.parser.proto.messages.PublishMessage@54ee28ff
2017-04-26 17:32:38,808 57677 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <MQTT_FX_Client> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:32:38,810 57679 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:32:38,810 57679 [nioEventLoopGroup-3-2] INFO messageLogger - C<-B PUBLISH <MQTT_FX_Client> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:32:38,811 57680 [nioEventLoopGroup-3-1] INFO messageLogger - C<-B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:33:32,071 110940 [nioEventLoopGroup-3-2] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PINGREQ
2017-04-26 17:33:38,803 117672 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PINGREQ

Connect with second client and now receiving two messages:

2017-04-26 17:34:09,652 148521 [nioEventLoopGroup-3-3] INFO messageLogger - C->B CONNECT client
2017-04-26 17:34:09,653 148522 [nioEventLoopGroup-3-3] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type CONNECT
2017-04-26 17:34:09,653 148522 [nioEventLoopGroup-3-3] INFO i.m.spi.impl.ProtocolProcessor - CONNECT for client
2017-04-26 17:34:09,655 148524 [nioEventLoopGroup-3-3] INFO io.moquette.spi.ClientSession - cleaning old saved subscriptions for client
2017-04-26 17:34:09,659 148528 [nioEventLoopGroup-3-3] INFO i.m.spi.impl.ProtocolProcessor - Connection established
2017-04-26 17:34:09,660 148529 [nioEventLoopGroup-3-3] INFO i.m.spi.impl.ProtocolProcessor - CONNECT processed
2017-04-26 17:34:10,994 149863 [nioEventLoopGroup-3-3] INFO messageLogger - C->B SUBSCRIBE to topics [[qos=0, topicFilter='v1.0/Datastreams(1)/Observations']]
2017-04-26 17:34:10,994 149863 [nioEventLoopGroup-3-3] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type SUBSCRIBE
2017-04-26 17:34:10,995 149864 [nioEventLoopGroup-3-3] INFO i.m.spi.impl.ProtocolProcessor - SUBSCRIBE client
2017-04-26 17:34:10,995 149864 [nioEventLoopGroup-3-3] INFO io.moquette.spi.ClientSession - subscribed to the topic filter <v1.0/Datastreams(1)/Observations> with QoS 0 - MOST_ONE
2017-04-26 17:34:15,491 154360 [nioEventLoopGroup-3-1] INFO messageLogger - C->B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,491 154360 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PUBLISH
2017-04-26 17:34:15,491 154360 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - PUB --PUBLISH--> SRV executePublish invoked with io.moquette.parser.proto.messages.PublishMessage@29f37ac3
2017-04-26 17:34:15,492 154361 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,492 154361 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <MQTT_FX_Client> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,492 154361 [nioEventLoopGroup-3-3] INFO messageLogger - C<-B PUBLISH to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,492 154361 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,492 154361 [nioEventLoopGroup-3-2] INFO messageLogger - C<-B PUBLISH <MQTT_FX_Client> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,492 154361 [nioEventLoopGroup-3-1] INFO messageLogger - C<-B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,497 154366 [nioEventLoopGroup-3-1] INFO messageLogger - C->B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,497 154366 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PUBLISH
2017-04-26 17:34:15,497 154366 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - PUB --PUBLISH--> SRV executePublish invoked with io.moquette.parser.proto.messages.PublishMessage@6229e87b
2017-04-26 17:34:15,497 154366 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,497 154366 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <MQTT_FX_Client> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,498 154367 [nioEventLoopGroup-3-3] INFO messageLogger - C<-B PUBLISH to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,498 154367 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,498 154367 [nioEventLoopGroup-3-2] INFO messageLogger - C<-B PUBLISH <MQTT_FX_Client> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:34:15,498 154367 [nioEventLoopGroup-3-1] INFO messageLogger - C<-B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>

Here I disconnect one client and publish one message afterwards:

2017-04-26 17:43:13,651 692520 [nioEventLoopGroup-3-3] INFO messageLogger - C->B DISCONNECT
2017-04-26 17:43:13,652 692521 [nioEventLoopGroup-3-3] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type DISCONNECT
2017-04-26 17:43:13,652 692521 [nioEventLoopGroup-3-3] INFO i.m.spi.impl.ProtocolProcessor - cleaning old saved subscriptions for client
2017-04-26 17:43:13,678 692547 [nioEventLoopGroup-3-3] INFO i.m.spi.impl.ProtocolProcessor - DISCONNECT client finished
2017-04-26 17:43:13,678 692547 [nioEventLoopGroup-3-3] INFO messageLogger - Channel closed
2017-04-26 17:43:15,506 694375 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PINGREQ
2017-04-26 17:43:27,423 706292 [nioEventLoopGroup-3-1] INFO messageLogger - C->B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:43:27,424 706293 [nioEventLoopGroup-3-1] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PUBLISH
2017-04-26 17:43:27,424 706293 [nioEventLoopGroup-3-1] INFO i.m.spi.impl.ProtocolProcessor - PUB --PUBLISH--> SRV executePublish invoked with io.moquette.parser.proto.messages.PublishMessage@170df951
2017-04-26 17:43:27,427 706296 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <MQTT_FX_Client> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:43:27,428 706297 [nioEventLoopGroup-3-1] INFO i.m.s.i.PersistentQueueMessageSender - send publish message to <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> on topic <v1.0/Datastreams(1)/Observations>
2017-04-26 17:43:27,428 706297 [nioEventLoopGroup-3-2] INFO messageLogger - C<-B PUBLISH <MQTT_FX_Client> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:43:27,428 706297 [nioEventLoopGroup-3-1] INFO messageLogger - C<-B PUBLISH <SensorThings API Server (743cf4fd-d470-4cab-a3a9-0ffce6bd76b4)> to topics <v1.0/Datastreams(1)/Observations>
2017-04-26 17:43:32,080 710949 [nioEventLoopGroup-3-2] INFO i.m.server.netty.NettyMQTTHandler - Received a message of type PINGREQ

Another issue is wildcards:

2017-04-26 10:33:05,316 19914433 [nioEventLoopGroup-3-4] INFO io.moquette.spi.ClientSession - <paho/28A7D875E8DC786DAD> subscribed to the topic filter <v1.0/#> with QoS 0 - MOST_ONE
2017-04-26 10:33:05,333 19914450 [pool-2-thread-1] ERROR d.f.i.i.sta.parser.path.PathParser - Failed to parse because (Set loglevel to trace for stack): Lexical error at line 1, column 3. Encountered: after : ""
2017-04-26 10:33:05,335 19914452 [pool-2-thread-1] ERROR d.f.i.i.s.m.s.Subscription - Not a valid path: Path is not valid.

This should normally work:

2017-04-26 17:01:34,424 43223541 [nioEventLoopGroup-3-3] INFO io.moquette.spi.ClientSession - subscribed to the topic filter <v1.0/+/Observations> with QoS 0 - MOST_ONE
2017-04-26 17:01:34,430 43223547 [pool-2-thread-12] ERROR d.f.i.i.sta.parser.path.PathParser - Failed to parse because (Set loglevel to trace for stack): Lexical error at line 1, column 2. Encountered: " " (32), after : ""
2017-04-26 17:01:34,433 43223550 [pool-2-thread-12] ERROR d.f.i.i.s.m.s.Subscription - Not a valid path: Path is not valid.

Thanks in advance

Comparison Filter issue for Observations: $filter=fieldName lt does not return correct values

This seems odd. I have created several observations for one datastream.

Observations that are present for Datastream(1):

  • water temperature at depth 5.5
{
  "phenomenonTime": "2002-11-30T04:23:36Z",
  "resultTime": "2002-11-30T04:23:36Z",
  "result": 5.955,
  "resultQuality": 1,
  "parameters": {
	"positionIndicatorOfArgo": 3,
	"cycleNumber": 71,
	"dataMode": "D",
	"depth": 5.5,
	"depthFlag": 1,
	"depthAdj": 2.2,
	"depthAdjFlag": 1
  },
  "Datastream":{"@iot.id": 1}
}
  • water temperature at depth 9.4
{
  "phenomenonTime": "2002-11-30T04:23:36Z",
  "resultTime": "2002-11-30T04:23:36Z",
  "result": 5.931,
  "resultQuality": 1,
  "parameters": {
	"positionIndicatorOfArgo": 3,
	"cycleNumber": 71,
	"dataMode": "D",
	"depth": 9.4,
	"depthFlag": 1,
	"depthAdj": 6.1,
	"depthAdjFlag": 1
  },
  "Datastream":{"@iot.id": 1}
}

- Queries that are working:

  1. /SensorThingsService/v1.0/Datastreams(1)/Observations?$filter=parameters/depth lt 6
    Expectation : Should return the 1st obs
    Result : Returning the 1st obs

  2. /SensorThingsService/v1.0/Datastreams(1)/Observations?$filter=parameters/depth gt 9
    Expectation : Should return the 2nd obs
    Result : Returning the 2nd obs

- Queries that are not working properly:

  1. /SensorThingsService/v1.0/Datastreams(1)/Observations?$filter=parameters/depth lt 10
    Expectation : Should return both the observations
    Result : Returning none

  2. /SensorThingsService/v1.0/Datastreams(1)/Observations?$filter=parameters/depth gt 10
    Expectation : Should return none
    Result : Returning all the observations

NOTE: This is not specific to the numbers 6 or 10; observations are not reliable for any comparison.

I guess this is not only a problem with this implementation of the SensorThings API; I found another implementation that has the same problem.

Try this link

There, the returned Observations also contain observations with a result field > 2, even though I applied a filter to restrict to result < 2.
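One explanation that fits the observed pattern exactly (a hypothesis, not confirmed against the server code) is that the JSON parameters values are compared as strings rather than as numbers; plain lexicographic comparison reproduces every result above:

# String comparison reproduces the behaviour described above (Python sketch).
print("5.5" < "10", "9.4" < "10")  # False False -> "lt 10" matches nothing
print("5.5" > "10", "9.4" > "10")  # True True   -> "gt 10" matches everything
print("5.5" < "6", "9.4" > "9")    # True True   -> the "working" queries still match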

Do we have support for Location data with altitude?

I tried inserting a Location with altitude, but I can't find any difference in the geometry data inserted in the "Location" table (the altitude is ignored).

Example:

Location JSON without altitude :
{
"name": "IONOrchard-without-Alt",
"encodingType": "application/geo+json",
"description": "IONOrchard-without-Alt",
"location": {
"type": "Polygon",
"coordinates": [
[
[103.83118689, 1.30408219],
[103.83193523, 1.30451928],
[103.83253068, 1.30403392],
[103.83200765, 1.30352175],
[103.83118689, 1.30408219]
]
]
}
}

GEOM value :

0103000020110F00000100000005000000233C4C5BC40B664129C65F68F2B8014140620DC5CE0B66419D8EFDC277BA0141B2AA2D0ED70B6641C013D168C7B801412A4D07C7CF0B6641714DE02CFFB60141233C4C5BC40B664129C65F68F2B80141

Location JSON with altitude:
{
"name": "IONOrchard",
"encodingType": "application/geo+json",
"description": "IONOrchard",
"location": {
"type": "Polygon",
"coordinates": [
[
[103.83118689, 1.30408219, 26],
[103.83193523, 1.30451928, 22],
[103.83253068, 1.30403392, 22],
[103.83200765, 1.30352175, 24],
[103.83118689, 1.30408219, 26]
]
]
}
}

GEOM value :

0103000020110F00000100000005000000233C4C5BC40B664129C65F68F2B8014140620DC5CE0B66419D8EFDC277BA0141B2AA2D0ED70B6641C013D168C7B801412A4D07C7CF0B6641714DE02CFFB60141233C4C5BC40B664129C65F68F2B80141

Filtering issue for Things: $filter=properties/fieldname eq value does not work

This is probably a trivial question, but I need a bit of help to get further.

I have created several Things on the server that one of my colleagues has built and set up. One of the properties I have added for the Things is a field named "template".

An example Thing has the following JSON content:

{
  "name": "My Thing",
  "description": "A Test Thing",
  "properties": {
    "TestModel": "XXX-v1.0",
    "name": "My models name",
    "type": "type of smart sensor",
    "supplier": "Blenda Corp",
    "template": "true"
  },
  "[email protected]": "http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0/Things(9)/Locations",
  "[email protected]": "http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0/Things(9)/HistoricalLocations",
  "[email protected]": "http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0/Things(9)/Datastreams",
  "[email protected]": "http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0/Things(9)/MultiDatastreams",
  "@iot.id": 9,
  "@iot.selfLink": "http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0/Things(9)"
}

I now want to filter my request for Things, so I only get those where the property template is true.

From reading OData documentation I expect to be able to GET the following URL:

http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0/Things(9)?$filter=properties/template eq 'true'

However, the response I get from the SensorThings API server is
"Unexpected 'I'"

Am I doing something wrong?
Is there some configuration lacking? (filtering works fine on top level fields like name)
Is this a bug?
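One thing that may be worth checking (an aside, not a confirmed diagnosis): $filter is defined for collections, so querying /Things rather than the single entity /Things(9) might behave differently; a sketch with the Python requests library, which also percent-encodes the filter string:

import requests

BASE = "http://sthing-app.falster.io:8080/SensorThingsServer-1.0/v1.0"
params = {"$filter": "properties/template eq 'true'"}
r = requests.get(BASE + "/Things", params=params)
print(r.status_code, r.text)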

Hoping somebody can help here. Thanks!

Representation of 'unknown' location using GeoJSON

We need to define "unknown" location in "Location" table. Currently we are representing like below

Sample 'unknown' Location JSON:
{
"name": "unknown",
"encodingType": "application/geo+json",
"description": "unknown",
"location": {
"type":"Polygon",
"coordinates": [
[
[0,0],
[0,0],
[0,0],
[0,0],
[0,0]
]
]
}
}

I've thought about using an empty exterior ring:
{
"type": "Polygon",
"coordinates": [
[]
]
}
But I am getting the error below from STA:

Invalid geoJson: Invalid Polygon: Empty linear ring is not valid.

Any suggestions for a standard way to specify an unknown location?

Filter on [email protected]

Using Datastreams?$expand=Observations, I would like to order by or filter by the number of Observations in each Datastream. The goal is to select the Datastreams with the highest number of Observations. Although there is a key "[email protected]", the following approaches do not work:
/v1.0/Datastreams?$expand=Observations&$filter=[email protected] gt 5

/v1.0/Datastreams?$expand=Observations&$orderby=[email protected]

Any suggestions?
