
Linked Connections Server


An Express-based Web Server that exposes Linked Connections data fragments in the JSON-LD serialization format. It also provides a built-in tool to parse GTFS and GTFS Realtime transport dataset feeds into a Linked Connections Directed Acyclic Graph using GTFS2LC, and to fragment it according to a configurable, predefined size.

Installation

First make sure you have Node 11.7.x or higher installed. To install the server, proceed as follows:

git clone https://github.com/julianrojas87/linked-connections-server.git
cd linked-connections-server
npm install

Configuration

The configuration is done through two config files: one defines the Web Server parameters (server_config.json) and the other defines the different data sources that will be managed and exposed through the Linked Connections Server (datasets_config.json). Below you can find an example and a description of each config file.

Web Server configuration

As mentioned above, the Web Server is configured through the server_config.json file, which uses the JSON format and defines the following properties:

  • hostname: Used to define the Web Server host name. This is a mandatory parameter.

  • port: TCP/IP port used by the Web Server to receive requests. This is a mandatory parameter.

  • protocol: Used to define the protocol accepted by the Web Server, which can be either HTTP or HTTPS. If both protocols are supported, there is no need to define this parameter, but all requests made to the server MUST then contain the X-Forwarded-Proto header stating the protocol being used. This is useful when the server is used together with cache management servers.

  • logLevel: Used to define the logging level of the server. We use the Winston library to manage logs. If not specified, the default level is info.

This is a configuration example:

{
    "hostname": "localhost:3000",
    "port": 3000,
    "protocol": "http" // or https
    "logLevel": "info" //error, warn, info, verbose, debug, silly
}
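As an illustration of the rules above, the mandatory checks and the logLevel default could be sketched like this (a hypothetical helper, not part of the server's code):

```javascript
// Hypothetical sketch: validate a parsed server_config.json object.
function validateServerConfig(config) {
  if (!config.hostname) throw new Error('hostname is a mandatory parameter');
  if (!Number.isInteger(config.port)) throw new Error('port is a mandatory parameter');
  // protocol may be omitted when both HTTP and HTTPS are supported;
  // clients must then send the X-Forwarded-Proto header.
  return { logLevel: 'info', ...config }; // default logLevel is "info"
}

const cfg = validateServerConfig({ hostname: 'localhost:3000', port: 3000 });
console.log(cfg.logLevel); // info
```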

Datasets configuration

The Web Server does not provide any functionality by itself; it needs at least one dataset (in GTFS format) that can be downloaded, processed and exposed as Linked Connections. To tell the server where to find and store such datasets, we use the datasets_config.json config file. All the parameters in this config file are mandatory; otherwise the server won't function properly. This file contains the following parameters:

  • storage: The path where the server stores and looks for the data fragments created from the different datasets. This should not include a trailing slash. Make sure you have enough disk space to store and process datasets.

  • sortMemory: Maximum amount of RAM that may be used by the Linked Connections sorting process. Default is 2G.

  • organization: URI and name of the data publisher.

  • keywords: Related keywords for a given dataset. E.g., types of vehicles available.

  • companyName: Name of the transport company that provides the GTFS dataset feed.

  • geographicArea: GeoNames URI that represents the geographic area served by the public transport provider.

  • downloadUrl: URL where the GTFS dataset feed can be downloaded.

  • downloadOnLaunch: Boolean parameter that indicates if the GTFS feed is to be downloaded and processed upon server launch.

  • updatePeriod: Cron expression that defines how often the server should look for and process a new version of the dataset. We use the node-cron library for this.

  • fragmentSize: Defines the maximum number of connections per data fragment.

  • realTimeData: If available, here we define all the parameters related to a GTFS-RT feed.

    • downloadUrl: Here we define the URL to download the GTFS-RT data feed.
    • headers: Some GTFS-RT feeds require API keys to be accessed.
    • updatePeriod: Cron expression that defines how often the server should look for and process a new version of the real-time data. We use the node-cron library for this.
    • fragmentTimeSpan: This defines the fragmentation of real-time data. It represents the time span of every fragment in seconds.
    • compressionPeriod: Cron expression that defines how often the real-time data will be compressed using gzip in order to reduce storage consumption.
    • indexStore: Indicates where the required static indexes (routes, trips, stops and stop_times) will be stored while processing GTFS-RT updates. MemStore for RAM and KeyvStore for disk.
    • deduce: If the GTFS-RT feed does not provide an explicit tripId for every update, set this parameter to true, so trips can be identified using additional GTFS indexes.
  • baseURIs: Here we define the URI templates used to create the unique identifiers of each of the entities found in the Linked Connections. It is necessary to define URIs for Connections, Stops, Trips and Routes. This is the only optional parameter; if it is not defined, all base URIs will follow a http://example.org/ pattern, but we recommend always using dereferenceable URIs. Follow the RFC 6570 specification to define your URIs using the column names of the routes and trips GTFS source files. See an example next.

{
    "storage": "/opt/linked-connections-data", //datasets storage path
    "sortMemory": "4G",
    "organization": {
        "id": "https://...",
        "name": "Organization name"
    },
    "datasets":[
        {
            "companyName": "companyX",
            "keywords": ["Keyword1", "Keyword2"],
            "geographicArea": "http://sws.geonames.org/...", // Geo names URI
            "downloadUrl": "https://...",
            "downloadOnLaunch": false,
            "updatePeriod": "0 0 3 * * *", //every day at 3 am
            "fragmentSize": 1000, // 1000 connections/fragment
            "realTimeData": {
                "downloadUrl": "https://...",
                "headers": { "apiKeyHttpHeader": "my_api_key" },
                "updatePeriod": "*/30 * * * * *", //every 30s
                "fragmentTimeSpan": 600, // 600 seconds
                "compressionPeriod": "0 0 3 * * *", // Every day at 3 am
                "indexStore": "MemStore", // MemStore for RAM and LevelStore for disk processing
                "deduce": true // Set true only if the GTFS-RT feed does not provide tripIds
            },
            "baseURIs": {
                "stop": "http://example.org/stops/{stop_id}",
                "route": "http://example.org/routes/{routes.route_id}",
                "trip": "http://example.org/trips/{routes.route_id}/{trips.startTime(yyyyMMdd)}",
                "connection:" 'http://example.org/connections/{routes.route_id}/{trips.startTime(yyyyMMdd)}{connection.departureStop}'
            }
        },
        {
            "companyName": "companyY",
            "keywords": ["Keyword1", "Keyword2"],
            "geographicArea": "http://sws.geonames.org/...", // Geo names URI
            "downloadUrl": "http://...",
            "downloadOnLaunch": false,
            "updatePeriod": "0 0 3 * * *", //every day at 3am
            "baseURIs": {
                "stop": "http://example.org/stops/{stop_id}",
                "route": "http://example.org/routes/{routes.route_id}",
                "trip": "http://example.org/trips/{routes.route_id}/{trips.startTime(yyyyMMdd)}",
                "connection:" 'http://example.org/connections/{routes.route_id}/{trips.startTime(yyyyMMdd)}{connection.departureStop}'
            }
        }
    ]
}

Note that when defining the URI templates you can use the connection entity, which consists of a departureStop, a departureTime, an arrivalStop and an arrivalTime. We have also noticed that using the start time of a trip (trips.startTime) is a good practice to uniquely identify trips or even connections. When using any of the time variables you can define a specific format (see here), as shown in the previous example.
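To illustrate how such a template is resolved, here is a minimal expansion sketch limited to simple {name} substitution (a hypothetical helper; it implements neither the full RFC 6570 specification nor the time-format suffix such as (yyyyMMdd)):

```javascript
// Minimal URI template expansion: replace each {name} with the
// percent-encoded value from the variables object.
function expandTemplate(template, vars) {
  return template.replace(/\{([^}]+)\}/g, (_, name) =>
    encodeURIComponent(vars[name] ?? ''));
}

console.log(expandTemplate('http://example.org/stops/{stop_id}', { stop_id: '8812005' }));
// → http://example.org/stops/8812005
```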

Run it

Once you have properly configured the server you can run the data fetching and the Web server separately:

cd linked-connections-server
node bin/datasets # Data fetching
node bin/web-server # Linked Connections Web server

Use it

To use it, make sure you already have at least one fully processed dataset (the logs will tell you when). Then you can query the Linked Connections using the departure time as a parameter, for example:

http://localhost:3000/companyX/connections?departureTime=2017-08-11T16:45:00.000Z

If available, the server will redirect you to the Linked Connections fragment that contains connections with departure times as close as possible to the requested one.
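The redirect behavior just described can be modeled as follows (an assumed sketch, not the server's actual implementation): among the sorted start times of the stored fragments, pick the latest fragment that starts at or before the requested departure time.

```javascript
// Assumed model of the fragment lookup: the latest fragment whose start time
// is at or before the requested departure time contains the connection.
function findFragment(fragmentStarts, departureTime) {
  const t = Date.parse(departureTime);
  let candidate = fragmentStarts[0];
  for (const start of fragmentStarts) {
    if (Date.parse(start) <= t) candidate = start;
    else break;
  }
  return candidate;
}

const starts = [
  '2017-08-11T16:00:00.000Z',
  '2017-08-11T16:30:00.000Z',
  '2017-08-11T17:00:00.000Z'
];
console.log(findFragment(starts, '2017-08-11T16:45:00.000Z'));
// → 2017-08-11T16:30:00.000Z
```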

The server also publishes the stops and routes of every configured GTFS data source:

http://localhost:3000/companyX/stops
http://localhost:3000/companyX/routes

A DCAT catalog describing all datasets of a certain company can be obtained like this:

http://localhost:3000/companyX/catalog

Historic Data

The server also allows querying historic data by means of the Memento Framework, which enables time-based content negotiation over HTTP. By using the Accept-Datetime header a client can request the state of a resource at a given moment. If such a version exists, the server will respond with a 302 Found containing the URI of the stored version of that resource. For example:

curl -v -L -H "Accept-Datetime: 2017-10-06T13:00:00.000Z" http://localhost:3000/companyX/connections?departureTime=2017-10-06T15:50:00.000Z

> GET /companyX/connections?departureTime=2017-10-06T15:50:00.000Z HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.52.1
> Accept: */*
> Accept-Datetime: 2017-10-06T13:00:00.000Z

< HTTP/1.1 302 Found
< X-Powered-By: Express
< Access-Control-Allow-Origin: *
< Location: /memento/companyX?version=2017-10-28T03:07:47.000Z&departureTime=2017-10-06T15:50:00.000Z
< Vary: Accept-Encoding, Accept-Datetime
< Link: <http://localhost:3000/companyX/connections?departureTime=2017-10-06T15:50:00.000Z>; rel="original timegate"
< Date: Mon, 13 Nov 2017 15:00:36 GMT
< Connection: keep-alive
< Content-Length: 0

> GET /memento/companyX?version=2017-10-28T03:07:47.000Z&departureTime=2017-10-06T15:50:00.000Z HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/7.52.1
> Accept: */*
> Accept-Datetime: 2017-10-06T13:00:00.000Z

< HTTP/1.1 200 OK
< X-Powered-By: Express
< Memento-Datetime: Fri, 06 Oct 2017 13:00:00 GMT
< Link: <http://localhost:3000/companyX/connections?departureTime=2017-10-06T15:50:00.000Z>; rel="original timegate"
< Access-Control-Allow-Origin: *
< Content-Type: application/ld+json; charset=utf-8
< Content-Length: 289915
< ETag: W/"46c7b-TOdDIcDjCvUXTC/gzqr5hxVDZjg"
< Date: Mon, 13 Nov 2017 15:00:36 GMT
< Connection: keep-alive

The previous example shows a request made to obtain the Connections fragment identified by the URL http://localhost:3000/companyX/connections?departureTime=2017-10-06T15:50:00.000Z, but specifically the state of this fragment as it was at Accept-Datetime: 2017-10-06T13:00:00.000Z. This makes it possible to know what the state of the delays was at 13:00 for the departures at 15:50 on 2017-10-06.

Authors

Julian Rojas - [email protected]
Pieter Colpaert - [email protected]

Contributors

arnotroch, bertware, dylanvanassche, greenkeeper[bot], julianrojas87


linked-connections-server's Issues

graph.irail.be/sncb/stops does not have translated names

http://irail.be/stations has French, Dutch and English names (where applicable) for most of the stops.

However, http://graph.irail.be/sncb/stops only gives the French names. As the latter should be the preferred source (the former doesn't have all the stops), it would be useful to include the translated names as well.

GTFS file sorting includes lock file

When calling readdir in getLatestGtfsSource, the lock file is also included. This file doesn't end in .zip, causing its entire name to be interpreted as a Date.

https://github.com/julianrojas87/linked-connections-server/blob/master/lib/utils/utils.js#L493

2018-06-22T12:13:31.613Z - error:  RangeError: Invalid time value
    at Date.toISOString (<anonymous>)
    at Utils.getLatestGtfsSource (linked-connections-server/lib/utils/utils.js:504:125)
    at <anonymous>

This doesn't seem to have a negative impact on correct operation, but should be resolved nonetheless. (Seems like low priority to me.)

Weird bug in real-time processing of SNCB data

https://graph.irail.be/sncb/connections?departureTime=2018-10-31T07:05:00.000Z
→ in this page, this connection: http://irail.be/connections/8891553/20181031/L558 takes a long time to do a short trip...

This is the actual trip: https://irail.be/vehicle/L558/20181031 (8:06 to 8:13)

The connection:

"@id": "http://irail.be/connections/8891553/20181031/L558"
"@type": "Connection"
arrivalStop: "http://irail.be/stations/NMBS/008891629"
arrivalTime: "2018-10-31T09:28:00.000Z"
departureStop: "http://irail.be/stations/NMBS/008891553"
departureTime: "2018-10-31T07:06:00.000Z"
direction: "Malines"
"gtfs:dropOffType": "gtfs:Regular"
"gtfs:pickupType": "gtfs:Regular"
"gtfs:route": "http://irail.be/vehicle/L558"
"gtfs:trip": "http://irail.be/vehicle/L558/20181031"

Wrong arrivalTime in connection

https://graph.irail.be/sncb/connections?departureTime=2019-01-10T08:16:00.000Z

connection ID: http://irail.be/connections/8896008/20190110/IC409

This connection (Kortrijk - Harelbeke) has the wrong arrivalTime, that of Welkenraedt.

{
@id: "http://irail.be/connections/8896008/20190110/IC409",
@type: "Connection",
departureStop: "http://irail.be/stations/NMBS/008896008",
arrivalStop: "http://irail.be/stations/NMBS/008896115",
departureTime: "2019-01-10T08:17:00.000Z",
arrivalTime: "2019-01-10T11:11:00.000Z",
gtfs:trip: "http://irail.be/vehicle/IC409/20190110",
gtfs:route: "http://irail.be/vehicle/IC409",
direction: "Welkenraedt",
gtfs:pickupType: "gtfs:Regular",
gtfs:dropOffType: "gtfs:Regular",
departureDelay: 0,
arrivalDelay: 0
}

Differential LC generation for new static version updates

Make the static LC generation process store only differential updates with respect to the latest available version (if any), and emit lc:UnscheduledConnections on the LDES for removed Connections. Also make sure that overnight trips spanning the "cutoff date" are properly updated in the new version.

An in-range update of gtfsrt2lc is breaking the build 🚨

The dependency gtfsrt2lc was updated from 1.2.0 to 1.2.1.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

gtfsrt2lc is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details

Commits

The new version differs by 7 commits.

  • eb1a2f7 Merge pull request #37 from linkedconnections/development
  • 7cd5160 1.2.1
  • 35027ca Fix for date-fns last version syntax
  • eaad9e1 Merge pull request #36 from linkedconnections/master
  • e8953d7 Merge pull request #35 from linkedconnections/greenkeeper/initial
  • 510359a chore(package): update lockfile package-lock.json
  • 467fb1d chore(package): update dependencies

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

An in-range update of fast-csv is breaking the build 🚨

The dependency fast-csv was updated from 4.0.3 to 4.1.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

fast-csv is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details

Commits

The new version differs by 4 commits.

  • 682710d v4.1.0
  • b9dd314 Merge pull request #327 from C2FO/v4.1.0-rc
  • 22e4fb7 Added benchmarks for files of 1000 and 10000
  • c0d8f72 Added headers event #321

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Memento and next page strategy

Current state

If you follow next page links, the page may already be out of date by the time you fetch it. The Linked Connections Server needs an Accept-Datetime request header in order to reliably serve the right next page.

This means that in some cases, a wrong next page will be returned.

Suggested solution

Slightly redesign the next page strategy… When requesting a page:

  • Redirect to the right ?departureTime=... page
  • add a Content-Location header to the response which indicates the URL of this archived page

[High prio] Accept valid ISO8601 date stamps

Currently only dates formatted in UTC are accepted. Alternative timezone offsets, which are valid in ISO 8601, should also be accepted.

Complete date plus hours, minutes and seconds:
YYYY-MM-DDThh:mm:ssTZD (eg 1997-07-16T19:20:30+01:00)
Complete date plus hours, minutes, seconds and a decimal fraction of a
second
YYYY-MM-DDThh:mm:ss.sTZD (eg 1997-07-16T19:20:30.45+01:00)

where:

 YYYY = four-digit year
 MM   = two-digit month (01=January, etc.)
 DD   = two-digit day of month (01 through 31)
 hh   = two digits of hour (00 through 23) (am/pm NOT allowed)
 mm   = two digits of minute (00 through 59)
 ss   = two digits of second (00 through 59)
 s    = one or more digits representing a decimal fraction of a second
 TZD  = time zone designator (Z or +hh:mm or -hh:mm)

https://www.w3.org/TR/NOTE-datetime

Currently UTC is a hardcoded requirement, even though the date might parse correctly:
https://github.com/julianrojas87/linked-connections-server/blob/master/src/routes/page-finder.js#L37

Example:
https://graph.irail.be/sncb/connections?departureTime=2018-03-09T12:00:00.000+01:00
won't be accepted and will be rewritten to the actual time
https://graph.irail.be/sncb/connections?departureTime=2018-03-09T18:20:00.000Z
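One way to implement the requested behavior, sketched as a hypothetical helper: parse any valid ISO 8601 offset and normalize it to UTC, rather than requiring a literal Z suffix.

```javascript
// Accept any ISO 8601 offset and normalize to UTC.
function normalizeDepartureTime(value) {
  const ms = Date.parse(value);        // Date.parse handles ±hh:mm offsets
  if (Number.isNaN(ms)) throw new Error('Invalid departureTime: ' + value);
  return new Date(ms).toISOString();   // always ends in Z (UTC)
}

console.log(normalizeDepartureTime('2018-03-09T12:00:00.000+01:00'));
// → 2018-03-09T11:00:00.000Z
```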

Doesn't work on windows

The datasets script doesn't work on Windows, as the path contains characters like : that are illegal in Windows file names.
In order to achieve maximum interoperability, it might be a good idea to find a way around this, e.g. by using dashes, underscores and dots as separators.
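The suggested workaround could be sketched as follows (hypothetical helper, not part of the server):

```javascript
// Replace the ':' characters, which are illegal in Windows file names,
// with dashes, as suggested above.
function safeFileName(isoDate) {
  return isoDate.replace(/:/g, '-');
}

console.log(safeFileName('2018-04-07T17:34:00.000Z'));
// → 2018-04-07T17-34-00.000Z
```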

Connections travel back in time

As mentioned earlier on Slack, some connections travel back in time.

For example, in https://graph.irail.be/sncb/connections?departureTime=2019-10-31T09:23:00.000Z, the connection with id http://irail.be/connections/8813003/20191031/S11981 seems to depart one minute before it arrives.

Note that this is probably related to incorrect delays.

As reference, the data I got from the server is copied below:

@id "http://irail.be/connections/8813003/20191031/S11981"
@type "Connection"
departureStop "http://irail.be/stations/NMBS/008813003"
arrivalStop "http://irail.be/stations/NMBS/008813037"
departureTime "2019-10-31T09:25:00.000Z"
arrivalTime "2019-10-31T09:24:00.000Z"
departureDelay 420
arrivalDelay 300
direction "Nivelles"
gtfs:trip "http://irail.be/vehicle/S11981/20191031"
gtfs:route "http://irail.be/vehicle/S11981"
gtfs:pickupType "gtfs:Regular"
gtfs:dropOffType "gtfs:NotAvailable"
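A sanity check for the inconsistency above could be sketched as (a hypothetical helper, not part of the server): after delays are applied, a connection's arrival must not precede its departure.

```javascript
// A connection is consistent only if it arrives at or after its departure.
function isConsistent(conn) {
  return Date.parse(conn.arrivalTime) >= Date.parse(conn.departureTime);
}

console.log(isConsistent({
  departureTime: '2019-10-31T09:25:00.000Z',
  arrivalTime: '2019-10-31T09:24:00.000Z'
})); // false
```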

Use correct ids to identify trips and vehicles, include direction

Currently a departure contains the following information:

> "@id": "#1510063200000884906488____%3A007%3A%3A8846201%3A8400219%3A3%3A1501%3A20171208",
> "@type": "Connection",
> "departureStop": "http://irail.be/stations/NMBS/008849064",
> "arrivalStop": "http://irail.be/stations/NMBS/008400219",
> "departureTime": "2017-11-07T14:00:00.000Z",
> "arrivalTime": "2017-11-07T14:01:00.000Z",
> "gtfs:trip": "http://irail.be/trips/88____%3A007%3A%3A8846201%3A8400219%3A3%3A1501%3A20171208",
> "gtfs:route": "http://irail.be/routes/327
  • @id should be formatted as a departureConnection id: http://irail.be/connections/<station hafas id>/<yyyymmdd>/<train id>, for example http://irail.be/connections/8892007/20171107/IC532. Hafas id is simply the station id without the leading zeros.

  • gtfs:trip should be the trip id, which is a train trip on a certain day (correct me if I'm wrong): http://irail.be/vehicle/{VehicleShortName}/{yyyymmdd}, where VehicleShortName is for example IC532

  • gtfs:route should identify a route, with an identifier in the format http://irail.be/vehicle/{VehicleShortName}

  • A field 'direction' should be included, containing a string which identifies the station name where this train is heading.

Fix semantics of Stops and Routes

  • Use rdfs:label instead of schema:name.
  • Change gtfs:Entrance_Exit to gtfs:EntranceOrExit.
  • Remove gtfs:BoardingArea.
  • Remove gtfs:route from Connections.
  • Publish Trips and Shapes.

Don't use inconsistent date for folder naming in RT data

The realtime folder structure contains folders with names like '2018_4_12'. These names are hard to generate, especially now that we don't use a fixed time interval anymore.
Instead, these folders should be named '2018-04-12', identical to the first part of each file they contain.
In other words, the folder name should be the shared prefix of every file it contains.

Example:
I have the static file 2018-04-07T17:34:00.000Z.jsonld.gz. I want to get the realtime data for this file.

I need to navigate to /real_time/{last_version}/{special_date}/2018-04-07T17:34:00.000Z.jsonld.
last_version is identical to the version field I used in the static file path. This is perfect; I can just reuse the same string.
However, special_date requires me to take a substring of the filename, parse it as a date, and print it as another date. By using the prefix of the file, I could just take a 10-character substring.
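With the proposed naming, deriving the folder from a fragment file name becomes a plain prefix operation (hypothetical helper):

```javascript
// The folder name is the 10-character date prefix shared by every file it contains.
function folderFor(fileName) {
  return fileName.slice(0, 10);
}

console.log(folderFor('2018-04-07T17:34:00.000Z.jsonld.gz'));
// → 2018-04-07
```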

Update gtfs2lc

gtfs2lc has been updated to 0.8.2. It contains some critical bug fixes. Can you update your linked-connections-server?

utils.js is not stateless

The utils class, which contains a lot of the logic for the web server, is not stateless, making it hard to reuse its functions. In particular, staticFragments creates a tight coupling to these functions, making it hard to use a different data structure for storing the fragments. Removing this from the state would at least make it clear that the utils functions depend on this object.

Action required: Greenkeeper could not be activated 🚨

🚨 You need to enable Continuous Integration on Greenkeeper branches of this repository. 🚨

To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because it uses your CI build statuses to figure out when to notify you about breaking changes.

Since we didn’t receive a CI status on the greenkeeper/initial branch, it’s possible that you don’t have CI set up yet. We recommend using Travis CI, but Greenkeeper will work with every other CI service as well.

If you have already set up a CI for this repository, you might need to check how it’s configured. Make sure it is set to run on all new branches. If you don’t want it to run on absolutely every branch, you can whitelist branches starting with greenkeeper/.

Once you have installed and configured CI on this repository correctly, you’ll need to re-trigger Greenkeeper’s initial pull request. To do this, please click the 'fix repo' button on account.greenkeeper.io.

Problem with etag header

When returning the ETag header, take the content-type into account and return a different ETag for every format.

[Low priority] Object can appear twice

On graph.irail.be, the following excerpt can be found in the result for https://graph.irail.be/sncb/connections?departureTime=2018-02-15T13:40:00.000Z

  {
    "@id": "http:\/\/irail.be\/connections\/8821717\/20180215\/IC4314",
    "@type": "Connection",
    "departureStop": "http:\/\/irail.be\/stations\/NMBS\/008821717",
    "arrivalStop": "http:\/\/irail.be\/stations\/NMBS\/008832458",
    "departureTime": "2018-02-15T13:43:00.000Z",
    "arrivalTime": "2018-02-15T13:49:00.000Z",
    "direction": "Hamont",
    "gtfs:trip": "http:\/\/irail.be\/vehicle\/IC4314\/20180215",
    "gtfs:route": "http:\/\/irail.be\/vehicle\/IC4314",
    "gtfs:pickupType": "gtfs:Regular",
    "gtfs:dropOffType": "gtfs:Regular"
  },
  {
    "@id": "http:\/\/irail.be\/connections\/8821717\/20180215\/IC4314",
    "@type": "Connection",
    "departureStop": "http:\/\/irail.be\/stations\/NMBS\/008821717",
    "arrivalStop": "http:\/\/irail.be\/stations\/NMBS\/008832458",
    "departureTime": "2018-02-15T13:43:00.000Z",
    "arrivalTime": "2018-02-15T13:49:00.000Z",
    "direction": "Hasselt",
    "gtfs:trip": "http:\/\/irail.be\/vehicle\/IC4314\/20180215",
    "gtfs:route": "http:\/\/irail.be\/vehicle\/IC4314",
    "gtfs:pickupType": "gtfs:Regular",
    "gtfs:dropOffType": "gtfs:Regular"
  }

Only one of these should be shown, as they have the same @id.
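The deduplication suggested above could be sketched as follows (a hypothetical helper that keeps the first connection seen for each @id):

```javascript
// Drop connections whose @id was already seen earlier in the fragment.
function dedupeById(connections) {
  const seen = new Set();
  return connections.filter(c => !seen.has(c['@id']) && seen.add(c['@id']));
}

const result = dedupeById([
  { '@id': 'http://irail.be/connections/8821717/20180215/IC4314', direction: 'Hamont' },
  { '@id': 'http://irail.be/connections/8821717/20180215/IC4314', direction: 'Hasselt' }
]);
console.log(result.length); // 1
```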

context: missing gtfs:Regular and others

The URI definitions for gtfs:Regular and others are missing from the @context. When parsing the JSON-LD, the prefixes are currently kept as-is, which results in wrong Linked Data.

how to use with sparse/query-only data sources?

I want to build Linked Connections wrapping sparse data sources, which I need to query for connections on demand. This means that:

  • I need to be able to decide how to fetch the sparse data.
  • There is nothing stored as files.
  • The Linked Connections server should only let me fetch whatever is needed to answer the client's query.

How would this work with linked-connections-server?

URIs from GTFS are escaped

When the original GTFS file contains URIs, those URIs are escaped when the GTFS is converted into Linked Connections.

The following configuration

{
    "companyName": "dsb",
    "keywords": ["Train", "Stops"],
    "geographicArea": "http://sws.geonames.org/2623032",
    "downloadUrl": "./gtfs/DSB.zip",
    "downloadOnLaunch": false,
    "updatePeriod": "0 0 0 0 1 *",
    "fragmentSize": 500000,
    "realTimeData": false,
    "baseURIs": {
    "stop": "{connection.departureStop}",
    "connection": "http://dk.lc.bertmarcelis.be/connections/dsb/{connection.departureTime(YYYYMMDD)}/{connection.$
    "trip": "http://dk.lc.bertmarcelis.be/vehicle/dsb/{trips.trip_id}/{connection.departureTime(YYYYMMDD)}",
    "route": "http://dk.lc.bertmarcelis.be/routes/dsb/{routes.route_id}"
            }

results in

 {
      "@id": "http://dk.lc.bertmarcelis.be/connections/dsb/20181123/http%3A%2F%2Fdk.lc.bertmarcelis.be%2F000008600566/55407198",
      "@type": "Connection",
      "departureStop": "http%3A%2F%2Fdk.lc.bertmarcelis.be%2F000008600566",
      "arrivalStop": "http%3A%2F%2Fdk.lc.bertmarcelis.be%2F000008600567",
      "departureTime": "2018-11-23T18:43:00.000Z",
      "arrivalTime": "2018-11-23T18:44:00.000Z",
      "gtfs:trip": "http://dk.lc.bertmarcelis.be/vehicle/dsb/55407198/20181123",
      "gtfs:route": "http://dk.lc.bertmarcelis.be/routes/dsb/11748_2",
      "direction": "Svendborg St.",
      "gtfs:pickupType": "gtfs:Regular",
      "gtfs:dropOffType": "gtfs:Regular"
    },

The departureStop should instead be http://dk.lc.bertmarcelis.be/000008600566.

This is an issue, as the GTFS file contains stops from 3 different base URIs, meaning that the full URIs have to be defined in the GTFS file. A suggested solution would be to not encode values when they are at the first position in the template (i.e. at the position where you'd expect http), to only encode them when the template starts with http, or to pass the base URIs as a nested object which specifies whether encoding should be disabled, e.g. {"format": "{...}", "escape": false}.
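One of the workarounds suggested above could be sketched like this (a hypothetical helper; the substitute function and its rule are illustrative, not the project's actual template engine): skip percent-encoding when the substituted value is itself an absolute http(s) URI.

```javascript
// Substitute one template variable; absolute http(s) URIs are left unencoded,
// all other values are percent-encoded.
function substitute(template, name, value) {
  const isUri = /^https?:\/\//.test(value);
  return template.replace('{' + name + '}', isUri ? value : encodeURIComponent(value));
}

console.log(substitute('{stop}', 'stop', 'http://dk.lc.bertmarcelis.be/000008600566'));
// → http://dk.lc.bertmarcelis.be/000008600566
```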
