
interpolation's Introduction

A modular, open-source search engine for our world.

Pelias is a geocoder powered completely by open data, available freely to everyone.

Local Installation · Cloud Webservice · Documentation · Community Chat

What is Pelias?
Pelias is a search engine for places worldwide, powered by open data. It turns addresses and place names into geographic coordinates, and turns geographic coordinates into places and addresses. With Pelias, you’re able to turn your users’ place searches into actionable geodata and transform your geodata into real places.

We think open data, open source, and open strategy win over proprietary solutions at any part of the stack and we want to ensure the services we offer are in line with that vision. We believe that an open geocoder improves over the long-term only if the community can incorporate truly representative local knowledge.

Pelias

A modular, open-source geocoder built on top of Elasticsearch for fast and accurate global search.

What's a geocoder do anyway?

Geocoding is the process of taking input text, such as an address or the name of a place, and returning a latitude/longitude location on the Earth's surface for that place.

(animation: forward geocoding demo)

... and a reverse geocoder, what's that?

Reverse geocoding is the opposite: returning a list of places near a given latitude/longitude point.

(animation: reverse geocoding demo)

What are the most interesting features of Pelias?

  • Completely open-source and MIT licensed
  • A powerful data import architecture: Pelias supports many open-data projects out of the box but also works great with private data
  • Support for searching and displaying results in many languages
  • Fast and accurate autocomplete for user-facing geocoding
  • Support for many result types: addresses, venues, cities, countries, and more
  • Modular design, so you don't need to be an expert in everything to make changes
  • Easy installation with minimal external dependencies

What are the main goals of the Pelias project?

  • Provide accurate search results
  • Work equally well for a small city and the entire planet
  • Be highly configurable, so different use cases can be handled easily and efficiently
  • Provide a friendly, welcoming, helpful community that takes input from people all over the world

Where did Pelias come from?

Pelias was created in 2014 as an early project at Mapzen. After Mapzen's shutdown in 2017, Pelias became a project of the Linux Foundation.

How does it work?

Magic! (Just kidding) Like any geocoder, Pelias combines full text search techniques with knowledge of geography to quickly search over many millions of records, each representing some sort of location on Earth.

The Pelias architecture has three main components and several smaller pieces.

A diagram of the Pelias architecture.

Data importers

The importers filter, normalize, and ingest geographic datasets into the Pelias database. Currently there are six officially supported importers:

  • openstreetmap: imports points of interest and addresses from OpenStreetMap
  • openaddresses: imports addresses from OpenAddresses
  • whosonfirst: imports administrative areas and venues from Who's on First
  • geonames: imports places from GeoNames
  • csv-importer: imports arbitrary data from properly formatted CSV files
  • polylines: imports street geometries from road network data

We are always discussing supporting additional datasets. Pelias users can also write their own importers, for example to import proprietary data into your own instance of Pelias.

Database

The underlying datastore that does most of the query heavy-lifting and powers our search results. We use Elasticsearch. Currently versions 7 and 8 are supported.

We've built a tool called pelias-schema that sets up Elasticsearch indices properly for Pelias.

Frontend services

This is where the actual geocoding process happens, and includes the components that users interact with when performing geocoding queries. The services are:

  • API: The API service defines the Pelias API, and talks to Elasticsearch or other services as needed to perform queries.
  • Placeholder: A service built specifically to capture the relationship between administrative areas (a catch-all term for places like cities, states, and countries). Elasticsearch does not handle relational data very well, so we built Placeholder specifically to manage this piece.
  • PIP: For reverse geocoding, it's important to be able to perform point-in-polygon (PIP) calculations quickly. The PIP service is very good at quickly determining which admin area polygons a given point lies in.
  • Libpostal: Pelias uses the libpostal project for parsing addresses using the power of machine learning. We use a Go service built by the Who's on First team to make this happen quickly and efficiently.
  • Interpolation: This service knows all about addresses and streets. With that knowledge, it is able to supplement the known addresses that are stored directly in Elasticsearch and return fairly accurate estimated address results for many more queries than would otherwise be possible.
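
Once a local installation is running, these services sit behind the API on a single HTTP endpoint. As a rough sketch (host and port are illustrative and depend on your configuration; 4000 is a common default for the API), the three main query types look like this:

$ # forward geocoding: text in, coordinates out
$ curl -s "http://localhost:4000/v1/search?text=portland,+or"
$ # reverse geocoding: coordinates in, places out
$ curl -s "http://localhost:4000/v1/reverse?point.lat=45.52&point.lon=-122.67"
$ # autocomplete: partial text for search-as-you-type
$ curl -s "http://localhost:4000/v1/autocomplete?text=port"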

Dependencies

These are software projects that are not used directly but are used by other components of Pelias.

There are lots of these, but here are some important ones:

  • model: provides a single library for creating documents that fit the Pelias Elasticsearch schema. This is a core component of our flexible importer architecture.
  • wof-admin-lookup: A library for performing administrative lookup using point-in-polygon math. Previously included in each of the importers but now only used by the PIP service.
  • query: This is where most of our actual Elasticsearch query generation happens.
  • config: Pelias is very configurable, and all of it is driven from a single JSON file which we call pelias.json. This package provides a library for reading, validating, and working with this configuration. It is used by almost every other Pelias component; a minimal example follows this list.
  • dbclient: A Node.js stream library for quickly and efficiently importing records into Elasticsearch
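
As a minimal sketch, a pelias.json for a local setup might look something like the following. The exact keys depend on which importers and services you run, so treat the values below as illustrative rather than canonical:

{
  "esclient": {
    "hosts": [{ "host": "localhost", "port": 9200 }]
  },
  "api": {
    "services": {
      "interpolation": { "url": "http://localhost:4300" }
    }
  },
  "imports": {
    "openaddresses": {
      "datapath": "/data/openaddresses",
      "files": [ "us/or/portland_metro.csv" ]
    }
  }
}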

Helpful tools

Finally, while not part of Pelias proper, we have built several useful tools for working with and testing Pelias.

Notable examples include:

  • acceptance-tests: A Node.js command line tool for testing a full planet build of Pelias and ensuring everything works. Familiarity with this tool is very important for ensuring Pelias is working. It supports all Pelias features and has special facilities for testing autocomplete queries.
  • compare: A web-based tool for comparing different instances of Pelias (for example a production and staging environment). We have a reference instance at pelias.github.io/compare/
  • dashboard: Another web-based tool that provides statistics about a Pelias Elasticsearch index, such as import speed, total record count, and a breakdown of records by type.

Documentation

The main documentation lives in the pelias/documentation repository.

Additionally, the README file in each of the component repositories listed above provides more detail on that piece.

Here's an example API response for a reverse geocoding query:
$ curl -s "search.mapzen.com/v1/reverse?size=1&point.lat=40.74358294846026&point.lon=-73.99047374725342&api_key={YOUR_API_KEY}" | json
{
    "geocoding": {
        "attribution": "https://search.mapzen.com/v1/attribution",
        "engine": {
            "author": "Mapzen",
            "name": "Pelias",
            "version": "1.0"
        },
        "query": {
            "boundary.circle.lat": 40.74358294846026,
            "boundary.circle.lon": -73.99047374725342,
            "boundary.circle.radius": 500,
            "point.lat": 40.74358294846026,
            "point.lon": -73.99047374725342,
            "private": false,
            "querySize": 1,
            "size": 1
        },
        "timestamp": 1460736907438,
        "version": "0.1"
    },
    "type": "FeatureCollection",
    "features": [
        {
            "geometry": {
                "coordinates": [
                    -73.99051,
                    40.74361
                ],
                "type": "Point"
            },
            "properties": {
                "borough": "Manhattan",
                "borough_gid": "whosonfirst:borough:421205771",
                "confidence": 0.9,
                "country": "United States",
                "country_a": "USA",
                "country_gid": "whosonfirst:country:85633793",
                "county": "New York County",
                "county_gid": "whosonfirst:county:102081863",
                "distance": 0.004,
                "gid": "geonames:venue:9851011",
                "id": "9851011",
                "label": "Arlington, Manhattan, NY, USA",
                "layer": "venue",
                "locality": "New York",
                "locality_gid": "whosonfirst:locality:85977539",
                "name": "Arlington",
                "neighbourhood": "Flatiron District",
                "neighbourhood_gid": "whosonfirst:neighbourhood:85869245",
                "region": "New York",
                "region_a": "NY",
                "region_gid": "whosonfirst:region:85688543",
                "source": "geonames"
            },
            "type": "Feature"
        }
    ],
    "bbox": [
        -73.99051,
        40.74361,
        -73.99051,
        40.74361
    ]
}

How can I install my own instance of Pelias?

To try out Pelias quickly, use our Docker setup. It uses Docker and docker-compose to allow you to quickly set up a Pelias instance for a small area (by default Portland, Oregon) in under 30 minutes.
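
Assuming Docker and docker-compose are installed, a typical run with the pelias helper script from that repository looks roughly like the following; treat the exact commands as illustrative and check the pelias/docker README for the current steps:

$ git clone https://github.com/pelias/docker.git && cd docker/projects/portland-metro
$ pelias compose pull     # fetch service images
$ pelias elastic start    # start Elasticsearch
$ pelias elastic wait     # wait until it's ready
$ pelias elastic create   # create the Pelias schema
$ pelias download all     # download data for the project area
$ pelias prepare all      # build interpolation/placeholder data
$ pelias import all       # import data into Elasticsearch
$ pelias compose up       # start all Pelias services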

Do you offer a free geocoding API?

You can sign up for a trial API key at Geocode Earth. A commercial service has been operated by the core development team behind Pelias since 2014 (previously at search.mapzen.com). Discounts and free plans are available for free and open-source software projects.

What's it built with?

Pelias itself (the import pipelines and API) is written in Node.js, which makes it highly accessible for other developers and performant under heavy I/O. It aims to be modular and is distributed across a number of Node packages, each with its own repository under the Pelias GitHub organization.

For a select few components that have performance requirements that Node.js cannot meet, we prefer to write things in Go. A good example of this is the pbf2json tool that quickly converts OSM PBF files to JSON for our OSM importer.

Elasticsearch is our datastore of choice because of its unparalleled full text search functionality, scalability, and sufficiently robust geospatial support.

Contributing


We built Pelias as an open source project not just because we believe that users should be able to view and play with the source code of tools they use, but to get the community involved in the project itself.

Especially with a geocoder with global coverage, it's just not possible for a small team to do it alone. We need you.

Anything that we can do to make contributing easier, we want to know about. Feel free to reach out to us via GitHub, Gitter, email, or Twitter. We'd love to help people get started working on Pelias, especially if you're new to open source or programming in general.

We have a list of Good First Issues for new contributors.

Both this meta-repo and the API service repo are worth looking at, as they're where most issues live. We also welcome reporting issues or suggesting improvements to our documentation.

The current Pelias team can be found on GitHub as missinglink and orangejulius.

Members emeritus include:

interpolation's People

Contributors

arne-cl, blackmad, dianashk, greenkeeper[bot], greenkeeperio-bot, joxit, millette, missinglink, orangejulius, tigerlily-he, trescube, vicchi


interpolation's Issues

don't store problematic house numbers in db

when we see house numbers containing ranges or suffixes that we don't understand, we should drop them from the index instead of adding them in some sort of corrupted/distorted form.

e.g. 1-1, 1 flat b, or 1/2
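
a minimal sketch of such a guard, run before inserting a row (the function name and the exact patterns here are illustrative, not the repository's actual API):

// skip house numbers we can't represent faithfully rather than
// storing a corrupted/distorted form
function isProblematicHousenumber(input) {
  var str = String(input).trim();
  if (/\d\s*-\s*\d/.test(str)) { return true; }          // ranges, eg. 1-1
  if (/\b(flat|apt|unit)\b/i.test(str)) { return true; } // eg. 1 flat b
  if (/\d\s*\/\s*\d/.test(str)) { return true; }         // eg. 1/2
  return false;
}

// usage inside an import stream:
// if (isProblematicHousenumber(row.NUMBER)) { return next(); }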

trim address database by discarding some redundant vertices

e.g. in the following example, there is likely no need to interpolate between housenumbers 10 and 11:

├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤
│ 1     │ 1  │ OA     │ 10          │ -41.2882585 │ 174.7670996 │ L      │ -41.2882734 │ 174.767233  │
├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤
│ 19    │ 1  │ VERTEX │ 10.206      │             │             │        │ -41.288228  │ 174.767242  │
├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤
│ 18    │ 1  │ VERTEX │ 10.362      │             │             │        │ -41.288304  │ 174.767227  │
├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤
│ 20    │ 1  │ VERTEX │ 10.467      │             │             │        │ -41.28817   │ 174.767242  │
├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤
│ 21    │ 1  │ VERTEX │ 10.62       │             │             │        │ -41.288136  │ 174.767242  │
├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤
│ 2     │ 1  │ OA     │ 11          │ -41.2880114 │ 174.7674035 │ R      │ -41.2880549 │ 174.7672097 │
├───────┼────┼────────┼─────────────┼─────────────┼─────────────┼────────┼─────────────┼─────────────┤

there is also the possibility of removing vertices which are a short distance from one another, or which differ in bearing by less than a certain angle.
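
a rough sketch of that pruning pass, assuming vertices are already ordered along the street; the distance (metres) and angle (radians) thresholds are illustrative:

// approximate planar distance in metres between two lat/lon points
function dist(a, b) {
  var dLat = (b.lat - a.lat) * 111320;
  var dLon = (b.lon - a.lon) * 111320 * Math.cos(a.lat * Math.PI / 180);
  return Math.sqrt(dLat * dLat + dLon * dLon);
}

// absolute bearing change at b between segments a->b and b->c
function turn(a, b, c) {
  var t1 = Math.atan2(b.lat - a.lat, b.lon - a.lon);
  var t2 = Math.atan2(c.lat - b.lat, c.lon - b.lon);
  var d = Math.abs(t2 - t1);
  return d > Math.PI ? 2 * Math.PI - d : d;
}

// drop interior vertices that sit very close to the previous kept
// vertex, or that barely change the line's direction
function prune(vertices, minDist, minTurn) {
  var kept = [vertices[0]];
  for (var i = 1; i < vertices.length - 1; i++) {
    var prev = kept[kept.length - 1];
    if (dist(prev, vertices[i]) < minDist ||
        turn(prev, vertices[i], vertices[i + 1]) < minTurn) { continue; }
    kept.push(vertices[i]);
  }
  kept.push(vertices[vertices.length - 1]);
  return kept;
}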

An in-range update of tape is breaking the build 🚨

Version 4.9.1 of tape was just published.

Branch Build failing 🚨
Dependency tape
Current Version 4.9.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

tape is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • ci/circleci Your tests passed on CircleCI! Details
  • continuous-integration/travis-ci/push The Travis CI build could not complete due to an error Details

Commits

The new version differs by 10 commits.

  • 050b318 v4.9.1
  • 73232c0 [Dev Deps] update js-yaml
  • 8a2d29b [Deps] update has, for-each, resolve, object-inspect
  • c6f5313 [Tests] add eclint and eslint, to enforce a consistent style
  • 45788a5 [Dev Deps] update concat-stream
  • ec4a71d [fix] Fix bug in functionName regex during stack parsing
  • 7261ccc Merge pull request #433 from mcnuttandrew/add-trb
  • 6cbc53e Add tap-react-browser
  • 9d501ff [Dev Deps] use ~ for dev deps; update to latest nonbreaking
  • 24e0a8d Fix spelling of "parameterize"

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

realloc failed during char_array_push

I get the error

> ./interpolate polyline street.db < road_network.polylines
realloc failed during char_array_push

I'm using the road_network.polylines file linked from the Polylines README. This error occurred 15 hours after starting the command. I'm using the production branch of the interpolation repository.

continued... house number formats

from osm:

  • 1A,1B,1C,1D,1E,1F,1G,1H,1J,1K,1L,1M,1N,1P,2A,2B,2C,2D,2E,2F,2G,2H,2J
  • 95-A1

non-ascii chars?

could not reliably parse housenumber 28А
could not reliably parse housenumber 28Б
could not reliably parse housenumber 28В
could not reliably parse housenumber 28Г
"А".codePointAt(0)
1040
"A".codePointAt(0)
65
  • 9-RD
  • 158號
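
the 28А…28Г examples above fail because those suffixes are Cyrillic letters, not Latin (Cyrillic 'А' is code point 1040 while Latin 'A' is 65, as the snippet shows). a minimal sketch of a pre-parse transliteration step, with an illustrative mapping that would need review per country:

// map Cyrillic suffix letters to Latin equivalents before parsing
var CYRILLIC_TO_LATIN = { 'А': 'A', 'Б': 'B', 'В': 'V', 'Г': 'G' };

function normalizeSuffix(housenumber) {
  return String(housenumber).replace(/[АБВГ]/g, function (ch) {
    return CYRILLIC_TO_LATIN[ch];
  });
}

// normalizeSuffix('28А') === '28A'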

investigate - willow ave, hoboken

Search for this address: 1601 Willow Avenue, Hoboken, NJ and then click on the Willow Avenue. You can see at either end of the street there are far reaching offshoots.

(screenshot: far-reaching offshoots at either end of Willow Avenue)

use openaddresses unit numbers where available

some OA extracts have data in the UNIT column which we can use:

(this won't have a huge impact, so low priority for now).

LON,LAT,NUMBER,STREET,UNIT,CITY,DISTRICT,REGION,POSTCODE,ID,HASH
-122.3897574,37.7898033,75,Folsom Street,901,,,,94105,,cb3f802c0fbb6877
-122.3897574,37.7898033,75,Folsom Street,900,,,,94105,,c82b7876173c6e7f
-122.3897574,37.7898033,75,Folsom Street,809,,,,94105,,6f7a0839e626d6c6
-122.3897574,37.7898033,75,Folsom Street,808,,,,94105,,17074625c0e7ab25
-122.3897574,37.7898033,75,Folsom Street,807,,,,94105,,8a7a1894fe1f6a9c
-122.3897574,37.7898033,75,Folsom Street,806,,,,94105,,208c035cb55732f2
-122.3897574,37.7898033,75,Folsom Street,805,,,,94105,,c017139fa68a3c90
-122.3897574,37.7898033,75,Folsom Street,804,,,,94105,,9c96acf7427552c9
-122.3897574,37.7898033,75,Folsom Street,803,,,,94105,,d707e4e7dc63e8d2
-122.3897574,37.7898033,75,Folsom Street,802,,,,94105,,9af132f8d5db3c1f
-122.3897574,37.7898033,75,Folsom Street,801,,,,94105,,1607be579cbffce1
-122.3897574,37.7898033,75,Folsom Street,800,,,,94105,,63f931642eda497a

go live checklist

  • create a web service interface
  • create demo
  • dockerize builds
  • dockerize server
  • improved unit test coverage
  • refactor to reduce individual module size
  • readme + wikis
  • move automated build process to a production server (pelias/pelias#468)
  • write/update blog post

An in-range update of serve-index is breaking the build 🚨

Version 1.9.0 of serve-index just got published.

Branch Build failing 🚨
Dependency serve-index
Current Version 1.8.0
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

serve-index is a direct dependency of this project, so this is very likely breaking your project right now. If other packages depend on you, it's very likely also breaking them.
I recommend you give this issue a very high priority. I’m sure you can resolve this 💪

Status Details
  • continuous-integration/travis-ci/push The Travis CI build failed Details

Release Notes 1.9.0
  • Set X-Content-Type-Options: nosniff header
  • deps: [email protected]
  • deps: [email protected]
    • Allow colors in workers
    • Deprecated DEBUG_FD environment variable set to 3 or higher
    • Fix DEBUG_MAX_ARRAY_LENGTH
    • Fix error when running under React Native
    • Use same color for same namespace
    • deps: [email protected]
  • deps: http-errors@~1.6.1
  • deps: mime-types@~2.1.15
    • Add new mime types
    • Add audio/mp3
Commits

The new version differs by 35 commits.

There are 35 commits in total.

See the full diff

Not sure how things should work exactly?

There is a collection of frequently asked questions and of course you may always ask my humans.


Your Greenkeeper Bot 🌴

improve awareness of apartments/unit numbers

the current house number algorithm does not account for apartments and units.

we need to improve this so it handles cases such as 1/12 (apt 1, no. 12), which is currently being interpreted as parseFloat('1/12') == 1.
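
a sketch of a stricter parse that splits the unit from the primary number instead of letting parseFloat truncate it (names are illustrative):

// '1/12' -> { unit: '1', housenumber: 12 }, rather than parseFloat('1/12') == 1
function parseHousenumber(input) {
  var m = String(input).trim().match(/^(\d+)\s*\/\s*(\d+)$/);
  if (m) { return { unit: m[1], housenumber: parseInt(m[2], 10) }; }
  var n = parseFloat(input);
  return isNaN(n) ? null : { unit: null, housenumber: n };
}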

`interpolate oa` stops with no addresses imported

I downloaded all US addresses from OpenAddresses, unzipped the four files, and then appended them into a single csv file with cat us/**/*.csv > all.csv. But then

./interpolate oa address.db street.db < all.csv

on master gave output of

> ./interpolate oa address.db street.db < all.csv
0       0/sec
0       0/sec
0       0/sec
0       0/sec

and on production gave output of

> ./interpolate oa address.db street.db < all.csv
0       0/sec

The address.db file is only 12KB, so not much happened.

nearest street - max distance

currently the nearest street API does not have a max distance value, so it will return all streets whose MBR intersects the input point.

this can result in very large, yet distant roads being returned, such as in this example: http://interpolation.wiz.co.nz/street/near/geojson?lat=40.74429825919934&lon=-73.9896583557129 where the Brooklyn-Queens Expressway is returned.

adding a max_distance parameter would decrease the amount of results returned and, as a result, would speed up serving and parsing.
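
for example, the call above might become something like this, where max_distance is the proposed (not yet implemented) parameter, assumed here to be in metres:

$ curl -s "http://interpolation.wiz.co.nz/street/near/geojson?lat=40.74429825919934&lon=-73.9896583557129&max_distance=500"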

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml
  • The new Node.js version is in-range for the engines in 1 of your package.json files, so that was left alone

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Detect OS and use appropriate PBF2JSON build

In the build script, the following line should be changed to use the build that corresponds to the host operating system (a sketch follows below).

PBF2JSON_BIN=${PBF2JSON_BIN:-"$DIR/../node_modules/pbf2json/build/pbf2json.linux-x64"};

See #72 for original comment
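
a sketch of one way to do that, assuming pbf2json ships per-platform builds under the names used below:

# pick the pbf2json build matching the host platform
case "$(uname -s)" in
  Darwin) PBF2JSON_PLATFORM="darwin-x64" ;;
  Linux)  PBF2JSON_PLATFORM="linux-x64" ;;
  *)      echo "unsupported OS: $(uname -s)" >&2; exit 1 ;;
esac
PBF2JSON_BIN=${PBF2JSON_BIN:-"$DIR/../node_modules/pbf2json/build/pbf2json.$PBF2JSON_PLATFORM"};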

"Error: invalid names array" when creating database from polylines

I got an "invalid names array" error when making the street.db file from us-midwest-latest.polylines, which I created with pbf streets us-midwest-latest.osm.pbf > us-midwest-latest.polylines from an extract downloaded from the Geofabrik mirror.

Is this most likely an issue with the original .osm.pbf file, with the pbf conversion, or with the interpolation script?

> ./interpolate polyline street.db < us-midwest-latest.polylines
/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/lib/Street.js:24
  if( !Array.isArray( names ) || !names.length ){ throw new Error( 'invalid names array' ); }
                                                  ^

Error: invalid names array
    at Street.setNames (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/lib/Street.js:24:57)
    at DestroyableTransform._transform (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/stream/street/augment.js:25:12)
    at DestroyableTransform.Transform._read (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:184:10)
    at DestroyableTransform.Transform._write (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:172:83)
    at doWrite (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:418:64)
    at writeOrBuffer (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:407:5)
    at DestroyableTransform.Writable.write (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:334:11)
    at DestroyableTransform.ondata (/disk/agedisk1/medicare.work/doyle-DUA18266/barronk/npi-geocode/lib/interpolation/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:619:20)
    at emitOne (events.js:116:13)
    at DestroyableTransform.emit (events.js:211:7)

Timeout of 250ms exceeded

From the API logs:
[libpostal] http://localhost:8080/parse?address=%D7%9E%D7%9B%D7%9C%D7%9C%D7%AA%20%D7%A4%D7%A8%D7%A1: {"timeout":250,"code":"ECONNABORTED","errno":"ETIME","retries":3} error: [api] [middleware:interpolation] Timeout of 250ms exceeded

Both libpostal and interpolation are running in containers:

  • libpostal > pelias/go-whosonfirst-libpostal
  • interpolation > pelias/interpolation

docker stats shows that CPU usage is ~20% max and MEM usage is ~4GB / 32GB.
NET I/O is 59M / 2M, but I'm not sure about that figure since it stays this way even when I'm not making calls to the API.

How to support interpolation for Berlin/Germany?

Dear Pelias maintainers,

I use the dockerfiles repository and would like to set up a local instance of Pelias with data for Berlin, Germany. I've already looked for ways to adjust the data paths and files in the config file called pelias.json.
The interpolation part for Portland, Oregon is the following:

"interpolation": {
  "download": {
    "tiger": {
      "datapath": "/data/tiger",
      "states": [
        {
          "state_code": 41
        }
      ]
    }
  }
}

Until now I've found no way to configure this part for Berlin.

But when I executed the following command from your readme, I got this error message (I also tried using sudo):

$ curl -s http://missinglink.files.s3.amazonaws.com/berlin.gz | gzip -d > /tmp/data/berlin.0sv
bash: /tmp/data/berlin.0sv: Permission denied

Do you know a way to configure the interpolation for Berlin? I would be very happy if anyone could help me.
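
For a region without TIGER coverage, such as Berlin, the usual route is to build the interpolation databases directly from OSM street geometries and OpenAddresses data rather than through the tiger download config. Roughly, following the commands used elsewhere in this repository (file names and paths are illustrative):

$ # extract street geometries from an OSM extract
$ pbf streets berlin-latest.osm.pbf > berlin.polylines
$ # import the street geometries
$ ./interpolate polyline street.db < berlin.polylines
$ # conflate OpenAddresses data in to address.db using street.db
$ ./interpolate oa address.db street.db < de/berlin.csv
$ # compute fractional house numbers for line vertices
$ ./interpolate vertices address.db street.db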

investigate - leipziger strasse, berlin

Leipziger Strasse - Berlin

The L/R parity of the addresses at the right of the image seems to be upset where the road splits.

Also, the addresses on the northern side of the street seem to be projecting onto a different line string than the one pictured; this is technically correct, but needs more investigation.

(image: parity debug view)

rowid   id  source  housenumber lat lon parity  proj_lat    proj_lon
150807908   2201666 OA  2   52.5095987  13.3801595  L   52.5097367  13.3801364
150807909   2201666 OA  3   52.5096255  13.38059    L   52.5097624  13.3805699
150807910   2201666 OA  4   52.5097041  13.3818573  L   52.5098407  13.3818311
150807998   2201666 VERTEX  5.51                52.509914   13.382863
150807997   2201666 VERTEX  6.298               52.509952   13.383401
150807996   2201666 VERTEX  6.527               52.509963   13.383558
150807994   2201666 VERTEX  7.491               52.510005   13.384217
150807993   2201666 VERTEX  7.925               52.510021   13.384515
150807911   2201666 OA  8   52.5098709  13.3845892  L   52.5100239  13.384566
150807912   2201666 OA  9   52.5098928  13.3850131  L   52.5100476  13.3849897
150807992   2201666 VERTEX  11.584              52.51007    13.385389
150807913   2201666 OA  12  52.5099245  13.3854778  L   52.5100739  13.3854532
150807914   2201666 OA  13  52.5099389  13.3856999  L   52.5100873  13.3856762
150807915   2201666 OA  16  52.5100065  13.387263   L   52.5101797  13.3872353
150807990   2201666 VERTEX  16.377              52.510189   13.387392
150807916   2201666 OA  20  52.5100435  13.3889339  L   52.5102726  13.3888996
150807917   2201666 OA  21  52.5101295  13.3891379  L   52.5102846  13.3891141
150807918   2201666 OA  26  52.5101644  13.3893705  L   52.510298   13.38935
150807988   2201666 VERTEX  26.564              52.510307   13.389508
150807986   2201666 VERTEX  27.972              52.510334   13.389902
150807984   2201666 VERTEX  29.419              52.510356   13.390308
150807919   2201666 OA  30  52.5101643  13.3905073  L   52.5103669  13.3904706
150807920   2201666 OA  31  52.51019    13.3909241  L   52.5103922  13.3908918
150807921   2201666 OA  32  52.51023    13.3912565  L   52.5104121  13.3912274
150807922   2201666 OA  33  52.5102607  13.3914482  L   52.5104237  13.3914221
150807981   2201666 VERTEX  34.73               52.510448   13.391833
150807980   2201666 VERTEX  35.722              52.510459   13.392069
150807923   2201666 OA  36  52.5103374  13.3921994  L   52.5104708  13.3921324
150807979   2201666 VERTEX  36.086              52.510475   13.392155
150807978   2201666 VERTEX  36.309              52.510486   13.392214
150807977   2201666 VERTEX  36.585              52.510513   13.392276
150807976   2201666 VERTEX  36.882              52.51054    13.392345
150807975   2201666 VERTEX  37.163              52.510562   13.392414
150807974   2201666 VERTEX  37.427              52.510574   13.392484
150807924   2201666 OA  39  52.5103787  13.3929465  L   52.5105956  13.3929174
150807925   2201666 OA  40  52.5103299  13.393993   L   52.5106524  13.3939489
150807926   2201666 OA  41  52.5099867  13.3942683  L   52.5106596  13.3940738
150807927   2201666 OA  42  52.5104712  13.3947187  L   52.5107209  13.3946465
150807928   2201666 OA  43  52.5104445  13.3956765  L   52.5107818  13.3956332
150807929   2201666 OA  44  52.5101065  13.3959325  L   52.510791   13.395827
150807930   2201666 OA  45  52.5105686  13.3963765  L   52.5108192  13.3963463
150807931   2201666 OA  46  52.5105631  13.3973336  L   52.510873   13.3972849
150807932   2201666 OA  47  52.5101975  13.397633   L   52.510887   13.3975247
150807933   2201666 OA  48  52.5106972  13.3999321  L   52.5110253  13.3998569
150807934   2201666 OA  49  52.510336   13.4002175  L   52.5110421  13.4000556
150807935   2201666 OA  50  52.5108628  13.4008355  L   52.5111123  13.4007562
150807936   2201666 OA  51  52.5109335  13.4018256  L   52.5112776  13.4015637
150807937   2201850 OA  54  52.5116085  13.3999422  L   52.5111946  13.4000466
150807938   2201850 OA  55  52.5113904  13.3990027  L   52.5111069  13.3990501
150807939   2201850 OA  56  52.511342   13.3982004  L   52.5110573  13.3982496
150807940   2201850 OA  57  52.5114081  13.3974673  L   52.5110117  13.3975358
150807941   2201850 OA  58  52.5114242  13.3972 L   52.5109949  13.3972742
150807942   2201850 OA  59  52.5114337  13.3969445  L   52.5109788  13.3970231
150807943   2201850 OA  60  52.5112057  13.3959759  L   52.5109207  13.3960183
150807944   2201850 OA  61  52.5111502  13.3951771  L   52.510874   13.3952215
150807945   2201850 OA  62  52.5111303  13.3944697  L   52.5108321  13.3945177
150807946   2201850 OA  63  52.511147   13.3941998  L   52.5108172  13.3942458
150807947   2201850 OA  64  52.5111643  13.3939305  L   52.5108048  13.3939956
150807948   2201850 OA  65  52.5110235  13.3931033  L   52.5107505  13.3931577
150807949   2201850 OA  66  52.5109746  13.3923047  L   52.5106852  13.392506
150807950   2201666 OA  96  52.510552   13.3913378  R   52.5104199  13.3913589
150807951   2201666 OA  100 52.5105703  13.3909781  R   52.510399   13.3910055
150807982   2201666 VERTEX  100.279             52.510391   13.390871
150807983   2201666 VERTEX  100.86              52.510375   13.390591
150807985   2201666 VERTEX  101.448             52.510356   13.390308
150807987   2201666 VERTEX  102.289             52.510334   13.389902
150807952   2201666 OA  103 52.5104507  13.389534   R   52.5103106  13.3895599
150807953   2201666 OA  104 52.510429   13.3891755  R   52.5102893  13.389197
150807989   2201666 VERTEX  105.71              52.51028    13.389034
150807954   2201666 OA  111 52.5104283  13.3885032  R   52.510252   13.3885296
150807955   2201666 OA  112 52.510312   13.3876676  R   52.5102052  13.3876836
150807991   2201666 VERTEX  112.503             52.510189   13.387392
150807956   2201666 OA  114 52.5103984  13.3864832  R   52.5101376  13.3865249
150807957   2201666 OA  115 52.5103839  13.3861075  R   52.5101154  13.3861505
150807958   2201666 OA  116 52.5103812  13.3858543  R   52.5101005  13.3858992
150807959   2201666 OA  117 52.5103632  13.3855512  R   52.5100825  13.3855961
150807960   2201666 OA  118 52.5103438  13.3852243  R   52.5100631  13.3852668
150807961   2201666 OA  119 52.5103076  13.3849793  R   52.5100492  13.3850184
150807962   2201666 OA  120 52.5102866  13.3846891  R   52.5100329  13.3847275
150807963   2201666 OA  121 52.5101319  13.3844324  R   52.5100175  13.384449
150807995   2201666 VERTEX  121.782             52.510005   13.384217
150807964   2201666 OA  124 52.5100706  13.3835413  R   52.5099631  13.3835598
150807965   2201666 OA  125 52.5100571  13.3833234  R   52.509948   13.3833442
150807966   2201666 OA  126 52.5100483  13.3830408  R   52.5099282  13.3830637
150807967   2201666 OA  127 52.5100186  13.3825624  R   52.5098943  13.3825862
150807968   2201666 OA  128 52.5099984  13.3822367  R   52.5098712  13.3822611
150807969   2201666 OA  129 52.5099587  13.3817376  R   52.5098357  13.3817612
150807970   2201666 OA  130 52.5099292  13.3812625  R   52.5098035  13.3812842
150807971   2201666 OA  131 52.5099174  13.3810728  R   52.5097919  13.381093
150807972   2201666 OA  132 52.5099027  13.3808357  R   52.5097778  13.380854
150807973   2201666 OA  133 52.5098588  13.3801295  R   52.5097376  13.3801498

import runs which result in 0 rows

it's possible that importers yield 0 rows (for tests and smaller extracts).

it seems like the vertices action hangs if the address table contains 0 rows, also there are errors produced from trying to close prepared statements that have never been run.

add tests for empty import files and confirm that all scripts still run and exit without error.

Floating point precision may be causing test failures

https://travis-ci.org/pelias/interpolation/jobs/226891165

 ✖ should be equivalent

    -----------------------

      operator: deepEqual

      expected: |-

        [ '5|1|TIGER|208773790|1.0|46.534084|-110.908557|L|46.5340609148833|-110.908556832798', '6|1|TIGER|208773790|99.0|46.534063|-110.910854|L|46.5340530413783|-110.910853927871', '3|1|TIGER|263702490|100.0|46.533874|-110.934964|R|46.5338750002214|-110.934964011328', '4|1|TIGER|208773717|101.0|46.533874|-110.934964|R|46.5338750002214|-110.934964011328', '8|1|VERTEX||117.653||||46.533863|-110.937203', '7|1|VERTEX||125.049||||46.533851|-110.938141', '1|1|TIGER|263702490|198.0|46.53376|-110.94752|R|46.5337752241147|-110.947520259902', '2|1|TIGER|208773717|199.0|46.53376|-110.94752|R|46.5337752241147|-110.947520259902' ]

      actual: |-

        [ '5|1|TIGER|208773790|1.0|46.534084|-110.90856|L|46.5340609046008|-110.908559832723', '6|1|TIGER|208773790|99.0|46.53406|-110.91085|L|46.5340530550139|-110.910849949699', '3|1|TIGER|263702490|100.0|46.53387|-110.93496|R|46.5338750214159|-110.93496005687', '4|1|TIGER|208773717|101.0|46.53387|-110.93496|R|46.5338750214159|-110.93496005687', '8|1|VERTEX||117.678||||46.533863|-110.937203', '7|1|VERTEX||125.072||||46.533851|-110.938141', '1|1|TIGER|263702490|198.0|46.53376|-110.94752|R|46.5337752241147|-110.947520259902', '2|1|TIGER|208773717|199.0|46.53376|-110.94752|R|46.5337752241147|-110.947520259902' ]

      at: Test.<anonymous> (/home/travis/build/pelias/interpolation/test/functional/cemetery_rd/run.js:107:7)

If we can help fix this with some tasteful rounding, that would be great.
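
one sketch of such rounding in the test harness, normalizing high-precision coordinate fields before the deepEqual comparison (the pipe-delimited row format follows the output above; six decimals is an illustrative choice, roughly 0.1m of latitude):

// round any field with 7+ decimal places to 6 before comparing,
// so sub-centimetre floating point noise doesn't fail the test
function roundRow(row) {
  return row.split('|').map(function (field) {
    return /^-?\d+\.\d{7,}$/.test(field)
      ? parseFloat(field).toFixed(6)
      : field;
  }).join('|');
}

// t.deepEqual(actual.map(roundRow), expected.map(roundRow));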

Docker Container fails to start

I'm running some load testing on the pelias api docker and encountering these errors:

root@pelias-api:/pelias/dockerfiles# ./run_services.sh
Stopping pelias_api ... done
Stopping pelias_libpostal ... done
Stopping pelias_pip-service ... done
Stopping pelias_interpolation ... done
Removing pelias_libpostal_baseimage ... done
Removing pelias_libpostal ... done
Removing pelias_pip-service ... done
Removing pelias_interpolation ... done
Removing network dockerfiles_default
Removing network dockerfiles_pelias
ERROR: network dockerfiles_pelias has active endpoints
Creating network "dockerfiles_default" with the default driver
Creating pelias_interpolation

ERROR: for interpolation  UnixHTTPConnectionPool(host='localhost', port=None): Read timed out. (read timeout=60)
Traceback (most recent call last):
  File "/usr/bin/docker-compose", line 9, in <module>
    load_entry_point('docker-compose==1.8.0', 'console_scripts', 'docker-compose')()
  File "/usr/lib/python2.7/dist-packages/compose/cli/main.py", line 61, in main
    command()
  File "/usr/lib/python2.7/dist-packages/compose/cli/main.py", line 113, in perform_command
    handler(command, command_options)
  File "/usr/lib/python2.7/contextlib.py", line 35, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/lib/python2.7/dist-packages/compose/cli/errors.py", line 56, in handle_connection_errors
    log_timeout_error()
TypeError: log_timeout_error() takes exactly 1 argument (0 given)

This happens after the load testing, which usually causes even docker ps to hang; that requires me to restart the Docker service.

Rebuild DB error

Hi there,

I'm trying to rebuild the database for interpolation and it's giving me this:

[root@pelias dockerfiles]# docker-compose run interpolation bash ./docker_build.sh
Starting pelias_libpostal_baseimage ... done

> [email protected] build /code/pelias/interpolation
> ./script/build.sh

- importing polylines

npm ERR! Linux 3.10.0-229.1.2.el7.x86_64
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "run" "build"
npm ERR! node v6.13.0
npm ERR! npm  v3.10.10
npm ERR! code ELIFECYCLE
npm ERR! [email protected] build: `./script/build.sh`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] build script './script/build.sh'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the pelias-interpolation package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR!     ./script/build.sh
npm ERR! You can get information on how to open an issue for this project with:
npm ERR!     npm bugs pelias-interpolation
npm ERR! Or if that isn't available, you can get their info via:
npm ERR!     npm owner ls pelias-interpolation
npm ERR! There is likely additional logging output above.

npm ERR! Please include the following file with any support request:
npm ERR!     /code/pelias/interpolation/npm-debug.log

Also, if I try running

[root@pelias dockerfiles]# docker-compose run interpolation bash ./interpolate osm "/opt/pelias/interpolation/address.db" "/opt/pelias/interpolation/street.db" < /opt/pbf2json/osm_data.json
Starting pelias_libpostal_baseimage ... done
events.js:160
      throw er; // Unhandled 'error' event
      ^

Error: SQLITE_CANTOPEN: unable to open database file
    at Error (native)
read unix @->/var/run/docker.sock: read: connection reset by peer

Can you help?

Thank you,
Bogdan

An in-range update of tape is breaking the build 🚨

Version 4.9.0 of tape was just published.

Branch Build failing 🚨
Dependency tape
Current Version 4.8.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

tape is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • ci/circleci Your tests passed on CircleCI! Details
  • continuous-integration/travis-ci/push The Travis CI build failed Details

Commits

The new version differs by 27 commits.

  • ea6d91e v4.9.0
  • 6867840 [Deps] update object-inspect, resolve
  • 4919e40 [Tests] on node v9; use nvm install-latest-npm
  • f26375c Merge pull request #420 from inadarei/global-depth-env-var
  • 17276d7 [New] use process.env.NODE_TAPE_OBJECT_PRINT_DEPTH for the default object print depth.
  • 0e870c6 Merge pull request #408 from johnhenry/feature/on-failure
  • 00aa133 Add "onFinish" listener to test harness.
  • 0e68b2d [Dev Deps] update js-yaml
  • 10b7dcd [Fix] fix stack where actual is falsy
  • 13173a5 Merge pull request #402 from nhamer/stack_strip
  • f90e487 normalize path separators in stacks
  • b66f8f8 [Deps] update function-bind
  • cc69501 Merge pull request #387 from fongandrew/master
  • bf5a750 Handle spaces in path name for setting file, line no
  • 3c2087a Test name with spaces

There are 27 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Latest docker image [v0.3.4] is not compatible with Pelias Dockerfiles script

The command docker-compose run --rm interpolation bash ./docker_build.sh & of https://github.com/pelias/dockerfiles/blob/master/prep_data.sh only prints the usage message below instead of running.

Usage: interpolate [command] [options]
  Note: you will need to pipe data in to the import/conflate commands

   help                                                                      output usage information
   search [address_db] [street_db] [lat] [lon] [house_number] [street_name]  search database for specified housenumber + street
   polyline [street_db]                                                      import polyline data in to [street_db]
   oa [address_db] [street_db]                                               conflate oa csv file in to [address_db] using [street_db]
   osm [address_db] [street_db]                                              conflate osm file in to [address_db] using [street_db]
   tiger [address_db] [street_db]                                            conflate tiger address range geojson file in to [address_db] using [street_db]
   vertices [address_db] [street_db]                                         compute fractional house numbers for line vertices
   extract [address_db] [street_db] [lat] [lon] [street_name]                extract street address data for debugging purposes
   server [address_db] [street_db]                                           start a web server

   build 

With ENTRYPOINT set to ./interpolate, the arguments bash ./docker_build.sh are handed to ./interpolate as an unknown subcommand, which is why it prints usage. So, in order to fix it, I had to revert the Dockerfile of this repository, from

# entrypoint
ENTRYPOINT [ "./interpolate" ]
CMD [ "server", "/data/interpolation/address.db", "/data/interpolation/street.db" ]

to

CMD [ "./interpolate","server", "/data/interpolation/address.db", "/data/interpolation/street.db" ]

then build the image locally. After that, the command works nicely.

This is just a workaround; there may be a better way.

where the streets have no name

some address data contain the housenumber but no street name, e.g.:
http://www.openstreetmap.org/way/182944167#map=19/51.46270/-0.16260

this can be fairly common in OSM because copy->pasting the street name for each building is tedious.

MattA:

originally, the intention (to avoid people having to repeatedly type / copy-paste the street name) was that the nearest street could be used by default

the wiki page (http://wiki.openstreetmap.org/wiki/Proposed_features/House_numbers/Karlsruhe_Schema#Tags) says " If not given a program may assume the name of the nearest street it can find, but this is not easy or fast to do in all cases (especially at intersections), so putting the name in here is strongly encouraged (more reliable)."

which is a nice way of saying that not including the street name was a Bad Idea
and yet, the data is there

... these addresses are not currently imported in to Pelias but they could potentially be imported in to the interpolation database.

guessing the correct street at intersections will be difficult-to-impossible and selecting the wrong one could throw off interpolation ranges for that street.

it would be a good idea to adopt a more conservative approach to matching: for example, if two projections are at similar distances, or there is no projection within a set distance, then we abort and ignore that record; for less problematic cases (such as the one linked above) we can probably guess the street without too much issue (a sketch follows below).
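
a sketch of that conservative rule, assuming we already have candidate streets with projection distances computed (thresholds and names are illustrative):

// accept the nearest street only when the match is unambiguous:
// it must lie within maxDist, and the runner-up must be clearly worse
function chooseStreet(candidates, maxDist, minRatio) {
  var sorted = candidates.slice().sort(function (a, b) {
    return a.projDist - b.projDist;
  });
  if (!sorted.length || sorted[0].projDist > maxDist) { return null; }
  if (sorted.length > 1 && sorted[1].projDist < sorted[0].projDist * minRatio) {
    return null; // ambiguous, likely an intersection; skip the record
  }
  return sorted[0];
}

// e.g. chooseStreet(cands, 25, 2.0): nearest within 25m and
// at least 2x closer than the second-nearest candidate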

Alert "Database successfully built" or throw error

There is no indication of whether address.db and street.db were actually populated when ./interpolate polyline street.db < /data/new_zealand.polylines runs. If the database is empty, a warning or error saying nothing was imported would be helpful.
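
In the meantime, a quick manual check is to count rows in the resulting SQLite databases (table names per the schema dumps in other issues here):

$ sqlite3 street.db 'SELECT COUNT(*) FROM polyline;'
$ sqlite3 address.db 'SELECT COUNT(*) FROM address;'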

An in-range update of semantic-release is breaking the build 🚨

Version 6.3.6 of semantic-release just got published.

Branch Build failing 🚨
Dependency semantic-release
Current Version 6.3.5
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

As semantic-release is “only” a devDependency of this project it might not break production or downstream projects, but “only” your build or test tools – preventing new deploys or publishes.

I recommend you give this issue a high priority. I’m sure you can resolve this 💪

Status Details
  • continuous-integration/travis-ci/push The Travis CI build failed Details

Release Notes v6.3.6


6.3.6 (2017-01-13)

Bug Fixes

  • package:
    • update github to version 8.0.0 (afd6ab75)
    • update nopt to version 4.0.0 (b10c8910)
Commits

The new version differs by 8 commits.

  • af90de9 Merge pull request #355 from semantic-release/dep-updates
  • aa012b3 Merge branch 'greenkeeper/tap-9.0.0' into dep-updates
  • 5724b19 Merge branch 'greenkeeper/github-8.0.0' into dep-updates
  • 97a8e73 Merge branch 'caribou' into greenkeeper/nopt-4.0.0
  • 51e12a3 docs(package): fix 'home' link
  • afd6ab7 fix(package): update github to version 8.0.0
  • f80e056 chore(package): update tap to version 9.0.0
  • b10c891 fix(package): update nopt to version 4.0.0


See the full diff

Not sure how things should work exactly?

There is a collection of frequently asked questions and of course you may always ask my humans.


Your Greenkeeper Bot 🌴

Use libpostal service

Similar to pelias/api#1060, we want to modify this service to call a separate HTTP service for libpostal, rather than loading the 2GB of libpostal data in memory.

This would drastically increase efficiency as we could launch multiple interpolation service instances to scale without paying the cost of loading multiple copies of libpostal.

Since a full planet interpolation dataset is around 40GB, we can take advantage of the Node.js cluster module to allow several separate Node.js processes to use the same copy of the data. This would behave similarly to how we use it with the placeholder service, as mentioned in pelias/placeholder#72.
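
A minimal sketch of that cluster usage (the server body is illustrative): the master forks one worker per CPU, the workers share a single listening port, and the OS page cache lets them share the memory-mapped SQLite data rather than each loading a private copy:

var cluster = require('cluster');
var os = require('os');
var http = require('http');

if (cluster.isMaster) {
  // one worker per CPU; the kernel balances incoming connections
  os.cpus().forEach(function () { cluster.fork(); });
  cluster.on('exit', function () { cluster.fork(); }); // respawn on crash
} else {
  http.createServer(function (req, res) {
    res.end('interpolation worker ' + process.pid + '\n');
  }).listen(4300);
}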

Out of memory when building databases

Hello,

I am trying to build street.db and address.db, but I am running out of RAM on the very first step. This is what I am running:

#!/bin/bash
set -euo pipefail

export BUILDDIR=...
export POLYLINE_FILE=.../planet.polylines

./script/import.sh

The process runs for a long time while RAM usage steadily increases, and it eventually errors out. The machine has 236GB of RAM. The exact error was something about memalloc on an array push, but I don't have the exact message now, as I've restarted the process and the error log is empty. The process has now been running for ~7 hours and is using 87GB of RAM, steadily increasing. I am sure it will run out of RAM again.

I am using the production branch.

The polylines file was created using
pbf streets -f polyline planet-latest.osm.pbf > planet.polylines

Any ideas what may be the issue here?

Enable interpolation on autocomplete

We would like to contribute to enabling interpolation for autocomplete. This is a big step toward resolving issues with the use of Pelias in our product. As discussed with @orangejulius on Gitter, I'm opening this issue to discuss how we could help implement it and how it should be done.

export a list of oa lines which do not conflate to osm

export a list of openaddresses lines which do not match a road segment from openstreetmap.

when I ran an import for New Zealand, the percentage of records which matched was ~80%; it would be interesting to have a list of the addresses which did not match.

this is helpful for:

  • improving the matching algorithm
  • awareness of osm coverage vs. oa
  • etc.

query for gleimstrasse fails to match

query for gleimstrasse, berlin fails to match despite having rows in the db (matching logic bug?)

/extract/geojson?lat=52.546892&lon=13.404291&names=gleim%20strasse
select * from polyline where id = 20012388;
20012388|{qffcBizmqXTdEVlG~H|~BtJrqB\tFpB`d@|Jj}BdEv_AjA~V?jJN|GO}G?kJkA}VeEy_A}Jk}BqBad@]uFuJsqB_I{~BWoGUcEkQexEjQbxE
sqlite> select * from names where id = 20012388;
29100033|20012388|gleimstraße
29100034|20012388|gleim straße
select * from address where id = 20012388;
rowid|id|source|source_id|housenumber|lat|lon|parity|proj_lat|proj_lon
123117441|20012388|OA|de/berlin:0bda00204b2ddc9d|10.0|52.5468603|13.4058849|L|52.5470281270828|13.405838626042
123117442|20012388|OA|de/berlin:680583fb8599aff3|10.03|52.5468775|13.4060738|L|52.5470473315231|13.406026973333
123117443|20012388|OA|de/berlin:e5a1d3437fc3d936|11.0|52.5468984|13.4063029|L|52.5470706239638|13.4062554136339
123117444|20012388|OA|de/berlin:e62db3492191931e|12.0|52.5469252|13.4065954|L|52.5471003654175|13.4065471025441
123117445|20012388|OA|de/berlin:7e7e2a6a0c032fea|13.0|52.5469531|13.4068882|L|52.5471301667027|13.4068393782504
123117446|20012388|OA|de/berlin:68d3c3204fff2cb7|13.03|52.5469712|13.4070986|L|52.5471515280099|13.4070488789849
123117447|20012388|OA|de/berlin:fca1f8a028ee8a86|14.0|52.5470432|13.4078884|L|52.5472233841668|13.407850314934
123117448|20012388|OA|de/berlin:de161640924a8758|15.0|52.5470761|13.4082487|L|52.547251623353|13.4082116000227
123117449|20012388|OA|de/berlin:796a91b8a99d8615|16.0|52.5470987|13.4084969|L|52.5472710754548|13.4084604653493
123117450|20012388|OA|de/berlin:7ebe440509a64eba|17.0|52.5471151|13.4086763|L|52.5472851365658|13.4086403596887
123117451|20012388|OA|de/berlin:8a066b1c37e40eb4|18.0|52.5471384|13.408932|L|52.5473051767406|13.4088967486745
123117452|20012388|OA|de/berlin:2e7e869c9ea9d604|19.0|52.5471699|13.4092775|L|52.5473322551603|13.4092431832073
123117453|20012388|OA|de/berlin:5d42dcce4597d883|20.03|52.547238|13.4100145|L|52.5473961147333|13.4099783149099
123117454|20012388|OA|de/berlin:15173731ebf23605|20.06|52.5472595|13.4102506|L|52.547416124423|13.4102147559374
123117455|20012388|OA|de/berlin:a90eed0d6ac49c80|21.0|52.5472847|13.410528|L|52.5474396331682|13.4104925429464
123117456|20012388|OA|de/berlin:1f1a870f2ab282b9|22.0|52.5473099|13.4108036|L|52.547462992476|13.4107685641557
123117457|20012388|OA|de/berlin:8954f81f720ea09b|23.0|52.5473305|13.41103|L|52.547482179765|13.4109952874278
123117458|20012388|OA|de/berlin:5420f910277ccb3d|24.0|52.5473526|13.4112723|L|52.5475027155834|13.411237945363
123117459|20012388|OA|de/berlin:0817a7b4f06605f1|25.0|52.5473712|13.411477|L|52.5475200633231|13.4114429319196
123117460|20012388|OA|de/berlin:8ae7c91c3ea8364c|26.0|52.5474003|13.4117959|L|52.5475470915281|13.4117623060152
123117461|20012388|OA|de/berlin:a2e5312204333ab8|27.0|52.5474133|13.4119383|L|52.5475591606774|13.4119049190253
123117462|20012388|OA|de/berlin:1e9dbbdf67d44ae6|28.0|52.5474404|13.4122356|L|52.5475843576352|13.4122026545057
123117463|20012388|OA|de/berlin:b23c8a12edc048f0|29.0|52.5474825|13.4126975|L|52.5476235048024|13.4126652302158
123117464|20012388|OA|de/berlin:7526204dc4742a1a|30.0|52.5477687|13.4125195|L|52.5476141792412|13.4125548532755
123117465|20012388|OA|de/berlin:5371b90c813a6e3c|31.0|52.5477325|13.4121217|L|52.5475804753228|13.412156482132
123117466|20012388|OA|de/berlin:f6faf01adb0c845c|35.0|52.5476614|13.4113397|L|52.5475142209421|13.4113733733807
123117467|20012388|OA|de/berlin:b5e182ae49664ca5|36.0|52.5476269|13.410969|L|52.5474827985451|13.4110019691982
123117468|20012388|OA|de/berlin:fdd662b10f4394b6|37.0|52.5476119|13.4108104|L|52.5474693502624|13.4108430141561
123117469|20012388|OA|de/berlin:0b5f18d9ae1df159|38.0|52.5475791|13.4104606|L|52.5474396948004|13.4104924946651
123117470|20012388|OA|de/berlin:41f67cb896679b4d|39.0|52.547552|13.4101594|L|52.5474141812682|13.4101909316565
123117471|20012388|OA|de/berlin:737f96db51c5dbe2|40.0|52.5474778|13.4093685|L|52.5473443313989|13.4093967252638
123117472|20012388|OA|de/berlin:1d950c35905643d5|41.0|52.5474517|13.4090839|L|52.5473220127523|13.4091113255697
123117473|20012388|OA|de/berlin:da7d0a9cafd6b07d|42.0|52.5474276|13.4088209|L|52.5473013883126|13.4088475905474
123117474|20012388|OA|de/berlin:f1079ab047a4a35a|43.0|52.5474072|13.4085993|L|52.5472840089336|13.408625351739
123117475|20012388|OA|de/berlin:ca41d77ee6b0cfa2|44.0|52.5473774|13.4082746|L|52.5472585452347|13.4082997346888
123117476|20012388|OA|de/berlin:50dd83d8210456db|45.03|52.5473408|13.4078766|L|52.5472273320029|13.4079005954875
123117477|20012388|OA|de/berlin:bd7d0d71d00d73be|45.0|52.5473606|13.4080924|L|52.5472442554314|13.4081170038287
123117478|20012388|OA|de/berlin:16fc769a643627fb|46.0|52.5473606|13.4064217|R|52.547095044895|13.4064949215904
123117479|20012388|OA|de/berlin:a773fd9b543eb1ec|47.0|52.5475746|13.4063732|R|52.5470960869803|13.4065051418281
123117480|20012388|OA|de/berlin:0783bfed836e6e04|49.0|52.5472033|13.4062532|R|52.5470740324905|13.4062888427145
123117481|20012388|OA|de/berlin:80809d2cef8fce1c|50.0|52.5470683|13.4048499|R|52.5469318816115|13.4048847466934
123117482|20012388|OA|de/berlin:3e93ea2aa7f198cc|51.0|52.5470392|13.4045536|R|52.5469038666526|13.4045881694848
123117483|20012388|OA|de/berlin:9391eb5616341a25|52.0|52.5470079|13.4042162|R|52.5468720089862|13.4042509118853
123117484|20012388|OA|de/berlin:4607555f7b35f74b|53.0|52.546988|13.4040011|R|52.5468517002985|13.4040359162486
123117485|20012388|OA|de/berlin:fbabe666b2fd6318|54.0|52.5469595|13.4036934|R|52.5468226479927|13.4037283572836
123117486|20012388|OA|de/berlin:b3f442c2315dd226|55.0|52.5469374|13.4034554|R|52.5468001752812|13.4034904524532
123117487|20012388|OA|de/berlin:8a9c64e8dd14c1cd|56.0|52.5469076|13.4031338|R|52.5467698102747|13.4031689967297
123117488|20012388|OA|de/berlin:e6747f4e8a51c3f8|57.0|52.5468876|13.4029178|R|52.5467494162189|13.4029530973544
123117489|20012388|OA|de/berlin:f89d04e672cdd6db|58.0|52.5468578|13.4025969|L|52.546718923323|13.4026327523834
123117490|20012388|OA|de/berlin:de9bbc7be7562ff4|59.0|52.546839|13.4023939|L|52.5466995573274|13.4024298984699
123117491|20012388|OA|de/berlin:84c29efc207d7ee2|60.0|52.5468077|13.4020562|L|52.5466673404784|13.4020924351116
123117492|20012388|OA|de/berlin:d221920e03fd0d9c|61.0|52.5467835|13.4017948|L|52.5466424033853|13.4018312253589
166387367|20012388|OSM|node:733569766|48.0|52.5471756|13.4063|R|52.5470779163997|13.406326934102
174587465|20012388|OSM|node:4314333203|32.0|52.5477143|13.4120467|L|52.5475739048852|13.4120788213468
373093837|20012388|VERTEX||19.814||||52.547363|13.409618
373093838|20012388|VERTEX||19.52||||52.547351|13.409483
373093839|20012388|VERTEX||13.548||||52.547004|13.405602
373093840|20012388|VERTEX||15.403||||52.546989|13.405479
373093841|20012388|VERTEX||24.28||||52.546932|13.404886
373093842|20012388|VERTEX||54.539||||52.546741|13.402864
373093843|20012388|VERTEX||60.993||||52.546642|13.401828
373093844|20012388|VERTEX||60.174||||52.546604|13.401444
373093845|20012388|VERTEX||59.79||||52.546604|13.401262
373093846|20012388|VERTEX||59.488||||52.546596|13.401119
373093847|20012388|VERTEX||59.185||||52.546604|13.401262
373093848|20012388|VERTEX||58.802||||52.546604|13.401444
373093849|20012388|VERTEX||57.984||||52.546642|13.401827
373093850|20012388|VERTEX||55.773||||52.546741|13.402864
373093851|20012388|VERTEX||51.463||||52.546932|13.404886
373093852|20012388|VERTEX||50.198||||52.546989|13.405479
373093853|20012388|VERTEX||49.934||||52.547004|13.405602
373093854|20012388|VERTEX||46.017||||52.547191|13.407436

Mirror TIGER data

TIGER data for street address ranges in the United States is currently downloaded directly from the US Census FTP server. It's quite slow.

As a government work, TIGER data should be public domain, so we should be able to mirror it to S3 or another fast location. It may already be there.

An in-range update of csv-parse is breaking the build 🚨

Version 1.3.0 of csv-parse was just published.

Branch Build failing 🚨
Dependency csv-parse
Current Version 1.2.4
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

csv-parse is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • ci/circleci Your tests passed on CircleCI! Details
  • continuous-integration/travis-ci/push The Travis CI build failed Details

Commits

The new version differs by 6 commits.

  • d11de9d package: bump to version 1.3.0
  • 61762a5 test: should require handled by mocha
  • b0fe635 package: coffeescript 2 and use semver tilde
  • b347b69 Allow auto_parse to be a function and override default parsing
  • 8359816 Allow user to pass in custom date parsing functionn
  • f87c273 options: ensure objectMode is cloned

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

executable file not found in $PATH after running docker

I tried to create street.db from polyline data using docker but received this error message.

$ cat /data/brandenburg.polylines | docker run -i -v /data:/data pelias/interpolation polyline /data/street.db

container_linux.go:247: starting container process caused "exec: \"polyline\": executable file not found in $PATH"
docker: Error response from daemon: oci runtime error: container_linux.go:247: starting container process caused "exec: \"polyline\": executable file not found in $PATH".
ERRO[0000] error getting events from daemon: net/http: request canceled 

Do you have a hint on how to fix this? Are there any parameters or configurations that need to be passed? Or is it perhaps a Docker issue?

Running scripts other than server in the docker container doesn't work in my setting

After executing cat /data/berlin.polylines | docker run -i -v /data:/data pelias/interpolation polyline /data/street.db like it is explained in the readme, I receive this response:

container_linux.go:247: starting container process caused "exec: \"polyline\": executable file not found in $PATH"
docker: Error response from daemon: oci runtime error: container_linux.go:247: starting container process caused "exec: \"polyline\": executable file not found in $PATH".
ERRO[0000] error getting events from daemon: net/http: request canceled 

I don't know if it's perhaps a Docker issue. Have you seen this before, and do you have a hint on how to solve it?

An in-range update of jsftp is breaking the build 🚨

Version 2.1.3 of jsftp was just published.

Branch Build failing 🚨
Dependency jsftp
Current Version 2.1.2
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

jsftp is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • ci/circleci Your tests passed on CircleCI! Details
  • continuous-integration/travis-ci/push The Travis CI build could not complete due to an error Details

Commits

The new version differs by 5 commits.

  • 93c52bf Working docker testing set-up with pureftpd
  • 8dd2a00 Merge pull request #268 from hans-lizihan/master
  • e835887 ADD: readme and tests for createSocket option
  • 6655123 UPT: readme about the createSocket option
  • ad4a130 ADD: support createConnection option to support proxy

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

investigate: vertex: address ordering

the stream/vertices/lookup.js file contains an SQL query:

SELECT * FROM address WHERE source != "VERTEX" AND id = ? ORDER BY housenumber ASC

when the ORDER BY housenumber ASC condition is removed, the functional tests fail due to slightly different database counts.

this should not be the case, as the ordering of those rows should not affect anything; the stream/vertices/augment.js script should be re-ordering the points by distance along the line.

investigate - potsdamer platz

searching for 4 potsdamer platz (mitte, berlin) returns an interpolated value instead of the rooftop-accuracy value we have in the database (shown in blue in the screenshot).

this is most likely due to the very unusual shape of the road segments involved.

(screenshot: Potsdamer Platz)

Firwood Avenue, Pretoria, South Africa

This issue was originally raised long ago in pelias/pelias#347.

It appears there's a case of address interpolation data within OSM that we do not support.

See this way in OSM: https://www.openstreetmap.org/way/4232282#map=18/-25.77606/28.25725

As shown below, this road has address interpolation information stored in separate ways that are near the street.
(screenshot: address interpolation ways alongside the street)

The interpolation way (way 99016475) does not appear to be linked via tags to the way for the road. Nominatim, however, appears to be able to pick it up.

I'm not sure how this is possible, and the address interpolation guide in the OSM wiki does not provide many clues. Does Nominatim do some sort of geometry-based comparison of the different ways? That must lead to a lot of data errors.

Hopefully we can support this format as well.
