pelias/docker's Introduction

A modular, open-source search engine for our world.

Pelias is a geocoder powered completely by open data, available freely to everyone.

Local Installation · Cloud Webservice · Documentation · Community Chat

What is Pelias?
Pelias is a search engine for places worldwide, powered by open data. It turns addresses and place names into geographic coordinates, and turns geographic coordinates into places and addresses. With Pelias, you’re able to turn your users’ place searches into actionable geodata and transform your geodata into real places.

We think open data, open source, and open strategy win over proprietary solutions at any part of the stack and we want to ensure the services we offer are in line with that vision. We believe that an open geocoder improves over the long-term only if the community can incorporate truly representative local knowledge.

Pelias

A modular, open-source geocoder built on top of Elasticsearch for fast and accurate global search.

What's a geocoder do anyway?

Geocoding is the process of taking input text, such as an address or the name of a place, and returning a latitude/longitude location on the Earth's surface for that place.

(diagram: geocoding an address or place name to a point on the map)
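
For example, a forward geocoding request against a local Pelias instance might look like this (a sketch assuming the API listens on its default port 4000; your host and port may differ):

$ curl -s 'http://localhost:4000/v1/search?text=30+West+26th+Street,+New+York,+NY'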

... and a reverse geocoder, what's that?

Reverse geocoding is the opposite: returning a list of places near a given latitude/longitude point.

(diagram: reverse geocoding a point to nearby places)
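
Using the same assumed local instance, a reverse geocoding request supplies a point instead of text:

$ curl -s 'http://localhost:4000/v1/reverse?point.lat=40.74358&point.lon=-73.99047'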

What are the most interesting features of Pelias?

  • Completely open-source and MIT licensed
  • A powerful data import architecture: Pelias supports many open-data projects out of the box but also works great with private data
  • Support for searching and displaying results in many languages
  • Fast and accurate autocomplete for user-facing geocoding
  • Support for many result types: addresses, venues, cities, countries, and more
  • Modular design, so you don't need to be an expert in everything to make changes
  • Easy installation with minimal external dependencies

What are the main goals of the Pelias project?

  • Provide accurate search results
  • Work equally well for a small city and the entire planet
  • Be highly configurable, so different use cases can be handled easily and efficiently
  • Provide a friendly, welcoming, helpful community that takes input from people all over the world

Where did Pelias come from?

Pelias was created in 2014 as an early project at Mapzen. After Mapzen shut down in 2017, Pelias became part of the Linux Foundation.

How does it work?

Magic! (Just kidding) Like any geocoder, Pelias combines full text search techniques with knowledge of geography to quickly search over many millions of records, each representing some sort of location on Earth.

The Pelias architecture has three main components and several smaller pieces.

A diagram of the Pelias architecture.

Data importers

The importers filter, normalize, and ingest geographic datasets into the Pelias database. Currently there are six officially supported importers:

  • OpenStreetMap
  • OpenAddresses
  • Who's on First
  • Geonames
  • Polylines (streets extracted from OpenStreetMap data)
  • CSV

We are always discussing supporting additional datasets. Pelias users can also write their own importers, for example to import proprietary data into their own instance of Pelias.
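
As a rough illustration of what a custom importer can look like, here is a minimal, hypothetical sketch built on the pelias-model and pelias-dbclient packages described under Dependencies below; the source name, record shape, and input data are all invented for the example:

// minimal, hypothetical importer sketch: stream raw records into Elasticsearch
const through = require('through2');
const { Document, createDocumentMapperStream } = require('pelias-model');
const dbclient = require('pelias-dbclient');

// invented in-memory input; a real importer would stream records from a file or API
const records = [
  { id: '1', name: 'Example Cafe', lat: 45.52, lon: -122.67 }
];

// map each raw record to a Pelias Document
const documentStream = through.obj((record, enc, next) => {
  const doc = new Document('customsource', 'venue', record.id)
    .setName('default', record.name)
    .setCentroid({ lat: record.lat, lon: record.lon });
  next(null, doc);
});

documentStream
  .pipe(createDocumentMapperStream()) // convert Documents to Elasticsearch records
  .pipe(dbclient());                  // batch and send to Elasticsearch

records.forEach(r => documentStream.write(r));
documentStream.end();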

Database

The underlying datastore that does most of the query heavy lifting and powers our search results. We use Elasticsearch; currently versions 7 and 8 are supported.

We've built a tool called pelias-schema that sets up Elasticsearch indices properly for Pelias.
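
In this Docker setup, the schema is typically applied with the pelias elastic subcommands. The sequence below is a sketch; pelias elastic create is assumed here to wrap pelias-schema's index-creation script:

pelias elastic start    # bring up the Elasticsearch container
pelias elastic wait     # block until the cluster responds
pelias elastic create   # create the Pelias index using pelias-schema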

Frontend services

This is where the actual geocoding process happens, and includes the components that users interact with when performing geocoding queries. The services are:

  • API: The API service defines the Pelias API, and talks to Elasticsearch or other services as needed to perform queries.
  • Placeholder: A service built specifically to capture the relationships between administrative areas (a catch-all term for places like cities, states, and countries). Elasticsearch does not handle relational data very well, so we built Placeholder specifically to manage this piece.
  • PIP: For reverse geocoding, it's important to be able to perform point-in-polygon (PIP) calculations quickly. The PIP service is very good at quickly determining which admin area polygons a given point lies in.
  • Libpostal: Pelias uses the libpostal project for parsing addresses using the power of machine learning. We use a Go service built by the Who's on First team to make this happen quickly and efficiently.
  • Interpolation: This service knows all about addresses and streets. With that knowledge, it is able to supplement the known addresses that are stored directly in Elasticsearch and return fairly accurate estimated address results for many more queries than would otherwise be possible.
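
Each of these services is a standalone HTTP server, which makes them handy to query directly when debugging. The ports and paths below are assumptions based on common defaults for this Docker setup and may differ in your configuration:

# Placeholder: look up administrative areas by name (port assumed)
curl -s 'http://localhost:4100/parser/search?text=london'

# Libpostal: parse an address into labeled components (port assumed)
curl -s 'http://localhost:4400/parse?address=30+w+26th+st,+new+york,+ny'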

Dependencies

These are software projects that are not used directly but are used by other components of Pelias.

There are lots of these, but here are some important ones:

  • model: Provides a single library for creating documents that fit the Pelias Elasticsearch schema. This is a core component of our flexible importer architecture.
  • wof-admin-lookup: A library for performing administrative lookup using point-in-polygon math. Previously included in each of the importers, but now only used by the PIP service.
  • query: This is where most of our actual Elasticsearch query generation happens.
  • config: Pelias is very configurable, and all of it is driven from a single JSON file which we call pelias.json. This package provides a library for reading, validating, and working with this configuration. It is used by almost every other Pelias component (a minimal example follows this list).
  • dbclient: A Node.js stream library for quickly and efficiently importing records into Elasticsearch.
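
A minimal pelias.json might look like the following sketch; the Elasticsearch host name and the geonames block (which also appears in an issue later in this document) are assumptions that depend on your deployment:

{
  "esclient": {
    "hosts": [{ "host": "elasticsearch", "port": 9200 }]
  },
  "imports": {
    "geonames": {
      "datapath": "/data/geonames",
      "countryCode": "ALL"
    }
  }
}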

Helpful tools

Finally, while not part of Pelias proper, we have built several useful tools for working with and testing Pelias.

Notable examples include:

  • acceptance-tests: A Node.js command line tool for testing a full planet build of Pelias and ensuring everything works. Familiarity with this tool is very important for ensuring Pelias is working. It supports all Pelias features and has special facilities for testing autocomplete queries.
  • compare: A web-based tool for comparing different instances of Pelias (for example a production and staging environment). We have a reference instance at pelias.github.io/compare/
  • dashboard: Another web-based tool for providing statistics about the contents of a Pelias Elasticsearch index such as import speed, number of total records, and a breakdown of records of various types.

Documentation

The main documentation lives in the pelias/documentation repository.

Additionally, the README file in each of the component repositories listed above provides more detail on that piece.

Here's an example API response for a reverse geocoding query (this example predates the Mapzen shutdown and uses the legacy search.mapzen.com endpoint):
$ curl -s "search.mapzen.com/v1/reverse?size=1&point.lat=40.74358294846026&point.lon=-73.99047374725342&api_key={YOUR_API_KEY}" | json
{
    "geocoding": {
        "attribution": "https://search.mapzen.com/v1/attribution",
        "engine": {
            "author": "Mapzen",
            "name": "Pelias",
            "version": "1.0"
        },
        "query": {
            "boundary.circle.lat": 40.74358294846026,
            "boundary.circle.lon": -73.99047374725342,
            "boundary.circle.radius": 500,
            "point.lat": 40.74358294846026,
            "point.lon": -73.99047374725342,
            "private": false,
            "querySize": 1,
            "size": 1
        },
        "timestamp": 1460736907438,
        "version": "0.1"
    },
    "type": "FeatureCollection",
    "features": [
        {
            "geometry": {
                "coordinates": [
                    -73.99051,
                    40.74361
                ],
                "type": "Point"
            },
            "properties": {
                "borough": "Manhattan",
                "borough_gid": "whosonfirst:borough:421205771",
                "confidence": 0.9,
                "country": "United States",
                "country_a": "USA",
                "country_gid": "whosonfirst:country:85633793",
                "county": "New York County",
                "county_gid": "whosonfirst:county:102081863",
                "distance": 0.004,
                "gid": "geonames:venue:9851011",
                "id": "9851011",
                "label": "Arlington, Manhattan, NY, USA",
                "layer": "venue",
                "locality": "New York",
                "locality_gid": "whosonfirst:locality:85977539",
                "name": "Arlington",
                "neighbourhood": "Flatiron District",
                "neighbourhood_gid": "whosonfirst:neighbourhood:85869245",
                "region": "New York",
                "region_a": "NY",
                "region_gid": "whosonfirst:region:85688543",
                "source": "geonames"
            },
            "type": "Feature"
        }
    ],
    "bbox": [
        -73.99051,
        40.74361,
        -73.99051,
        40.74361
    ]
}

How can I install my own instance of Pelias?

To try out Pelias quickly, use our Docker setup. It uses Docker and docker-compose to let you set up a Pelias instance for a small area (by default Portland, Oregon) in under 30 minutes.
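
The full build sequence, assembled from the pelias commands that appear throughout this document (pelias elastic create is assumed here as the schema-creation step), looks roughly like this when run from a project directory such as projects/portland-metro with DATA_DIR set in .env:

pelias compose pull     # fetch all service images
pelias elastic start    # start Elasticsearch
pelias elastic wait     # wait for Elasticsearch to come up
pelias elastic create   # create the index schema
pelias download all     # download source data into DATA_DIR
pelias prepare all      # run preprocessing (placeholder, interpolation, etc.)
pelias import all       # import all data into Elasticsearch
pelias compose up       # start the API and supporting services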

Do you offer a free geocoding API?

You can sign up for a trial API key at Geocode Earth, a commercial service operated by the core Pelias development team since 2014 (previously at search.mapzen.com). Discounts and free plans are available for free and open-source software projects.

What's it built with?

Pelias itself (the import pipelines and API) is written in Node.js, which makes it highly accessible for other developers and performant under heavy I/O. It aims to be modular and is distributed across a number of Node packages, each with its own repository under the Pelias GitHub organization.

For a select few components that have performance requirements that Node.js cannot meet, we prefer to write things in Go. A good example of this is the pbf2json tool that quickly converts OSM PBF files to JSON for our OSM importer.

Elasticsearch is our datastore of choice because of its unparalleled full text search functionality, scalability, and sufficiently robust geospatial support.

Contributing

Gitter

We built Pelias as an open source project not just because we believe that users should be able to view and play with the source code of tools they use, but also to get the community involved in the project itself.

Especially for a geocoder with global coverage, it's just not possible for a small team to do it alone. We need you.

Anything that we can do to make contributing easier, we want to know about. Feel free to reach out to us via GitHub, Gitter, email, or Twitter. We'd love to help people get started working on Pelias, especially if you're new to open source or programming in general.

We have a list of Good First Issues for new contributors.

Both this meta-repo and the API service repo are worth looking at, as they're where most issues live. We also welcome reporting issues or suggesting improvements to our documentation.

The current Pelias team can be found on GitHub as missinglink and orangejulius.

Members emeritus include:

pelias/docker's People

Contributors

aminato, androidseb, blackmad, borisperlov, dmklinger, dr0i, hunterowens, ingenieroariel, irbian, jeremy-rutman, jimmylevell, joxit, kilick, ledfan, marsxul, michaelkirk, missinglink, orangejulius, russeree, ryanbateman, seankilleen, sergiuszkierat, severo, thiagomiranda3, thomasg77, tomwassing, torbjokv, tsangares, welrachid, wiseman


pelias/docker's Issues

pelias import all hits missing datafile and crashes

After a day of wrangling I thought I was finally home free: all steps in the docker example up to and including download all and prepare all had run successfully, but now:

jeremy@jeremy$ pelias import  all
info: [whosonfirst] Loading whosonfirst-data-ocean-latest.csv records from /data/whosonfirst/meta
events.js:167
      throw er; // Unhandled 'error' event
      ^

Error: ENOENT: no such file or directory, open '/data/whosonfirst/meta/whosonfirst-data-ocean-latest.csv'
Emitted 'error' event at:
    at lazyFs.open (internal/fs/streams.js:115:12)
    at FSReqWrap.oncomplete (fs.js:141:20)

Should a missing data file really trigger a crash? Anyway, perhaps the problem is what appears to be an incorrect absolute path: data is a subdir of my working dir, so /data doesn't exist except maybe in the Docker filesystem.

My DATA_DIR is /home/jeremy/pelias_dock/docker/projects/portland-metro/data, the dir exists and is writable. Maybe the use of /data got hardcoded somewhere? I see it's used in the shell script example...

Put a limit on duration of `pelias elastic wait` command

Currently, pelias elastic wait will run forever waiting for an Elasticsearch instance to come up. If there's a problem, such as in #33, it will prevent automated scripts from either continuing or failing. In the case of a user running pelias but having an issue, it might give them a false sense that things are okay.

We should set a reasonable limit (it could probably be as low as 60 seconds) on how long pelias elastic wait will wait, after which it should error. This is more than enough time for a functioning Elasticsearch instance to come up, even if it has lots of data.
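
A bounded wait could be as simple as the following shell sketch (assuming Elasticsearch answers on localhost:9200; the real fix would live inside the pelias script):

# give up if Elasticsearch isn't responding within 60 seconds
deadline=$((SECONDS + 60))
until curl -s -o /dev/null 'http://localhost:9200/_cluster/health'; do
  if (( SECONDS >= deadline )); then
    echo 'error: Elasticsearch did not come up within 60s' >&2
    exit 1
  fi
  sleep 2
done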

Elasticsearch 5 won't start if data directory is owned by root, or not created

The ES5 Docker images run Elasticsearch as a non-root user. This is a good thing; however, it means it's difficult to ensure proper permissions for the data directory.

All our other containers (the importers) mount the data directory itself, and then create a subdirectory within it which ensures proper permissions as long as they can write to the data directory, which must already exist. However, the Elasticsearch container mounts $DATA_DIR/elasticsearch. If this directory doesn't exist, Docker will create it, but owned by root. This means Elasticsearch can't write to it, and will fail to start.

Some ideas for solutions:

  • Put a mkdir -p $DATA_DIR/elasticsearch inside the pelias script as part of pelias elastic start. This would help ensure non-root permissions are set on the Elasticsearch data dir.
  • Add mkdir -p $DATA_DIR/elasticsearch to our setup documentation.
  • Mount the root data dir as other containers, modify our Elasticsearch Docker image to run a setup script that creates needed directories as a non-root user, and modify elasticsearch.yml to look for data in the right place

Of those, I prefer the first: it's pretty simple, and requires no action on the part of our users. However, adding more required functionality into the pelias script is not ideal, since we'd like to keep it as thin a wrapper as possible.
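
Concretely, the first option amounts to something like this inside pelias elastic start (a sketch; DOCKER_USER mirrors the variable shown in the .env examples later in this document):

# pre-create the Elasticsearch data dir with non-root ownership
mkdir -p "${DATA_DIR}/elasticsearch"
chown "${DOCKER_USER:-1000}" "${DATA_DIR}/elasticsearch"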

pelias compose up - crash if dir exists

Not sure if this is intentional, but if pelias/projects/planet/test_cases already exists, it triggers a crash:

+ pelias compose up
Creating pelias_interpolation ...
Creating pelias_fuzzy_tester  ...
Creating pelias_fuzzy_tester  ... error
Creating pelias_libpostal     ...
Creating pelias_interpolation ... done
Creating pelias_openaddresses ...
Creating pelias_libpostal     ... done
Creating pelias_pip-service   ... done
Creating pelias_openaddresses ... done
Creating pelias_openstreetmap ... done
Creating pelias_whosonfirst   ... done
Creating pelias_polylines     ... done
Creating pelias_placeholder   ... done
Creating pelias_transit       ... done
Creating pelias_schema        ... done
Creating pelias_geonames      ... done
Creating pelias_api           ... done
Creating pelias_csv_importer  ... done

ERROR: for fuzzy-tester  Cannot start service fuzzy-tester: error while creating mount source path '/home/ubuntu/pelias/projects/planet/test_cases': mkdir /home/ubuntu/pelias/projects/planet/test_cases: file exists
ERROR: Encountered errors while bringing up the project.
+ pelias test run
ERROR: An HTTP request took too long to complete. Retry with --verbose to obtain debug information.
If you encounter this issue regularly because of slow network conditions, consider setting COMPOSE_HTTP_TIMEOUT to a higher value (current value: 60).
ubuntu@ip-aws:~/pelias/projects/planet$ more doit.sh

OSM import stops without error message

I was able to run the import of a small area (portland-metro) without any problems, but when I try to perform a full-planet import, the osm importer stops after ~100 million documents without any error message. Right now I do not run the importers in parallel; only the osm importer is running. I use the default configuration of pelias / docker-compose (starting with the configuration using Elasticsearch 2.4, and also tried with the updated configuration using Elasticsearch 5.6). To speed up debugging I disabled admin lookup.

VM Specs:
16 cores
32 GB RAM
1 TB Elasticsearch storage (>1.5 TB for tmp storage)

Elasticsearch status:
100148808 docs
46379422702 bytes store size

OSM importer log tail:

[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385600 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385601 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385602 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385603 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385604 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385605 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385606 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385613 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385614 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385615 no ways found
2019-05-28T17:26:05.323Z - verbose: [openstreetmap] [address_extractor] duplicating a venue with address
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385635 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385636 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385637 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385638 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385643 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385657 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385658 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385659 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385660 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385702 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385703 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385704 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385705 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385706 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385707 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385741 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385767 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385768 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385769 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385783 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385793 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385798 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385802 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385932 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385933 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385934 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385935 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385936 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385937 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385938 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385983 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9385984 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386021 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386456 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386573 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386574 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386575 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386576 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386577 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386578 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386579 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386580 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386581 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386582 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386583 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386584 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386585 no ways found
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9386586 no ways found
2019-05-28T17:26:05.502Z - verbose: [openstreetmap] [address_extractor] duplicating a venue with address
2019-05-28T17:26:05.508Z - verbose: [openstreetmap] [address_extractor] duplicating a venue with address
2019-05-28T17:26:05.539Z - verbose: [openstreetmap] [address_extractor] duplicating a venue with address
[pbf2json]: 2019/05/28 17:26:05 denormalize failed for relation: 9387007 no ways found
2019-05-28T17:26:05.582Z - verbose: [openstreetmap] [address_extractor] duplicating a venue with address
2019-05-28T17:26:05.720Z - info: [dbclient-openstreetmap]  paused=false, transient=0, current_length=0, indexed=100148812, batch_ok=200298, batch_retries=0, failed_records=0, venue=23022762, address=77126050, persec=131.2
2019-05-28T17:26:05.720Z - info: [dbclient-openstreetmap]  paused=false, transient=0, current_length=0, indexed=100148812, batch_ok=200298, batch_retries=0, failed_records=0, venue=23022762, address=77126050, persec=131.2

Elasticsearch log tail:

[2019-05-28T12:40:51,269][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [li]
[2019-05-28T12:40:51,302][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T12:42:41,001][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sn]
[2019-05-28T12:42:41,001][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sn]
[2019-05-28T12:42:41,031][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T12:46:03,957][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ik]
[2019-05-28T12:46:03,957][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ik]
[2019-05-28T12:46:03,989][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T12:59:01,399][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [av]
[2019-05-28T12:59:01,399][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [av]
[2019-05-28T12:59:01,434][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:04:39,624][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [st]
[2019-05-28T13:04:39,624][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [st]
[2019-05-28T13:04:39,657][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:07:25,990][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sd]
[2019-05-28T13:07:25,991][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sd]
[2019-05-28T13:07:26,023][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:09:11,921][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [kl]
[2019-05-28T13:09:11,921][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [kl]
[2019-05-28T13:09:11,952][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:13:45,823][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,823][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,824][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,824][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,824][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,824][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,824][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,825][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,825][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,825][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lg]
[2019-05-28T13:13:45,859][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:15:09,396][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ha]
[2019-05-28T13:15:09,396][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ha]
[2019-05-28T13:15:09,428][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:15:13,627][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [nb]
[2019-05-28T13:15:13,627][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [nb]
[2019-05-28T13:15:13,659][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:20:38,476][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [za]
[2019-05-28T13:20:38,476][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [za]
[2019-05-28T13:20:38,519][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:23:32,250][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [io]
[2019-05-28T13:23:32,251][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [io]
[2019-05-28T13:23:32,287][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:50:43,632][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ti]
[2019-05-28T13:50:43,633][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ti]
[2019-05-28T13:50:43,665][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:53:23,977][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ie]
[2019-05-28T13:53:23,978][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ie]
[2019-05-28T13:53:24,009][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T13:56:54,389][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ig]
[2019-05-28T13:56:54,389][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ig]
[2019-05-28T13:56:54,425][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:03:43,979][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [tw]
[2019-05-28T14:03:43,980][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [tw]
[2019-05-28T14:03:44,013][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:06:03,021][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [na]
[2019-05-28T14:06:03,021][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [na]
[2019-05-28T14:06:03,065][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:17:42,682][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [dz]
[2019-05-28T14:17:42,683][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [dz]
[2019-05-28T14:17:42,717][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:23:29,061][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [rn]
[2019-05-28T14:23:29,062][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [rn]
[2019-05-28T14:23:29,095][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:27:14,742][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lu]
[2019-05-28T14:27:14,743][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [lu]
[2019-05-28T14:27:14,775][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:30:19,582][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ak]
[2019-05-28T14:30:19,583][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ak]
[2019-05-28T14:30:19,618][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:34:18,349][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ny]
[2019-05-28T14:34:18,349][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ny]
[2019-05-28T14:34:18,384][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:46:54,797][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [rw]
[2019-05-28T14:46:54,797][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [rw]
[2019-05-28T14:46:54,829][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:47:54,903][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [zu]
[2019-05-28T14:47:54,903][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [zu]
[2019-05-28T14:47:54,936][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:48:30,930][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [iu]
[2019-05-28T14:48:30,930][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [iu]
[2019-05-28T14:48:30,967][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:49:05,226][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [mh]
[2019-05-28T14:49:05,227][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [mh]
[2019-05-28T14:49:05,262][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T14:50:44,910][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [yo]
[2019-05-28T14:50:44,910][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [yo]
[2019-05-28T14:50:44,945][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T15:04:33,876][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ks]
[2019-05-28T15:04:33,877][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ks]
[2019-05-28T15:04:33,911][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T15:18:46,232][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ff]
[2019-05-28T15:18:46,232][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ff]
[2019-05-28T15:18:46,268][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T15:19:10,851][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [cu]
[2019-05-28T15:19:10,851][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [cu]
[2019-05-28T15:19:10,886][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T15:31:23,311][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [bi]
[2019-05-28T15:31:23,311][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [bi]
[2019-05-28T15:31:23,311][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [bi]
[2019-05-28T15:31:23,311][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [bi]
[2019-05-28T15:31:23,347][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T15:39:34,034][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ss]
[2019-05-28T15:39:34,034][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ss]
[2019-05-28T15:39:34,068][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T15:47:45,038][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sg]
[2019-05-28T15:47:45,038][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [sg]
[2019-05-28T15:47:45,073][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T16:26:33,825][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [wo]
[2019-05-28T16:26:33,825][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [wo]
[2019-05-28T16:26:33,860][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T16:26:34,273][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [cr]
[2019-05-28T16:26:34,273][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [cr]
[2019-05-28T16:26:34,308][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T16:45:03,454][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ts]
[2019-05-28T16:45:03,454][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ve]
[2019-05-28T16:45:03,455][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ts]
[2019-05-28T16:45:03,455][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ve]
[2019-05-28T16:45:03,491][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-28T17:22:40,632][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ki]
[2019-05-28T17:22:40,632][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [to]
[2019-05-28T17:22:40,633][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [ki]
[2019-05-28T17:22:40,633][WARN ][o.e.d.i.m.StringFieldMapper$TypeParser] The [string] field is deprecated, please use [text] or [keyword] instead on [to]
[2019-05-28T17:22:40,664][INFO ][o.e.c.m.MetaDataMappingService] [YICKzsA] [pelias/pAKPqXxxTqOYbGsW5TN0DA] update_mapping [venue]
[2019-05-29T02:28:00,003][INFO ][o.e.x.m.MlDailyMaintenanceService] triggering scheduled [ML] maintenance tasks
[2019-05-29T02:28:00,004][INFO ][o.e.x.m.a.DeleteExpiredDataAction$TransportAction] [YICKzsA] Deleting expired data

Starting pelias_elasticsearch ... error

Hello. I want to run the portland-metro sample and encountered the following problems:

[root@localhost ~]# git clone https://github.com/pelias/docker.git ~/pelias
[root@localhost ~]# ln -s ~/pelias/pelias /usr/local/bin/pelias
[root@localhost ~]# which pelias
/usr/local/bin/pelias
[root@localhost ~]# cd /root/pelias/projects/portland-metro
[root@localhost portland-metro]# mkdir /tmp/pelias
[root@localhost portland-metro]# cat .env
COMPOSE_PROJECT_NAME=pelias
DATA_DIR=/tmp/pelias/portland-metro
DOCKER_USER=1000
[root@localhost portland-metro]# vim .env
[root@localhost portland-metro]# cat .env
COMPOSE_PROJECT_NAME=pelias
DATA_DIR=/tmp/pelias
DOCKER_USER=1000
[root@localhost portland-metro]# pelias system env

[root@localhost portland-metro]# pelias compose pull
Pulling openstreetmap ... done
Pulling fuzzy-tester ... done
Pulling transit ... done
Pulling polylines ... done
Pulling csv-importer ... done
Pulling api ... done
Pulling elasticsearch ... done
Pulling libpostal ... done
Pulling pip ... done
Pulling openaddresses ... done
Pulling interpolation ... done
Pulling placeholder ... done
Pulling whosonfirst ... done
Pulling schema ... done
[root@localhost portland-metro]# docker images |grep pelias
pelias/openstreetmap master ef975320c34f 13 days ago 648.7 MB
pelias/elasticsearch 5.6.12 8c3bb458a312 13 days ago 755 MB
pelias/csv-importer master e92cb00a3c49 2 weeks ago 597.5 MB
pelias/pip-service master fc0560439394 2 weeks ago 598 MB
pelias/openaddresses master 6e73ee0acfce 2 weeks ago 597.4 MB
pelias/transit master ae17ed664fd4 2 weeks ago 602.2 MB
pelias/polylines master cb617b182f55 2 weeks ago 1.042 GB
pelias/whosonfirst master f8be94100a2e 2 weeks ago 552 MB
pelias/fuzzy-tester master 88fef9ebc9cf 2 weeks ago 567 MB
pelias/schema master e91ac54a49fb 3 weeks ago 505 MB
pelias/libpostal-service latest 46a0d5fdf2df 3 weeks ago 3.182 GB
pelias/api master ea8a3f25d21f 3 weeks ago 533.7 MB
pelias/placeholder master 971bc9ff7c2d 4 weeks ago 559.3 MB
pelias/interpolation master 21f4ca81a05e 3 months ago 3.311 GB

[root@localhost portland-metro]# pelias elastic start
Starting pelias_elasticsearch ... error

ERROR: for pelias_elasticsearch Cannot start service elasticsearch: OCI runtime create failed: container_linux.go:344: starting container process caused "process_linux.go:293: copying bootstrap data to pipe caused "write init-p: broken pipe"": unknown

ERROR: for elasticsearch Cannot start service elasticsearch: OCI runtime create failed: container_linux.go:344: starting container process caused "process_linux.go:293: copying bootstrap data to pipe caused "write init-p: broken pipe"": unknown
ERROR: Encountered errors while bringing up the project.

[root@localhost portland-metro]# docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
[root@localhost portland-metro]# docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1e657d238556 pelias/elasticsearch:5.6.12 "/bin/bash bin/es-do…" 24 minutes ago Created 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp pelias_elasticsearch
[root@localhost portland-metro]# docker-compose ps
Name Command State Ports

pelias_elasticsearch /bin/bash bin/es-docker Exit 128
[root@localhost portland-metro]# docker-compose logs
Attaching to pelias_elasticsearch
[root@localhost portland-metro]#

What's wrong with it? Thanks for any help.

Increase parallelism of importers

We currently run Pelias importers in series in this repository.

This keeps memory and CPU requirements low, and was initially required to avoid fatal errors if Elasticsearch became overloaded.

However, over time we've made improvements and are now pretty confident that, as long as all the importers can fit into memory, an import can finish just fine even on a very CPU-constrained machine.

Running importers in parallel would allow imports to finish much faster where more CPU is available, so we should consider making that at least an option, if not the default, in this project.
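
As a rough sketch, parallel imports could be launched from the shell, assuming per-source subcommands such as pelias import wof exist (analogous to pelias download wof used elsewhere in this document):

# run importers concurrently; all of them must fit in memory at once
pelias import wof &
pelias import oa &
pelias import osm &
wait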

cannot create interpolation dir

Hi, thanks for all your great work on this. I'm running into the following issue when trying to set up a new Pelias instance:

$ pelias prepare interpolation
Starting pelias_libpostal_baseimage ... 
Starting pelias_libpostal_baseimage ... done

> [email protected] build /code/pelias/interpolation
> ./script/build.sh

mkdir: cannot create directory ‘//interpolation’: Permission denied
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] build: `./script/build.sh`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] build script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/pelias/.npm/_logs/2018-09-11T18_50_42_902Z-debug.log

I have write permissions in the DATA_DIR, tried running the command as root, and even tried creating an interpolation dir in / in case it was hardcoded by accident, to no avail.

any ideas? thanks!

`pelias elastic drop` hangs without -f

The drop_schema script from pelias/schema supports a -f/--force-yes flag to drop the schema immediately; if that flag is not passed as a CLI param, it will ask the user to type yes to confirm the schema drop.

Because we run all containers with the -T flag, we cannot actually send interactive text to a docker container once it has started.

Thus, running pelias elastic drop prints text asking for user input to confirm the schema drop but never receives it, causing it to appear to hang forever.

Some ideas to fix this:

  • Modify our pelias binary to always pass the -f flag so the drop happens immediately (see the sketch after this list)
  • Enable TTY allocation for this (or all) containers
  • Add support in pelias/schema to detect when a TTY is not allocated, and fail with a clear error
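
The first idea is a one-line change at the call site, sketched below; it assumes the pelias script forwards extra arguments through to drop_schema:

# pass --force-yes so drop_schema never waits for interactive confirmation
pelias elastic drop --force-yes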

geonames download dir nonexistent

Most of the downloads for planet seem to work OK, but I do see (during pelias download all):

info: [geonames] downloading datafile from: http://download.geonames.org/export/dump/allCountries.zip
/bin/sh: 1: cannot create /data/geonames/allCountries.zip: Directory nonexistent
Geonames download finished with exit code 2

despite the RUN mkdir -p '/data/geonames' line in
https://github.com/pelias/geonames/blob/master/Dockerfile.

The openstreetmap and openaddresses Dockerfiles don't have lines like that, and instead seem to rely on lines like
fs.ensureDir(config.imports.openstreetmap.datapath, (err) in their download_data.js.

Changing the datadir for geonames to /data in pelias.json allows the download to occur:

    "geonames": {
      "datapath": "/data",
      "countryCode": "ALL"
    },

BTW, is there a place to put feature requests (besides /dev/null)? One nice-to-have would be a check for what's already been downloaded, skipping things that have the same modification dates / file sizes, and a report of what failed.

Having WOF download custom data

Hello,

I am trying to create a project that downloads only WOF data from my own URL, to make Pelias geocode only my specific geometries.

Has anyone succeeded at that, or even tried to?

Thank you

pelias import all - log4j error

After

pelias elastic start
pelias elastic wait

I am getting a continually restarting pelias_elasticsearch process, as seen by pelias compose ps. The logs (pelias compose logs) show repeated cases of the following:

elasticsearch_1  | log4j:WARN No appenders could be found for logger (bootstrap).
elasticsearch_1  | log4j:WARN Please initialize the log4j system properly.
elasticsearch_1  | log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
elasticsearch_1  | Exception in thread "main" java.lang.IllegalStateException: Unable to access 'path.scripts' (/usr/share/elasticsearch/config/scripts)
elasticsearch_1  | Likely root cause: java.nio.file.AccessDeniedException: /usr/share/elasticsearch/config/scripts
elasticsearch_1  |      at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
elasticsearch_1  |      at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
elasticsearch_1  |      at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
elasticsearch_1  |      at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
elasticsearch_1  |      at java.nio.file.Files.createDirectory(Files.java:674)
elasticsearch_1  |      at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
elasticsearch_1  |      at java.nio.file.Files.createDirectories(Files.java:767)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Security.ensureDirectoryExists(Security.java:337)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Security.addPath(Security.java:314)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Security.addFilePermissions(Security.java:248)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Security.createPermissions(Security.java:212)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Security.configure(Security.java:118)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Bootstrap.setupSecurity(Bootstrap.java:212)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:183)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:286)
elasticsearch_1  |      at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:45)
elasticsearch_1  | Refer to the log for complete error details.

The only slightly nonstandard thing in my pelias.json file was a port mapping,
elasticsearch.ports = [ "9700:9200", "9800:9300" ], accompanied by ELASTIC_HOST=localhost:9700 in .env so that the elastic status command refers to the correct port. Trying again after changing back to the default values (pelias compose down, pelias elastic start, pelias elastic wait) didn't help matters.
Given the 'access denied' I also tried sudo, but this didn't help either - I imagine the user being denied is the pelias user running the commands inside the container.

pelias download all

IIUC I need to create a /data directory at the root of my server to hold the various downloads. First, I can't seem to change the owner of anything right off the root /; second, putting downloads right at the root (with the directory not being configurable) seems like poor form. I also see several downloads attempting to put things at /data despite my having specified DATA_DIR=/mnt/..../data, e.g. curl -L -X GET -o /data/openaddresses/us-or-city_of_salem1Nu5o42N0ei0F.zip https://results.openaddresses.io/latest/run/us/or/city_of_salem.zip

optimization after first data download

hi,

I was trying to optimize my Pelias installation built from a clone of this repo.
I want to figure out which data in the /data directory is cached or reused in future imports, so I can understand how to optimize disk space.

For example, I understand that the whosonfirst data is a set of GeoJSON files that is subsequently loaded into an SQLite database. So after executing
pelias download wof
can I free up the space used by /data/whosonfirst/data/*? Then, during imports of the other data sources, Pelias will use the SQLite databases for adminLookup, right?
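If that understanding is right, a rough cleanup sketch (assuming imports.whosonfirst.sqlite is enabled in pelias.json, and assuming the SQLite files live under the datapath's sqlite/ subdirectory):

    du -sh /data/whosonfirst/*        # see what is taking the space
    rm -rf /data/whosonfirst/data     # extracted GeoJSON bundle, re-downloadable
    ls -lh /data/whosonfirst/sqlite   # the SQLite databases used for adminLookup stay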

portland-metro project takes a lot of disk space

I am trying to install the portland-metro project as recommended by the instructions:

"We recommend you start with the portland-metro example as a first-time user: it has lower disk and time requirements and can be used to help you familiarize yourself with the process.

Once you have successfully completed a small build you can use this as a base to create your own projects or move on to larger projects."

I have reached the download step, running pelias download all.
It is downloading a 28 GB file from https://dist.whosonfirst.org/sqlite/whosonfirst-data-latest.db.bz2.

Why is that? Isn't it supposed to be lightweight?

I tried skipping downloading and preparing the whosonfirst data, but I got errors later on related to whosonfirst.

Cheers.

EDIT:
Never mind, I just realized the file is supposed to be 6.8 GB compressed and 28 GB uncompressed.
Still pretty big, but not as bad.

connect ECONNREFUSED ip:4400

Hi Team,

I tried making this query : http://localhost:4000/v1/search?text=LL11 6QA

  1. With the api, libpostal and elasticsearch services running, I get a connect ECONNREFUSED 172.18.0.2:4400 error (a quick check for this is sketched after the list).

      {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"LL11 6QA","size":10,"private":false,"focus.point.lat":45.52,"focus.point.lon":-122.67,"lang":{"name":"English","iso6391":"en","iso6393":"eng","defaulted":false},"querySize":20},"errors":["connect ECONNREFUSED 172.18.0.2:4400"],"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1542550520313},"type":"FeatureCollection","features":[]}
    
  2. When I kill the elasticsearch service, that error isn't thrown and the query is parsed by libpostal.

        {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"LL11 6QA","size":10,"private":false,"focus.point.lat":45.52,"focus.point.lon":-122.67,"lang":{"name":"English","iso6391":"en","iso6393":"eng","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"postalcode":"ll11 6qa"}},"errors":["No Living connections"],"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1542550596264},"type":"FeatureCollection","features":[]}
    
  3. After killing the elasticsearch service, I alter the query by removing the space in it; now the parser falls back to addressit. Is this intentional?

       {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"LL116QA","size":10,"private":false,"focus.point.lat":45.52,"focus.point.lon":-122.67,"lang":{"name":"English","iso6391":"en","iso6393":"eng","defaulted":false},"querySize":20,"parser":"addressit","parsed_text":{}},"errors":["No Living connections"],"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1542550613545},"type":"FeatureCollection","features":[]}
    

north america example includes tiger files for only a single county

I'm trying to build a pelias docker instance based on the north american example, but encountering several issues. First, the config file specifies only a single county for some reason. I'd be happy to do a PR to include every state, but if you try to download all of them using pelias download tiger, the census API will hit a rate limit and throw

Error: read ECONNRESET
    at _errnoException (util.js:992:11)
    at TCP.onread (net.js:618:25)

This issue also happens repeatedly: if you try to run pelias download oa again, you get

error: [download] Failed to download data message=Command failed: unzip -o -qq -d /data/openaddresses /data/openaddresses/us-al-baldwin1gvIfbRknrmEj.zip
error:  cannot delete old /data/openaddresses/README.txt
       No such file or directory
, stack=Error: Command failed: unzip -o -qq -d /data/openaddresses /data/openaddresses/us-al-baldwin1gvIfbRknrmEj.zip
error:  cannot delete old /data/openaddresses/README.txt
       No such file or directory

Obviously you can delete the offending file and download again, but it's a lot of trial and error (tiger files, on the other hand, seem to overwrite old versions just fine). Any thoughts on whether some of those errors could be warnings instead of crashing the build?

user and group in .env

.env:
COMPOSE_PROJECT_NAME=pelias
DATA_DIR=/data/pelias-docker-compose/
DOCKER_USER=1003
ENABLE_GEONAMES=true

How can I specify the groupid of my docker user?
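One thing to try, assuming the pelias script forwards DOCKER_USER verbatim to Docker's --user flag, which accepts the standard "uid:gid" form:

    # .env - hypothetical, verify that the wrapper passes this through unmodified
    DOCKER_USER=1003:1003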

Incorrect path to pip service

Hey team, this new docker repo is great! Thanks for the awesome work.

Small issue with the sample projects and the pip service: the service URL set in the config should correspond to whatever the service is named in the docker-compose.yml file.

Unfortunately it's pip in the yml and pip-service in the config, leading to errors in the API when coarse reverse is hit. Changing the config to pip fixes the issue.
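For reference, a sketch of the corrected config (the exact nesting and port should be checked against the project's pelias.json; 4200 is assumed here as the usual pip service port):

    "api": {
      "services": {
        "pip": { "url": "http://pip:4200" }
      }
    }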

Cheers! 👋

Tiger, Transit, etc. actions should do nothing when not configured

In the project for France (PR #4), we don't use transit.
However, when running pelias import all, the cli still tries to import the transit data.
This is not a big problem, as it only prints an error message and everything else still works fine.

Another, bigger problem is when using pelias download all with no configuration for interpolation: the whole tiger dataset is downloaded. The workaround is to download everything individually, but it would still be nice to be able to run download all.

error network pelias_default is ambiguous

Hi. I am getting started with Pelias and have hit something I don't know how to work with - 'network pelias_default is ambiguous':

root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# docker run hello-world

runs fine. A first run of pelias compose pull downloaded a bunch of stuff, then:

root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias compose pull
ERROR: 5 matches found based on name: network pelias_default is ambiguous
root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias compose ps
ERROR: 5 matches found based on name: network pelias_default is ambiguous
root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias test run 
ERROR: 5 matches found based on name: network pelias_default is ambiguous
root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias system check
root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias system env
...
USERNAME=root
SUDO_COMMAND=/bin/su
USER=root
PWD=/home/jeremy/pelias_docker/docker/projects/portland-metro
COMPOSE_PROJECT_NAME=pelias
HOME=/root
SUDO_USER=jeremy
DOCKER_USER=1000
SUDO_UID=1000
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias system update
Already up to date.

any help would be appreciated.
This is with Docker version 18.06.1-ce, build e68fc7a, Ubuntu 18.04.1 LTS
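The "5 matches found based on name" error means several Docker networks ended up with the same pelias_default name. One way out, assuming the duplicates are stale networks with no running containers attached:

    docker network ls | grep pelias_default   # list the conflicting networks and their IDs
    docker network rm <network-id>            # remove each duplicate by its ID
    docker network prune                      # or: remove all unused networks in one go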

elastic search not starting

Following pelias compose pull and pelias elastic start, the pelias elastic wait step is running into some problem:

jeremy@jeremy:$ pelias elastic start
pelias_elasticsearch is up-to-date
jeremy@jeremy:$ pelias elastic wait
waiting for elasticsearch service to come up
...........................
Elasticsearch did not come up, check configuration

jeremy@jeremy:$ pelias elastic status
000
jeremy@jeremy:$ pelias test run 
{ Error: getaddrinfo ENOTFOUND api api:4000
    at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:57:26)
  errno: 'ENOTFOUND',
  code: 'ENOTFOUND',
  syscall: 'getaddrinfo',
  hostname: 'api',
...

and I am not sure what configuration I am supposed to be checking; the data dir exists and is writable.
The elastic status of 000 is (I guess) indicating no Elasticsearch.

the logs show

elasticsearch_1  | [2018-11-26T14:33:11,234][INFO ][o.e.n.Node               ] [] initializing ...
elasticsearch_1  | [2018-11-26T14:33:11,263][WARN ][o.e.b.ElasticsearchUncaughtExceptionHandler] [] uncaught exception in thread [main]
elasticsearch_1  | org.elasticsearch.bootstrap.StartupException: java.lang.IllegalStateException: Failed to create node environment

...

java.nio.file.AccessDeniedException: /usr/share/elasticsearch/data/nodes    
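That AccessDeniedException suggests the mounted data directory isn't writable by the container user. A likely fix, assuming the container runs as the uid set in .env's DOCKER_USER (1000 by default) and mounts $DATA_DIR/elasticsearch over /usr/share/elasticsearch/data:

    mkdir -p "$DATA_DIR/elasticsearch"
    sudo chown -R 1000:1000 "$DATA_DIR/elasticsearch"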

DATA_DIR ignored when importing data for openstreetmap

I thought with #93 my problems regarding importing of data would be solved, but it isn't:
Although DATA_DIR is set correctly, when doing pelias prepare all the data is always searched for in /data/openstreetmap, as if this /data directory were hardcoded somewhere.
Maybe I misunderstand the purpose of DATA_DIR completely.

This directory does not work for me because the storage has to reside in another location. Symbolic links don't come to the rescue either, because they are not followed.
What am I missing?
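One plausible reading, worth verifying against the pelias/docker docs: DATA_DIR only controls the host side of the bind mount, and inside the containers that directory always appears as /data, so /data paths in logs and in pelias.json are expected even with a custom DATA_DIR:

    # .env - host-side location; the containers see this directory as /data
    DATA_DIR=/mnt/storage/pelias
    # pelias.json keeps container-side paths, e.g. "datapath": "/data/openstreetmap"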

Prepare all script fails to open tiger shapefile

pelias prepare all
converting /data/openstreetmap/los-angeles_california.osm.pbf to /data/polylines/extract.0sv
Creating extract at /data/placeholder/wof.extract
./docker_extract.sh: line 21:    10 Killed                  pbf streets "${PBF_FILE}" >> /data/polylines/extract.0sv
Done!
import...
populate fts...
optimize...
close...
Done!
- importing polylines
- archiving street database
- conflating openaddresses
Sat Jul 6 03:06:35 UTC 2019 /data/openaddresses/us/ca/los_angeles.csv
- conflating openstreetmap
- conflating tiger
Sat Jul 6 03:24:56 UTC 2019 /data/tiger//shapefiles/tl_2016_06037_addrfeat.shx
ERROR 4: Failed to read all values for 279128 records in .shx file: Success.
FAILURE:
Unable to open datasource `/data/tiger//shapefiles/tl_2016_06037_addrfeat.shx' with the following drivers.
  -> `PCIDSK'
  -> `netCDF'
  -> `JP2OpenJPEG'
  -> `PDF'
  -> `ESRI Shapefile'
  -> `MapInfo File'
  -> `UK .NTF'
  -> `OGR_SDTS'
  -> `S57'
  -> `DGN'
  -> `OGR_VRT'
  -> `REC'
  -> `Memory'
  -> `BNA'
  -> `CSV'
  -> `NAS'
  -> `GML'
  -> `GPX'
  -> `LIBKML'
  -> `KML'
  -> `GeoJSON'
  -> `Interlis 1'
  -> `Interlis 2'
  -> `OGR_GMT'
  -> `GPKG'
  -> `SQLite'
  -> `OGR_DODS'
  -> `ODBC'
  -> `WAsP'
  -> `PGeo'
  -> `MSSQLSpatial'
  -> `OGR_OGDI'
  -> `PostgreSQL'
  -> `MySQL'
  -> `OpenFileGDB'
  -> `XPlane'
  -> `DXF'
  -> `CAD'
  -> `Geoconcept'
  -> `GeoRSS'
  -> `GPSTrackMaker'
  -> `VFK'
  -> `PGDUMP'
  -> `OSM'
  -> `GPSBabel'
  -> `SUA'
  -> `OpenAir'
  -> `OGR_PDS'
  -> `WFS'
  -> `SOSI'
  -> `HTF'
  -> `AeronavFAA'
  -> `Geomedia'
  -> `EDIGEO'
  -> `GFT'
  -> `SVG'
  -> `CouchDB'
  -> `Cloudant'
  -> `Idrisi'
  -> `ARCGEN'
  -> `SEGUKOOA'
  -> `SEGY'
  -> `XLS'
  -> `ODS'
  -> `XLSX'
  -> `ElasticSearch'
  -> `Walk'
  -> `Carto'
  -> `AmigoCloud'
  -> `SXF'
  -> `Selafin'
  -> `JML'
  -> `PLSCENES'
  -> `CSW'
  -> `VDV'
  -> `GMLAS'
  -> `TIGER'
  -> `AVCBin'
  -> `AVCE00'
  -> `HTTP'

Unsupported version in docker-compose.yml

I'm trying to run the script included here:
https://github.com/pelias/docker#generic-build-workflow

+ pelias compose pull
ERROR: Version in "./docker-compose.yml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a version of "2" (or "2.0") and place your service definitions under the `services` key, or omit the `version` key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/

docker-compose --version
docker-compose version 1.8.0, build unknown
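docker-compose 1.8 predates the Compose file format these projects use, so upgrading should resolve the error. One common approach (installation methods vary by platform; check the Docker docs for the current recommendation):

    sudo pip install --upgrade docker-compose
    docker-compose --version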

elasticsearch create error

I'm trying to run the portland example project but can't get Elasticsearch to work: it seems to start fine, but then it never comes up. Any pointers?

qmap@pelias:/opt/pelias/projects/portland-metro$ sudo pelias elastic start
Recreating pelias_elasticsearch ... 
Recreating pelias_elasticsearch ... done
qmap@pelias:/opt/pelias/projects/portland-metro$ sudo pelias elastic wait
waiting for elasticsearch service to come up
..............................
Elasticsearch did not come up, check configuration
qmap@pelias:/opt/pelias/projects/portland-metro$ sudo pelias elastic create
Elasticsearch ERROR: 2018-11-21T16:36:32Z
  Error: Request error, retrying
  GET http://elasticsearch:9200/_nodes => getaddrinfo ENOTFOUND elasticsearch elasticsearch:9200
      at Log.error (/code/pelias/schema/node_modules/elasticsearch/src/lib/log.js:226:56)
      at checkRespForFailure (/code/pelias/schema/node_modules/elasticsearch/src/lib/transport.js:259:18)
      at HttpConnector.<anonymous> (/code/pelias/schema/node_modules/elasticsearch/src/lib/connectors/http.js:163:7)
      at ClientRequest.wrapper (/code/pelias/schema/node_modules/lodash/lodash.js:4935:19)
      at ClientRequest.emit (events.js:182:13)
      at Socket.socketErrorListener (_http_client.js:391:9)
      at Socket.emit (events.js:182:13)
      at emitErrorNT (internal/streams/destroy.js:82:8)
      at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
      at process._tickCallback (internal/process/next_tick.js:63:19)

Elasticsearch WARNING: 2018-11-21T16:36:32Z
  Unable to revive connection: http://elasticsearch:9200/

Elasticsearch WARNING: 2018-11-21T16:36:32Z
  No living connections

{ Error: No Living connections
    at sendReqWithConnection (/code/pelias/schema/node_modules/elasticsearch/src/lib/transport.js:226:15)
    at next (/code/pelias/schema/node_modules/elasticsearch/src/lib/connection_pool.js:214:7)
    at process._tickCallback (internal/process/next_tick.js:61:11) message: 'No Living connections' }
please install mandatory plugins before continuing.

Support a simple web interface

It would be really handy for users of Pelias to be able to experiment with geocoding on a map when using the docker setup. Being able to actually use an autocomplete interface is especially nice.

There is a very basic webmap using nextzen/leaflet-geocoder in the South Africa project, which we should polish a tiny bit and bring into all the projects.

Some things to do:

  • Test it out a bit more
  • Put it in all projects
  • Update documentation to mention it, and perhaps even print a message like "you can now view a webmap with a search box at http://localhost:3000" or something similar

(Hat tip to @xamanu for the request to get this going)

Switch projects off schema:portland-synonyms

The portland-synonyms branch of schema is no longer required since the code was merged to master via another branch.
We need to test it out and switch the tags back to master.

Errors and failed tests following generic build workflow with portland-metro example

First, I'll start by saying that in spite of the failed tests and copious errors, Pelias appears to be at least working (the example queries return results). However, I'm not sure what to make of these errors:

There were lots of these:

error: [dbclient] invalid resp from es bulk index operation
info: [dbclient] retrying batch [500]
error: [dbclient] esclient error message=Request Timeout after 120000ms, status=408, displayName=RequestTimeout, stack=Error: Request Timeout after 120000ms
    at /code/pelias/openstreetmap/node_modules/elasticsearch/src/lib/transport.js:355:15
    at Timeout.<anonymous> (/code/pelias/openstreetmap/node_modules/elasticsearch/src/lib/transport.js:384:7)
    at ontimeout (timers.js:436:11)
    at tryOnTimeout (timers.js:300:5)
    at listOnTimeout (timers.js:263:5)
    at Timer.processTimers (timers.js:223:10)

There were thousands of these:

error: [dbclient] [429] type=es_rejected_execution_exception, reason=rejected execution of org.elasticsearch.transport.TransportService$7@50b7cdab on EsThreadPoolExecutor[bulk, queue capacity = 200, org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor@e1895bd[Running, pool size = 4, active threads = 4, queued tasks = 200, completed tasks = 2679]]

This came up only once:

warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-borough-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-localadmin-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-macrocounty-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-macroregion-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-dependency-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-empire-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-marinearea-latest.csv
warn: [wof-pip-service:master] unable to locate /data/whosonfirst/meta/whosonfirst-data-ocean-latest.csv

Running the tests, the results were:
Aggregate test results
Pass: 355
Improvements: 0
Fail: 41
Placeholders: 0
Regressions: 80
Took 12633ms
Test success rate 83.19%

Is all of this normal? Do I actually have a working geocoder for portland at least?

pelias download wof doesn't seem to run on its own

pelias download wof fails on curl when imports.whosonfirst.sqlite is true

deploy@dap-jupyter01:/mnt/open_street_map/pelias_docker/docker/projects/planet$ pelias download wof
child_process.js:637
    throw err;
    ^

Error: Command failed: curl --silent -L https://dist.whosonfirst.org/sqlite/inventory.json
    at checkExecSyncError (child_process.js:616:11)
    at Object.execFileSync (child_process.js:634:13)
    at module.exports (/code/pelias/whosonfirst/node_modules/download-file-sync/index.js:3:6)
    at generateSQLites (/code/pelias/whosonfirst/utils/download_sqlite_all.js:26:32)
    at Object.download (/code/pelias/whosonfirst/utils/download_sqlite_all.js:63:29)
    at Object.<anonymous> (/code/pelias/whosonfirst/utils/download_data.js:66:38)
    at Module._compile (internal/modules/cjs/loader.js:689:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:700:10)
    at Module.load (internal/modules/cjs/loader.js:599:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:538:12)

The command curl --silent -L https://dist.whosonfirst.org/sqlite/inventory.json runs fine at the CLI.

permissions issues running `pelias compose up`

I was trying to set up a Pelias geocoding server using docker and docker-compose, following the guidelines in https://github.com/pelias/docker/blob/master/README.md. After downloading all the docker images, I used the command pelias compose up to start the services, which shows:

[root@192 portland-metro]# pelias compose up
Creating network "pelias_default" with driver "bridge"
Creating pelias_openstreetmap ... done
Creating pelias_openaddresses ...
Creating pelias_api ...
Creating pelias_pip-service ...
Creating pelias_polylines ...
Creating pelias_elasticsearch ...
Creating pelias_whosonfirst ...
Creating pelias_interpolation ...
Creating pelias_fuzzy_tester ...
Creating pelias_openstreetmap ...
Creating pelias_transit ...
Creating pelias_schema ...
Creating pelias_placeholder ...

Then I checked the status of the services using pelias compose ps, which shows:

[root@192 portland-metro]# pelias compose ps
        Name                     Command                  State            Ports
---------------------------------------------------------------------------------
pelias_api             ./bin/start                     Restarting
pelias_elasticsearch   /docker-entrypoint.sh elas ...  Restarting
pelias_fuzzy_tester    ./bin/fuzzy-tester --help       Exit 1
pelias_interpolation   ./interpolate server /data ...  Restarting
pelias_libpostal       ./bin/wof-libpostal-server ...  Up 0.0.0.0:4400->4400/tcp
pelias_openaddresses   /bin/bash                       Exit 0
pelias_openstreetmap   /bin/bash                       Exit 0
pelias_pip-service     ./bin/start                     Restarting
pelias_placeholder     ./cmd/server.sh                 Restarting
pelias_polylines       /bin/bash                       Exit 0
pelias_schema          /bin/bash                       Exit 0
pelias_transit         /bin/bash                       Exit 0
pelias_whosonfirst     /bin/bash                       Exit 0

Then I checked the logs with pelias compose logs, which gives the list of errors in the attachment:
logs_output.txt

I can't trace the problem - could somebody please help?

can't start elastic search on CentOS Linux release 7.5.1804 (Core)

[ajith@barinlab portland-metro]$ whoami
ajith
[ajith@barinlab portland-metro]$ id -u ajith
1000
[ajith@barinlab portland-metro]$ id -g ajith
1000

The userid is the same as specified in the .env file:

[ajith@barinlab portland-metro]$ cat .env 
COMPOSE_PROJECT_NAME=pelias
DOCKER_USER=1000
DATA_DIR=/data
[ajith@barinlab portland-metro]$ 

I executed the following commands in the given order to set up a pelias geocoder.

[ajith@barinlab portland-metro]$ sudo mkdir /code /data

[ajith@barinlab portland-metro]$ sudo chown 1000:1000 /code /data

[ajith@barinlab portland-metro]$ cd /code

[ajith@barinlab portland-metro]$ git clone https://github.com/pelias/docker.git

[ajith@barinlab portland-metro]$ cd docker

[ajith@barinlab portland-metro]$ sudo ln -sf "$(pwd)/pelias" /usr/local/bin/pelias

[ajith@barinlab portland-metro]$ cd projects/portland-metro

[ajith@barinlab portland-metro]$ sed -i '/DATA_DIR/d' .env

[ajith@barinlab portland-metro]$ echo 'DATA_DIR=/data' >> .env

[ajith@barinlab portland-metro]$ mkdir -p /data/elasticsearch/data

[ajith@barinlab portland-metro]$ mkdir -p /usr/share/elasticsearch/data

[ajith@barinlab portland-metro]$ pelias compose pull

[ajith@barinlab portland-metro]$ pelias elastic start

[ajith@barinlab portland-metro]$ pelias elastic wait
waiting for elasticsearch service to come up
..........................................^C
[ajith@barinlab portland-metro]$ 

[ajith@barinlab portland-metro]$ pelias compose ps
        Name                   Command             State      Ports
-------------------------------------------------------------------
pelias_elasticsearch   /bin/bash bin/es-docker   Restarting        
[ajith@barinlab portland-metro]$ 

[ajith@barinlab portland-metro]$ pelias compose logs
Attaching to pelias_elasticsearch
elasticsearch_1  | [2018-11-02T17:39:55,876][WARN ][o.e.b.ElasticsearchUncaughtExceptionHandler] [] uncaught exception in thread [main]
elasticsearch_1  | org.elasticsearch.bootstrap.StartupException: java.lang.IllegalStateException: Unable to access 'path.data' (/usr/share/elasticsearch/data)
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:136) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:123) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:70) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:134) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.cli.Command.main(Command.java:90) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:91) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:84) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | Caused by: java.lang.IllegalStateException: Unable to access 'path.data' (/usr/share/elasticsearch/data)
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Security.addPath(Security.java:450) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Security.addFilePermissions(Security.java:291) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Security.createPermissions(Security.java:246) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Security.configure(Security.java:119) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:228) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:342) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:132) ~[elasticsearch-5.6.12.jar:5.6.12]
elasticsearch_1  | 	... 6 more
.....
...
..

What step did I miss? Why can't I start Elasticsearch? Please help...

Originally posted by @ajithcofficial in #33 (comment)

error - no living connections, pls install mandatory plugins before continuing

I am getting the mystery error below - any help appreciated. For instance, what might the 'mandatory plugins' be, and why wouldn't they already be in the container, whatever they are...

root@jeremy-TECRA-Z40-C-pipl:/home/jeremy/pelias_docker/docker/projects/portland-metro# pelias elastic create
Elasticsearch ERROR: 2018-11-26T12:29:02Z
  Error: Request error, retrying
  GET http://elasticsearch:9200/_nodes => getaddrinfo ENOTFOUND elasticsearch elasticsearch:9200
      at Log.error (/code/pelias/schema/node_modules/elasticsearch/src/lib/log.js:226:56)
      at checkRespForFailure (/code/pelias/schema/node_modules/elasticsearch/src/lib/transport.js:259:18)
      at HttpConnector.<anonymous> (/code/pelias/schema/node_modules/elasticsearch/src/lib/connectors/http.js:163:7)
      at ClientRequest.wrapper (/code/pelias/schema/node_modules/lodash/lodash.js:4935:19)
      at ClientRequest.emit (events.js:182:13)
      at Socket.socketErrorListener (_http_client.js:391:9)
      at Socket.emit (events.js:182:13)
      at emitErrorNT (internal/streams/destroy.js:82:8)
      at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
      at process._tickCallback (internal/process/next_tick.js:63:19)

Elasticsearch WARNING: 2018-11-26T12:29:03Z
  Unable to revive connection: http://elasticsearch:9200/

Elasticsearch WARNING: 2018-11-26T12:29:03Z
  No living connections

{ Error: No Living connections
    at sendReqWithConnection (/code/pelias/schema/node_modules/elasticsearch/src/lib/transport.js:226:15)
    at next (/code/pelias/schema/node_modules/elasticsearch/src/lib/connection_pool.js:214:7)
    at process._tickCallback (internal/process/next_tick.js:61:11) message: 'No Living connections' }
please install mandatory plugins before continuing.

Confused about prerequisites?

Based on the readme there is a need for docker, but then there is also a need for macOS Homebrew libraries? How are these two things connected? I thought that with docker there would be no need for coreutils.

Can this be explained, or can a clear step-by-step tutorial be put in here?

memory issue

I hit an out-of-memory error on a 64 GB RAM AWS machine during 'prepare all' for planet. The process continues, so I am not sure how that impacts the geocoder, if at all:

+ pelias prepare all
Creating extract at /data/placeholder/wof.extract
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
/data/openstreetmap/planet-latest.osm.pbf is very large.
You will likely experience memory issues working with large extracts like this.
We strongly recommend using Valhalla to produce extracts for large PBF extracts.
see: https://github.com/pelias/polylines#download!data
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
converting /data/openstreetmap/planet-latest.osm.pbf to /data/polylines/extract.0sv
2019/03/29 17:13:17 found 0 refs for way 23687328
2019/03/29 17:16:04 found 0 refs for way 37090295
2019/03/29 17:16:04 found 0 refs for way 37147040
2019/03/29 17:17:02 found 0 refs for way 47725798
2019/03/29 17:43:00 found 0 refs for way 314692067
Done!
import...
2019/03/29 17:54:32 found 0 refs for way 384162726
populate fts...
2019/03/29 17:58:00 found 0 refs for way 405118396
optimize...
close...
Done!
2019/03/29 18:01:56 found 0 refs for way 435366836
2019/03/29 18:07:10 found 0 refs for way 479715347
2019/03/29 18:27:13 found 0 refs for way 620069036
2019/03/29 18:27:13 found 0 refs for way 620069065
2019/03/29 18:27:13 found 0 refs for way 620069094
2019/03/29 18:27:13 found 0 refs for way 620069190
fatal error: runtime: out of memory
runtime stack:
runtime.throw(0x8577c0, 0x16)
        /usr/local/go/src/runtime/panic.go:619 +0x81
runtime.sysMap(0xd196ae0000, 0x100000, 0x11fb800, 0xc87c78)
        /usr/local/go/src/runtime/mem_linux.go:216 +0x20a
runtime.(*mheap).sysAlloc(0xc6eae0, 0x100000, 0x7fa4cb174698)
        /usr/local/go/src/runtime/malloc.go:470 +0xd4
runtime.(*mheap).grow(0xc6eae0, 0x2, 0x0)
        /usr/local/go/src/runtime/mheap.go:907 +0x60
runtime.(*mheap).allocSpanLocked(0xc6eae0, 0x2, 0xc87c88, 0x7fa4cb174600)
        /usr/local/go/src/runtime/mheap.go:820 +0x301
runtime.(*mheap).alloc_m(0xc6eae0, 0x2, 0xc60049, 0xc7dbfb8fe8)
        /usr/local/go/src/runtime/mheap.go:686 +0x118
runtime.(*mheap).alloc.func1()
        /usr/local/go/src/runtime/mheap.go:753 +0x4d
runtime.(*mheap).alloc(0xc6eae0, 0x2, 0x7fa4cb010049, 0x7fa4cb174600)
        /usr/local/go/src/runtime/mheap.go:752 +0x8a
runtime.(*mcentral).grow(0xc71070, 0x0)
        /usr/local/go/src/runtime/mcentral.go:232 +0x94
runtime.(*mcentral).cacheSpan(0xc71070, 0xffffffffffffffff)
        /usr/local/go/src/runtime/mcentral.go:106 +0x2e4
runtime.(*mcache).refill(0x7fa580eb3000, 0xc7dbfb8f49)
        /usr/local/go/src/runtime/mcache.go:123 +0x9c

BTW, prepare all consistently manages to somehow munge up the terminal display, preventing newlines from returning to the beginning of the next line - that's happened on two Debian jessie machines and one Ubuntu 18 machine. Most of the output above had a bunch of stray spaces everywhere as a result.

In any case, I suppose the open questions are:

  • How much memory is needed, at minimum, for planet?
  • Do these OOM errors cause problems down the line?

No permission to create some data/* directories when using example build script

I tried using the example build script in the Git repo readme:

#!/bin/bash
set -x

# create directories
mkdir /code /data

# clone repo
cd /code
git clone https://github.com/pelias/docker.git
cd docker

# install pelias script
ln -s "$(pwd)/pelias" /usr/local/bin/pelias

# cwd
cd projects/portland-metro

# configure environment
sed -i '/DATA_DIR/d' .env
echo 'DATA_DIR=/data' >> .env

# run build
pelias compose pull
pelias elastic start
pelias elastic wait
pelias elastic create
pelias download all
pelias prepare all
pelias import all
pelias compose up

# optionally run tests
pelias test run

I put this into a script called build.sh in root's home directory and used chmod +x build.sh to make it executable.

But even though I was logged in as root and had created the /data directory, the script was only able to create a few of the data/* directories (one of them was elasticsearch) before running into errors indicating no permission to create the others. This caused the script to fail right before it started downloading the address data:

root@docker-s-2vcpu-4gb-nyc1-01:~# ./build.sh 
+ mkdir /code /data
mkdir: cannot create directory ‘/data’: File exists
+ cd /code
+ git clone https://github.com/pelias/docker.git
Cloning into 'docker'...
remote: Enumerating objects: 24, done.
remote: Counting objects: 100% (24/24), done.
remote: Compressing objects: 100% (20/20), done.
remote: Total 595 (delta 11), reused 11 (delta 4), pack-reused 571
Receiving objects: 100% (595/595), 213.91 KiB | 10.70 MiB/s, done.
Resolving deltas: 100% (351/351), done.
+ cd docker
++ pwd
+ ln -s /code/docker/pelias /usr/local/bin/pelias
+ cd projects/portland-metro
+ sed -i /DATA_DIR/d .env
+ echo DATA_DIR=/data
+ pelias compose pull
Pulling libpostal     ... done
Pulling schema        ... done
Pulling api           ... done
Pulling placeholder   ... done
Pulling whosonfirst   ... done
Pulling openstreetmap ... done
Pulling openaddresses ... done
Pulling transit       ... done
Pulling polylines     ... done
Pulling interpolation ... done
Pulling pip           ... done
Pulling elasticsearch ... done
Pulling fuzzy-tester  ... done
+ pelias elastic start
Creating network "pelias_default" with driver "bridge"
Creating pelias_elasticsearch ... done
+ pelias elastic wait
waiting for elasticsearch service to come up
....
+ pelias elastic create

--------------
 create index 
--------------

[put mapping] 	 pelias { acknowledged: true } 

+ pelias download all
wget http://biketownpdx.socialbicycles.com/opendata/station_information.json to /data/transit/BIKETOWN-hubs.json
Could not use "nc", falling back to slower node.js method for sync requests.
{ Error: ENOENT: no such file or directory, open '/data/transit/BIKETOWN-hubs.json'
    at Object.fs.openSync (fs.js:646:18)
    at Object.fs.writeFileSync (fs.js:1299:33)
    at downloadFile (/code/pelias/transit/lib/prep_data.js:83:10)
    at /code/pelias/transit/lib/prep_data.js:35:17
    at Array.forEach (<anonymous>)
    at Object.<anonymous> (/code/pelias/transit/lib/prep_data.js:25:21)
    at Module._compile (module.js:652:30)
    at Object.Module._extensions..js (module.js:663:10)
    at Module.load (module.js:565:32)
    at tryModuleLoad (module.js:505:12)
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '/data/transit/BIKETOWN-hubs.json' }
Had an issue obtaining new file /data/transit/BIKETOWN-hubs.json
wget http://developer.trimet.org/schedule/gtfs.zip to /data/transit/TRIMET-stops.txt
info: [openaddresses-download] Attempting to download selected data files: us/or/portland_metro.csv,us/or/city_of_salem.csv,us/or/marion_and_polk.csv,us/or/marion.csv,us/or/hood_river.csv,us/wa/city_of_richland.csv,us/wa/clark.csv
error: [openaddresses-download] error making directory /data/openaddresses message=EACCES: permission denied, mkdir '/data/openaddresses', stack=Error: EACCES: permission denied, mkdir '/data/openaddresses', errno=-13, code=EACCES, syscall=mkdir, path=/data/openaddresses
error: [download] Failed to download data message=EACCES: permission denied, mkdir '/data/openaddresses', stack=Error: EACCES: permission denied, mkdir '/data/openaddresses', errno=-13, code=EACCES, syscall=mkdir, path=/data/openaddresses
/code/pelias/interpolation/node_modules/fs-extra/lib/mkdirs/mkdirs-sync.js:45
        throw err0
        ^

Error: EACCES: permission denied, mkdir '/data/tiger'
    at Object.fs.mkdirSync (fs.js:885:18)
    at mkdirsSync (/code/pelias/interpolation/node_modules/fs-extra/lib/mkdirs/mkdirs-sync.js:31:9)
    at Object.mkdirsSync (/code/pelias/interpolation/node_modules/fs-extra/lib/mkdirs/mkdirs-sync.js:36:14)
    at downloadFilteredFiles (/code/pelias/interpolation/script/js/update_tiger.js:99:6)
    at /code/pelias/interpolation/node_modules/async/dist/async.js:3880:24
    at replenish (/code/pelias/interpolation/node_modules/async/dist/async.js:1011:17)
    at iterateeCallback (/code/pelias/interpolation/node_modules/async/dist/async.js:995:17)
    at /code/pelias/interpolation/node_modules/async/dist/async.js:969:16
    at /code/pelias/interpolation/node_modules/async/dist/async.js:3885:13
    at context.ftp.list (/code/pelias/interpolation/script/js/update_tiger.js:91:5)
/code/pelias/whosonfirst/node_modules/fs-extra/lib/mkdirs/mkdirs-sync.js:45
        throw err0
        ^

Error: EACCES: permission denied, mkdir '/data/whosonfirst'
    at Object.fs.mkdirSync (fs.js:885:18)
    at mkdirsSync (/code/pelias/whosonfirst/node_modules/fs-extra/lib/mkdirs/mkdirs-sync.js:31:9)
    at Object.mkdirsSync (/code/pelias/whosonfirst/node_modules/fs-extra/lib/mkdirs/mkdirs-sync.js:36:14)
    at download (/code/pelias/whosonfirst/utils/sqlite_download.js:15:6)
    at Object.<anonymous> (/code/pelias/whosonfirst/utils/download_data.js:15:3)
    at Module._compile (module.js:652:30)
    at Object.Module._extensions..js (module.js:663:10)
    at Module.load (module.js:565:32)
    at tryModuleLoad (module.js:505:12)
    at Function.Module._load (module.js:497:3)
info: [openstreetmap-download] Downloading sources: https://s3.amazonaws.com/metro-extracts.nextzen.org/portland_oregon.osm.pbf
error: [openstreetmap-download] error making directory /data/openstreetmap message=EACCES: permission denied, mkdir '/data/openstreetmap', stack=Error: EACCES: permission denied, mkdir '/data/openstreetmap', errno=-13, code=EACCES, syscall=mkdir, path=/data/openstreetmap
error: [openstreetmap-download] Failed to download data message=EACCES: permission denied, mkdir '/data/openstreetmap', stack=Error: EACCES: permission denied, mkdir '/data/openstreetmap', errno=-13, code=EACCES, syscall=mkdir, path=/data/openstreetmap

At this point, it's running, but appears to error out when a request comes in:

~ > curl '159.89.127.11:4000/v1/search?text=65%20Front%20St.%20W.%20Toronto%20Ontario%20Canada' | jq '.'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   486  100   486    0     0   4909      0 --:--:-- --:--:-- --:--:--  4909
{
  "geocoding": {
    "version": "0.2",
    "attribution": "http://159.89.127.11:4000/attribution",
    "query": {
      "text": "65 Front St. W. Toronto Ontario Canada",
      "size": 10,
      "private": false,
      "focus.point.lat": 45.52,
      "focus.point.lon": -122.67,
      "lang": {
        "name": "English",
        "iso6391": "en",
        "iso6393": "eng",
        "defaulted": true
      },
      "querySize": 20
    },
    "errors": [
      "connect ECONNREFUSED 172.18.0.4:4400"
    ],
    "engine": {
      "name": "Pelias",
      "author": "Mapzen",
      "version": "1.0"
    },
    "timestamp": 1540156052358
  },
  "type": "FeatureCollection",
  "features": []
}
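The EACCES failures above point at ownership: a plausible remedy, assuming the importer containers run as the uid in DOCKER_USER (1000 by default) while /data was created by root:

    sudo chown -R 1000:1000 /data
    pelias download all    # then re-run the downloads that failed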

invalid regex test

During pelias import all (or maybe the test run) I hit some 'invalid regex test' errors, as below. I'm also looking for guidance on whether the number of misses vs. hits should look so skewed towards misses.

...
[pbf2json]: 2019/04/09 15:55:03 denormalize failed for way: 679144706 node not found: 6359467361                                                                                                                                                                                                                    
...
{"calls":33955360,"hits":325910,"misses":33629450}
info: [admin-lookup:worker] region worker process exiting, stats: {"calls":33629450,"hits":33035765,"misses":593685}
info: [admin-lookup:worker] localadmin worker process exiting, stats: {"calls":59668214,"hits":11729866,"misses":47938348}
info: [admin-lookup:worker] empire worker process exiting, stats: {"calls":119193,"hits":0,"misses":119193}
info: [admin-lookup:worker] ocean worker process exiting, stats: {"calls":34409,"hits":17605,"misses":16804}
info: [admin-lookup:worker] dependency worker process exiting, stats: {"calls":593685,"hits":22009,"misses":571676}
info: [admin-lookup:worker] neighbourhood worker process exiting, stats: {"calls":100315152,"hits":13602598,"misses":86712554}
info: [admin-lookup:worker] borough worker process exiting, stats: {"calls":100315152,"hits":3206852,"misses":97108300}
info: [admin-lookup:worker] locality worker process exiting, stats: {"calls":97108300,"hits":37440086,"misses":59668214}
info: [admin-lookup:worker] country worker process exiting, stats: {"calls":571676,"hits":452483,"misses":119193}
info: [admin-lookup:worker] continent worker process exiting, stats: {"calls":119193,"hits":36323,"misses":82870}
info: [admin-lookup:worker] macrocounty worker process exiting, stats: {"calls":34048540,"hits":93180,"misses":33955360}
info: [admin-lookup:worker] county worker process exiting, stats: {"calls":47938348,"hits":13889808,"misses":34048540}
info: [dbclient-openstreetmap]  paused=false, transient=0, current_length=0, indexed=100315152, batch_ok=200631, batch_retries=0, failed_records=0, venue=22752709, address=77562443, persec=165.2
...
info: [wof-pip-service:master] starting with layers neighbourhood,borough,locality,localadmin,county,macrocounty,macroregion,region,dependency,country,empire,continent,marinearea,ocean
info: [wof-pip-service:master] empire worker loaded 0 features in 0.052 seconds
info: [wof-pip-service:master] ocean worker loaded 7 features in 0.09 seconds
info: [wof-pip-service:master] dependency worker loaded 32 features in 0.643 seconds
info: [wof-pip-service:master] continent worker loaded 8 features in 0.793 seconds
info: [wof-pip-service:master] macrocounty worker loaded 23 features in 0.818 seconds
info: [wof-pip-service:master] macroregion worker loaded 25 features in 0.842 seconds
info: [wof-pip-service:master] marinearea worker loaded 305 features in 2.66 seconds
info: [wof-pip-service:master] borough worker loaded 138 features in 3.983 seconds
info: [wof-pip-service:master] country worker loaded 199 features in 9.17 seconds
info: [wof-pip-service:master] region worker loaded 4268 features in 51.517 seconds
info: [wof-pip-service:master] county worker loaded 24845 features in 155.089 seconds
info: [wof-pip-service:master] neighbourhood worker loaded 17726 features in 259.723 seconds
info: [wof-pip-service:master] localadmin worker loaded 99206 features in 412.208 seconds
info: [wof-pip-service:master] locality worker loaded 143249 features in 506.709 seconds
info: [wof-pip-service:master] PIP Service Loading Completed!!!
info: [dbclient-polylines]  paused=false, transient=1, current_length=204
...
info: [dbclient-polylines]  paused=false, transient=0, current_length=459, indexed=276000, batch_ok=552, batch_retries=0, failed_records=0, street=276000, persec=4000
error: [polyline] polyline document error message=invalid regex test, http://www.hembygd.se/lagunda/nysatra-kyrkstig/ should not match /https?:\/\//, stack=PeliasModelError: invalid regex test, http://www.hembygd.se/lagunda/nysatra-kyrkstig/ should not match /https?:\/\//
    at Object.nomatch (/code/pelias/polylines/node_modules/pelias-model/util/valid.js:117:13)
    at Document.setName (/code/pelias/polylines/node_modules/pelias-model/Document.js:258:18)
    at DestroyableTransform._transform (/code/pelias/polylines/stream/document.js:30:11)
    at DestroyableTransform.Transform._read (/code/pelias/polylines/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:177:10)
    at DestroyableTransform.Readable.read (/code/pelias/polylines/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:440:10)
    at flow (/code/pelias/polylines/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:898:34)
    at ParallelTransform.pipeOnDrainFunctionResult (/code/pelias/polylines/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:708:7)
    at ParallelTransform.emit (events.js:182:13)
    at onwriteDrain (/code/pelias/polylines/node_modules/parallel-transform/node_modules/readable-stream/lib/_stream_writable.js:501:12)
    at afterWrite (/code/pelias/polylines/node_modules/parallel-transform/node_modules/readable-stream/lib/_stream_writable.js:489:18), name=PeliasModelError
info: [dbclient-polylines]  paused=false, transient=1, current_length=7, indexed=316000, batch_ok=632, batch_retries=0, failed_records=0, street=316000, persec=4000

Allow Elasticsearch 5 container to run as configurable user

Right now, the pelias/elasticsearch:5.6.12 image does not support running as a user other than uid 1000.

Unfortunately all the internal scripts have permissions for a specific user, so if there is a line in docker-compose.yml like user: 2000, it will fail.

License expiring

Started my elastic-container, running pelias/elasticsearch:5.6.12, and got the following in the log:

(I deleted #'s before each row to calm markdown a bit)


 License [will expire] on [Thursday, June 27, 2019]. If you have a new license, please update it.
 Otherwise, please reach out to your support contact.
 
 Commercial plugins operate with reduced functionality on license expiration:
 - security
  - Cluster health, cluster stats and indices stats operations are blocked
  - All data operations (read and write) continue to work
 - watcher
  - PUT / GET watch APIs are disabled, DELETE watch API continues to work
  - Watches execute and write to the history
  - The actions of the watches don't execute
 - monitoring
  - The agent will stop collecting cluster and indices metrics
  - The agent will stop automatically cleaning indices older than [xpack.monitoring.history.duration]
 - graph
  - Graph explore APIs are disabled
 - ml
  - Machine learning APIs are disabled
 - deprecation
  - Deprecation APIs are disabled
 - upgrade
  - Upgrade API is disabled

I don't think I've seen it before - is this anything one should be worried about?

Problem with forward geocoding results

Hello guys, I use Pelias in docker and I have a problem with forward geocoding. The problem is that when I geocode addresses in the same city on the same street, the geocoder returns incomplete information. I'll present this situation with two example addresses:

Słoneczna 1 Myślenice [street, city, number]
Słoneczna 4 Myślenice [street, city, number]
When I geocode the first address, the geocoder returns:

{u'geometry': {u'type': u'Point', u'coordinates': [19.947045, 49.836457]}, u'type': u'Feature', u'properties': {u'layer': u'address', u'match_type': u'exact', u'county_gid': u'whosonfirst:county:102079441', u'region_gid': u'whosonfirst:region:85687291', u'county': u'My\xc5lenicki', u'street': u'S\u0142oneczna', u'country_a': u'POL', u'label': u'1 S\u0142oneczna, My\u015blenice, Poland', u'id': u'pl/dane_ref:580c0a82-29bb-4a70-85e0-ba0f7c735e58', u'confidence': 1, u'locality': u'My\u015blenice', u'continent': u'Europe', u'source': u'openaddresses', u'gid': u'openaddresses:address:pl/dane_ref:580c0a82-29bb-4a70-85e0-ba0f7c735e58', u'housenumber': u'1', u'accuracy': u'point', u'country_gid': u'whosonfirst:country:85633723', u'source_id': u'pl/dane_ref:580c0a82-29bb-4a70-85e0-ba0f7c735e58', u'postalcode': u'32-400', u'continent_gid': u'whosonfirst:continent:102191581', u'distance': 4.415, u'name': u'1 S\u0142oneczna', u'locality_gid': u'whosonfirst:locality:101826165', u'country': u'Poland', u'region': u'Lesser Poland Voivodeship', u'region_a': u'MA'}} - This is a raw_result from geopy

1 Słoneczna, Myślenice, Poland - This is a label from raw_result
In this example everything is OK.

Second example:
{u'geometry': {u'type': u'Point', u'coordinates': [19.939914, 49.833548]}, u'type': u'Feature', u'properties': {u'layer': u'locality', u'match_type': u'fallback', u'county_gid': u'whosonfirst:county:102079441', u'region_gid': u'whosonfirst:region:85687291', u'county': u'My\xc5lenicki', u'country_a': u'POL', u'label': u'My\u015blenice, Poland', u'continent': u'Europe', u'confidence': 0.6, u'locality': u'My\u015blenice', u'id': u'101826165', u'source': u'whosonfirst', u'gid': u'whosonfirst:locality:101826165', u'accuracy': u'centroid', u'country_gid': u'whosonfirst:country:85633723', u'source_id': u'101826165', u'continent_gid': u'whosonfirst:continent:102191581', u'distance': 5, u'name': u'My\u015blenice', u'locality_gid': u'whosonfirst:locality:101826165', u'country': u'Poland', u'region': u'Lesser Poland Voivodeship', u'region_a': u'MA'}, u'bbox': [19.8978956349, 49.8266998886, 19.9624726392, 49.8429191096]} - This is a raw_result from geopy

Myślenice, Poland - This is a label from raw_result

Something is wrong. Why does the geocoder return the centroid of the city in this example? I have this address in the database from openaddresses, so why doesn't the geocoder return the appropriate coordinates?

So after that I decided to geocode this address without the city name. The geocoder returned a few addresses from different cities, among them the one that interested me (I recognized this address by the postal code).
{u'geometry': {u'type': u'Point', u'coordinates': [19.949678, 49.832716]}, u'type': u'Feature', u'properties': {u'layer': u'address', u'match_type': u'exact', u'county_gid': u'whosonfirst:county:102079441', u'region_gid': u'whosonfirst:region:85687291', u'county': u'My\xc5lenicki', u'street': u'S\u0142oneczna', u'country_a': u'POL', u'label': u'4 S\u0142oneczna, Poland', u'continent': u'Europe', u'confidence': 1, u'id': u'pl/dane_ref:4d9f04f9-6eb3-4a45-867e-a955834650a8', u'source': u'openaddresses', u'gid': u'openaddresses:address:pl/dane_ref:4d9f04f9-6eb3-4a45-867e-a955834650a8', u'housenumber': u'4', u'accuracy': u'point', u'country_gid': u'whosonfirst:country:85633723', u'source_id': u'pl/dane_ref:4d9f04f9-6eb3-4a45-867e-a955834650a8', u'postalcode': u'32-400', u'continent_gid': u'whosonfirst:continent:102191581', u'distance': 4.373, u'name': u'4 S\u0142oneczna', u'country': u'Poland', u'region': u'Lesser Poland Voivodeship', u'region_a': u'MA'}}
4 Słoneczna, Poland - This is a label from raw_result
The question is: why is the label in this result missing the city name (WOF locality)? Is it a problem with the WOF data or the database?

I am confused, because without the city name (locality) in the results it is hard to decide whether a result is correct. What should I do in this situation?

"prepare all" breaks with an error in import.sh

Been following the steps but got stuck on prepare all. Willing to poke at it myself, but would appreciate any pointers as to where to start with this.

OS: OSX 10.12.6

converting /data/openstreetmap/portland_oregon.osm.pbf to /data/polylines/extract.0sv
Creating extract at /data/placeholder/wof.extract
Done!
import...
populate fts...
optimize...
close...
Done!
2018/11/07 12:25:35 found 0 refs for way 5286456
2018/11/07 12:25:36 found 0 refs for way 5298970
2018/11/07 12:25:36 found 0 refs for way 5324246
2018/11/07 12:25:36 found 0 refs for way 5327816
2018/11/07 12:25:37 found 0 refs for way 5405306
2018/11/07 12:25:37 found 0 refs for way 5405439
2018/11/07 12:25:37 found 0 refs for way 5405464
2018/11/07 12:25:39 found 0 refs for way 5856772
2018/11/07 12:25:39 found 0 refs for way 5857544
2018/11/07 12:25:39 found 0 refs for way 5857801
2018/11/07 12:25:39 found 0 refs for way 5861021
2018/11/07 12:25:39 found 0 refs for way 5862738
2018/11/07 12:25:39 found 0 refs for way 5862863
2018/11/07 12:25:39 found 0 refs for way 5865809
2018/11/07 12:25:39 found 0 refs for way 5869233
2018/11/07 12:25:40 found 0 refs for way 34445840
2018/11/07 12:25:41 found 0 refs for way 89783436
2018/11/07 12:25:42 found 0 refs for way 116187614
2018/11/07 12:25:42 found 0 refs for way 116211262
2018/11/07 12:25:44 found 0 refs for way 169287313
2018/11/07 12:25:45 found 0 refs for way 178881852
2018/11/07 12:25:46 found 0 refs for way 234502184
2018/11/07 12:25:48 found 0 refs for way 481849521
wrote polylines extract
-rw-r--r-- 1 pelias pelias 3.8M Nov  7 12:26 /data/polylines/extract.0sv
- importing polylines
/code/pelias/interpolation/script/import.sh: line 41:    21 Broken pipe             cat $POLYLINE_FILE
        22 Killed                  | node $DIR/../cmd/polyline.js $STREET_DB > $PROC_STDOUT 2> $PROC_STDERR

pelias import all -- gives errors on whosonfirst csv's and warning on wof service

Hi,

Trying to run the sample docker example on Portland data. I have not changed anything but the data dir in the .env file. All my previous commands ran without any errors, but I get these errors from Elasticsearch while it's trying to import certain whosonfirst datasets, specifically whosonfirst-data-neighbourhood-latest.csv and whosonfirst-data-postalcode-latest.csv:

[screenshot of the Elasticsearch import errors]

But then it also says the import is done, does not show any failed_records, and also shows some "unable to locate" errors:
[screenshot of the warnings]

Basically, is this expected behavior? Can I safely ignore these errors, or is something missing?
@missinglink @orangejulius

I'd appreciate any help!
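Not an authoritative answer, but one way to judge whether the errors mattered is to check whether the layers named in those CSVs actually made it into the index. Assuming Elasticsearch is published on localhost:9200 and the default index name pelias, as in the Docker projects:

# Document counts for the two layers whose CSVs produced errors; a count of 0
# (or far below what you'd expect for the Portland extract) means the errors
# were not safely ignorable for those layers.
curl -s 'http://localhost:9200/pelias/_count?q=layer:neighbourhood'
curl -s 'http://localhost:9200/pelias/_count?q=layer:postalcode'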

pelias test run hangs?

When I try pelias test run after pelias compose up, it seems to hang (no further output from the test run for ~8 hours). Do I need to run compose up again until all the processes say 'done'?

ubuntu@ip-172-30-0-227:~/pelias/projects/planet$ pelias compose up
pelias_interpolation is up-to-date
Starting pelias_fuzzy_tester ...
Starting pelias_openstreetmap ...
Starting pelias_openaddresses ...
pelias_libpostal is up-to-date
Starting pelias_csv_importer  ...
Starting pelias_fuzzy_tester  ... done
Starting pelias_openstreetmap ... done
Starting pelias_openaddresses ... done
Starting pelias_csv_importer  ... done
Starting pelias_whosonfirst   ... done
Starting pelias_polylines     ... done
Starting pelias_schema        ... done
Starting pelias_geonames      ... done
Starting pelias_transit       ... done
ubuntu@ip-172-30-0-227:~/pelias/projects/planet$ pelias test run
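In case it helps others with the same symptom: a hanging test run is often just a still-empty index, since the compose output above only shows containers starting, not importers finishing (and a full planet import takes a long time). Two quick checks, using plain docker commands and the container names from the output above, with ports assumed to be the project defaults:

# Which pelias containers are still running; importers that are still Up
# have not finished.
docker ps --filter 'name=pelias'

# Follow a specific importer's progress.
docker logs --tail 20 pelias_openstreetmap

# How many documents are in the index so far.
curl -s 'http://localhost:9200/pelias/_count'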


pelias import all error ENOTFOUND - using a proxy

After pelias prepare all worked, I now get another ENOTFOUND error, as below:

deploy@dap-jupyter01:/mnt/open_street_map/pelias_docker/docker/projects/portland-metro$ pelias prepare all
...
Build completed!

deploy@dap-jupyter01:/mnt/open_street_map/pelias_docker/docker/projects/portland-metro$ pelias import all
info: [whosonfirst] Loading whosonfirst-data-continent-latest.csv records from /data/whosonfirst/meta
Elasticsearch ERROR: 2019-03-18T22:07:11Z
  Error: Request error, retrying
  HEAD http://elasticsearch:9200/pelias => getaddrinfo ENOTFOUND elasticsearch elasticsearch:9200
      at Log.error (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/log.js:226:56)
      at checkRespForFailure (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/transport.js:259:18)
      at HttpConnector.<anonymous> (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/connectors/http.js:164:7)
      at ClientRequest.wrapper (/code/pelias/whosonfirst/node_modules/lodash/lodash.js:4935:19)
      at ClientRequest.emit (events.js:182:13)
      at Socket.socketErrorListener (_http_client.js:392:9)
      at Socket.emit (events.js:182:13)
      at emitErrorNT (internal/streams/destroy.js:82:8)
      at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
      at process._tickCallback (internal/process/next_tick.js:63:19)

info: [whosonfirst] Loading whosonfirst-data-country-latest.csv records from /data/whosonfirst/meta
info: [whosonfirst] Loading whosonfirst-data-region-latest.csv records from /data/whosonfirst/meta
info: [whosonfirst] Loading whosonfirst-data-county-latest.csv records from /data/whosonfirst/meta
Elasticsearch WARNING: 2019-03-18T22:07:11Z
  Unable to revive connection: http://elasticsearch:9200/

Elasticsearch WARNING: 2019-03-18T22:07:11Z
  No living connections

ERROR: Elasticsearch index pelias does not exist
You must use the pelias-schema tool (https://github.com/pelias/schema/) to create the index first
For full instructions on setting up Pelias, see http://pelias.io/install.html
/code/pelias/whosonfirst/node_modules/pelias-dbclient/src/configValidation.js:38
          throw new Error(`elasticsearch index ${config.schema.indexName} does not exist`);
          ^

Error: elasticsearch index pelias does not exist
    at existsCallback (/code/pelias/whosonfirst/node_modules/pelias-dbclient/src/configValidation.js:38:17)
    at respond (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/transport.js:327:9)
    at sendReqWithConnection (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/transport.js:226:7)
    at next (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/connection_pool.js:214:7)
    at process._tickCallback (internal/process/next_tick.js:61:11)
deploy@dap-jupyter01:/mnt/open_street_map/pelias_docker/docker/projects/portland-metro$ pelias elastic status
200
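The giveaway is the pairing of getaddrinfo ENOTFOUND elasticsearch inside the importer with a 200 from pelias elastic status on the host: Elasticsearch itself is fine, but the importer container cannot resolve the elasticsearch service name, which behind a corporate proxy is often caused by HTTP(S) proxy variables leaking into the containers. A sketch of things to try, under the assumption that the proxy settings come from your shell or Docker's client config (this is a guess, not a documented Pelias fix):

# Exempt internal service names from the proxy before running the importers.
export NO_PROXY=elasticsearch,localhost,127.0.0.1
export no_proxy=$NO_PROXY

# Confirm name resolution from inside one of the running importer containers
# (assumes the image ships curl; substitute wget or node if it does not).
docker exec -it pelias_whosonfirst curl -s http://elasticsearch:9200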
