
transitous's Introduction

Transitous

Free and open public transport routing.

Goal

A community-run provider-neutral international public transport routing service.

Using openly available GTFS/GTFS-RT/etc. feeds and a FOSS routing engine, we want to operate a routing service that:

  • focuses on the interest of the user rather than the public transport operators
  • is free to use
  • values user privacy
  • does not stop at borders
  • aims at crowd-sourced maintenance of data feeds in the spirit of FOSS

Contact

For general discussions about data availability: #opentransport:matrix.org

For Transitous-specific technical topics: #transitous:matrix.spline.de

Adding a region

Find the documentation on the Project Website.

transitous's People

Contributors

1maxnet1, airon90, anbraten, bro66666, edwardbetts, espidev, fale, felixguendling, floedelmann, gallaecio, gildas-gh, jbruechert, jjesse, jscott0, kalon33, kbloom, kbroulik, kd2flz, kkremitzki, markstos, mbruchert, mexon, mlundblad, parttimedatascientist, pi-cla, steve-tech, visika, vkrause, wkulesza, yasinkaraaslan


transitous's Issues

gtfstidy: Add support for swapping coordinates

Implement a check in gtfstidy that finds unrealistic coordinates (swapped latitude and longitude) and corrects them.

One possible approach:

  • If the speed between two stops is much larger than the overall average speed, note the second stop
  • If the speed between the second stop and the next stop is also unrealistic, swap the coordinates of the second stop
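The heuristic could be sketched in Python like this (gtfstidy itself is written in Go, so all names here are illustrative; the median speed is used as a robust stand-in for the "overall average", since the bogus segments would otherwise dominate a plain mean):

```python
# Hypothetical sketch of the proposed check, not actual gtfstidy code.
from math import asin, cos, radians, sin, sqrt
from statistics import median

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def fix_swapped_coordinates(stops, hours, max_factor=10):
    """stops: list of (lat, lon); hours[i]: travel time between stop i
    and stop i+1. Swaps a stop's coordinates when the implied speed both
    into and out of it is implausibly high."""
    speeds = [haversine_km(*stops[i], *stops[i + 1]) / hours[i]
              for i in range(len(stops) - 1)]
    typical = median(speeds)
    for i in range(1, len(stops) - 1):
        # Both adjacent segments unrealistic: the middle stop is suspect.
        if speeds[i - 1] > max_factor * typical and speeds[i] > max_factor * typical:
            lat, lon = stops[i]
            stops[i] = (lon, lat)
    return stops
```

Comparing each segment against a robust typical speed rather than the raw mean matters, because a single swapped stop produces two absurdly fast segments that would otherwise inflate the average.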

Set up a Woodpecker CI instance at Spline

We have reached the limits of the free GitHub Actions runners. Until this is resolved, we can't process data updates or add new feeds.

Woodpecker can be configured not to run for pull requests, which would otherwise be a big issue, since people could run arbitrary code in the university network.

For Pull Requests, we can just keep using GitHub Actions.

US: add feeds for major metropolitan cities (Los Angeles, San Francisco, New York City, Boston, Washington, etc.)

Those are very populous cities, and also tend to be tourist attractions, so it might make sense to add them first? I suspect these cities all have open transit data. Looking at OpenMobilityData transit feeds for the USA, I can see:

Implement MAAS and Juju support

Another major method of deploying bare-metal servers, virtual machines or containers besides Ansible is MAAS, originally developed by Canonical. MAAS has a fair number of major users, and who knows how many minor or unlisted ones. So it would be a very good idea to ensure that the necessary resources are in place to enable building containers, or deployment via VMs etc., with MAAS, including a secured rsync connection as well as any other connections that are required.

Enabling container building for Juju would also be great, so it can build the container as part of a DevOps cycle.

Can support for these please be added?

https://maas.io/
https://juju.is/

European coverage status

Tracking issue for coverage in Europe

  • 🇦🇱 Albania: unknown
  • 🇦🇩 Andorra: unknown
  • 🇦🇹 Austria: national railway included, local transport available with contract? (#402)
  • 🇧🇪 Belgium
  • 🇧🇦 Bosnia and Herzegovina: unknown
  • 🇧🇬 Bulgaria: unknown
  • 🇭🇷 Croatia: national railway included (#109), local transport unknown
  • 🇨🇾 Cyprus: Added feeds from Transitland covering (most?) of southern Cyprus #240
  • 🇨🇿 Czechia: Prague #208, Brno, Liberec #224, Olomouc #228
  • 🇩🇰 Denmark
  • 🇪🇪 Estonia
  • 🇫🇮 Finland: #116, local data also available (#126, #128)
  • 🇫🇷 France: #305
  • 🇩🇪 Germany
  • 🇬🇷 Greece: Athens #238
  • 🇭🇺 Hungary: #134, more local feeds to be added
  • 🇮🇸 Iceland: #115
  • 🇮🇪 Ireland
  • 🇮🇹 Italy: more feeds to be added? #76 #77
  • 🇽🇰 Kosovo: unknown
  • 🇱🇻 Latvia
  • 🇱🇮 Liechtenstein: largely covered by the Swiss national feed
  • 🇱🇹 Lithuania
  • 🇱🇺 Luxembourg
  • 🇲🇹 Malta: unknown
  • 🇲🇩 Moldova: unknown
  • 🇲🇨 Monaco: unknown
  • 🇲🇪 Montenegro: unknown
  • 🇳🇱 Netherlands
  • 🇲🇰 Macedonia: unknown
  • 🇵🇱 Poland: national railway included, partial local coverage?
  • 🇵🇹 Portugal: Some local transport covered
  • 🇷🇴 Romania: unknown
  • 🇸🇲 San Marino: unknown
  • 🇷🇸 Serbia: unknown
  • 🇸🇰 Slovakia: #108 , local transport: Bratislava #231
  • 🇸🇮 Slovenia: unknown
  • 🇪🇸 Spain: national railway included, partial local coverage?
  • 🇸🇪 Sweden
  • 🇨🇭 Switzerland
  • 🇹🇷 Turkey: some local feeds exist
  • 🇺🇦 Ukraine: unknown
  • 🇬🇧 United Kingdom: Great Britain done, Northern Ireland (#124)
  • 🇦🇽 Åland: no known GTFS feed for local bus service, feed for ferry service seems available
  • 🇫🇴 Faroe Islands: unknown
  • 🇸🇯 Svalbard: unknown

Use git instead of wiki pages

I've noticed that the documentation was moved from the README to GitHub's wiki pages. What's the advantage of that? From my point of view this is quite unusable, since the documentation is no longer downloaded (and updated) with the git repository.

What's everyone's view?

Integration of GTFS and OSM

While using GTFS sources separately from OSM may be easier and already has available solutions, the project could get extra detail and other benefits if at least some of the GTFS feeds were better coupled with OSM data.
https://wiki.openstreetmap.org/wiki/Proposal:GTFS_Tagging_Standard
This proposal addresses integration between GTFS and OSM.
This also makes PTv2 relations in OSM more useful for real-life routing.

I hope it may become useful.

Run import on separate machine

The import of OpenStreetMap data needs a large amount of RAM and CPU time.
We should run it on a separate server so that it does not impair routing performance.

Maybe Spline has enough resources to host a separate import VM, but I'm not sure.
The import VM needs ~128 GB of RAM and ideally ~200 GB of SSD storage.

I would like the import to run publicly visible, for example on GitHub Actions, in order to enable more people to fix things without having access to the server.

Similar work

Hi, thanks for your cool project.

I happen to be working on the same task, scoped to France.

It's for a French alternative to Google Maps, https://cartes.app.

I run both a Node GTFS server (github.com/laem/GTFS) and a MOTIS server (github.com/laem/motifs).

Last week I started a Deno script on the first repo that listens to transport.data.gouv.fr. It looks like one of your todos. Very interested to share or join :)

fi-tampere: Processing fails

Error while parsing GTFS feed:
Could not open required file agency.txt
Parsing GTFS feed in '/transitous/downloads/fi_tampere.gtfs.zip' ...
You may want to try running gtfstidy with --fix for error fixing / skipping. See --help for details.

Please consider removing or replacing the feed.

@vkrause

Rewrite Data Import Pipeline for Woodpecker CI

We now have a Woodpecker CI instance running at https://routing-import.spline.de. This repository is already enabled in it, and the RSYNC_PRIVATE_KEY secret is already copied over.
For this repository, the out and downloads folders should already always be retained, so no further cache setup should be necessary.

That leaves converting the pipeline syntax.

Create a website

Approximate contents:

  • Very short description of the service / slogan
  • A search field to try our routing; it should just forward the request to the motis web interface. This may require extending motis with query parameters for this purpose. Maybe a simplified version of what https://www.vy.no/en#journey-planner does
  • Further down, some API examples and links to the motis docs / our docs, in case we have some by then

Removed feeds

During the migration of the CI to a self-hosted VM, I had to remove a few feeds that were not accessible from the DFN (German National Research and Education Network).

If you find another download link / a mirror, these can of course be added back.
Maybe @Fale's gtfsproxy can help, although it didn't immediately work for those feeds.

OSM routing solution

There's multiple different OSM-based routing modules in MOTIS:

  • osrm (with foot, bike and car profile)
  • ppr
  • parking (does that actually route itself or delegate to osrm-foot/ppr?)
  • Valhalla (experimental, not used by default, meant as an alternative to some of the others)

Those are used from other modules:

  • nigiri can use either one of those for enabling intermodal routing
  • gbfs depends on osrm-foot explicitly (as that's faster than ppr, although ppr would produce equally usable results)
  • parking depends on ???

And there's issues with those:

  • osrm, ppr and parking fail to import the full European dataset, running out of memory.
  • osrm-foot is significantly faster than ppr (and therefore used by gbfs), but ppr produces better results thanks to its profile support (and is thus preferred for walking routes); since the two use distinct databases, having both significantly increases the data import cost.
  • parking is even more expensive to import.
  • Valhalla is missing a bit of API to be a viable full replacement for osrm and ppr, see motis-project/motis#439 (comment).
  • Valhalla integration needs work due to significant memory leaks (see also motis-project/motis#439).

How do we resolve this?

Possible options:

(1) No intermodal routing, disable all of this.

Worst possible outcome IMHO, but can be a useful temporary step to unblock work on the GTFS pipeline while this issue is investigated.

(2) Make OSRM/PPR import work

At this point this would likely mean obtaining large high-speed SSD storage and enabling swap. The OSRM documentation suggests that with a total of 350 GB, a full planet file is importable. We have to consider that this isn't a one-time thing though; there is no support for incremental updates. Whether ppr and parking would also work with this is unknown; at least parking seemed 3x more RAM-expensive in the local CH test.

Once done, all MOTIS features would become available.

(3) Make Valhalla work as a replacement to OSRM/PPR

This would likely mean development effort on Valhalla and/or MOTIS. Valhalla seems able to import the full EU dataset with less than 24 GB of RAM (although it is I/O-heavy), so even a full planet import is not unthinkable. Valhalla supports foot/bike/car routing and could thus in theory replace osrm and possibly ppr (profile support to be verified). gbfs doesn't work with Valhalla yet; how that will eventually perform is unknown, as is how this impacts/relates to parking.

Going this way would enable intermodal routing for foot/bike/car in a first step, while gbfs and parking would probably take longer to become available.

Error following local transitous instructions

Hello,

I was following the local instance instructions in the README and while running the last step:

cd out
motis -c config.ini --server.host 0.0.0.0 --server.static_path /opt/motis/web

ran into the following error:

2024-03-05T00:29:04Z [VERIFY FAIL] file does not exist: land-polygons-complete-4326.zip

initialization error: file does not exist: land-polygons-complete-4326.zip

I was able to download that file from https://osmdata.openstreetmap.de/data/land-polygons.html, put it in my out/ directory, and proceed as expected, so presumably, just a README update would be enough to resolve this.

Focus of transitous on routing or public transport data?

As the README states, Transitous aims to be a routing service:

A community-run provider-neutral international public transport routing service.

Will Transitous (maybe MOTIS is already doing that, but I couldn't find anything in the docs) provide additional data like stop_times, trips, or GBFS station data such as deep links to rent a vehicle, or is this out of scope?

Background:

For an OSS app (and similar might be of interest to other apps and websites as well), we are showing realtime data from various sources like GTFS, GTFS-RT, GBFS, etc., and instead of developing our own database / data hub we would be really interested in working on an open community project collecting and sharing this data.

For testing I've recently started a small project importing GTFS, GTFS-RT and GBFS feeds into a PostGIS database, which provides data via REST and sends updates via WebSocket as well: https://github.com/kiel-live/transport-hub (something like this might also be interesting for a dashboard showing coverage of the Transitous sources).

General todo list

  • Add basic metadata format
  • Add code to download things based on that
  • Run gtfstidy on the result
  • Add CI
  • Add GitHub API integration to create issues on download failures, and assign the maintainer to the issue
  • Don't download files that haven't changed
  • Upload the out/ directory somewhere
  • Create an Ansible playbook to set up servers
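The "don't download files that haven't changed" item could be handled with standard HTTP conditional requests. A minimal sketch, assuming a JSON file next to the downloads for caching ETags (the cache layout and function name are illustrative, not how fetch.py actually works):

```python
# Sketch: skip re-downloading a feed when the server reports it unchanged,
# using the ETag / If-None-Match conditional-request mechanism.
import json
from pathlib import Path

import requests

def fetch_if_changed(url: str, dest: Path, etag_cache: Path) -> bool:
    """Download url to dest unless the server says it is unchanged.
    Returns True if new data was written."""
    etags = json.loads(etag_cache.read_text()) if etag_cache.exists() else {}
    headers = {}
    if url in etags and dest.exists():
        headers["If-None-Match"] = etags[url]
    response = requests.get(url, headers=headers, timeout=30)
    if response.status_code == 304:  # 304 Not Modified: keep the local copy
        return False
    response.raise_for_status()
    dest.write_bytes(response.content)
    if "ETag" in response.headers:
        etags[url] = response.headers["ETag"]
        etag_cache.write_text(json.dumps(etags))
    return True
```

For servers that don't emit ETags, the same pattern works with Last-Modified / If-Modified-Since headers.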


Parallelize feed processing

gtfstidy does not use multi-threading itself, so running up to $(nproc) gtfstidy jobs in parallel would speed things up a lot.
The same applies to the downloads as well.
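In the Python import scripts, this could look roughly like the following sketch. The command line and helper names are illustrative, not existing transitous code:

```python
# Sketch: run one gtfstidy subprocess per feed, up to os.cpu_count()
# at a time. gtfstidy is single-threaded, so process-level parallelism
# is what buys the speedup.
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

def tidy_feed(feed_zip: str) -> int:
    """Run gtfstidy on one feed; returns the subprocess exit code."""
    result = subprocess.run(
        ["gtfstidy", "--fix", "-o", f"out/{os.path.basename(feed_zip)}", feed_zip]
    )
    return result.returncode

def tidy_all(feeds: list[str]) -> list[int]:
    # Threads suffice here: the real work happens in child processes,
    # so the GIL does not serialize anything.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(tidy_feed, feeds))
```

The same executor pattern applies to the downloads, where the work is network-bound rather than CPU-bound.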

Write ansible playbook for setting up a secured rsync server

What we need:

  • An SSH user that can only invoke a sandboxed rsync. rsync should be constrained to reading/writing a single directory. This can be done using bubblewrap, firejail, etc.
  • Optional: read-only rsync server (non-ssh) to sync files from, so other people can easily replicate our setup with efficient data updates.

Error fetching 'feeds/au.json'

An error occurred while fetching a feed from feeds/au.json.
If the error is not temporary, please consider replacing or removing the feed.
Thanks!

Here are the logs of the error(s):
On 2024-03-25 00:39:50 UTC:

Fetching au-Translink-SEQ…
Fetching au-Translink-BOW…
Fetching au-Translink-BOW…
Fetching au-Translink-BUN…
Fetching au-Translink-CNS…
Fetching au-Translink-CNS…
Fetching au-Translink-GLT…
Fetching au-Translink-GYM…
Fetching au-Translink-INN…
Fetching au-Translink-INN…
Fetching au-Translink-KIL…
Fetching au-Translink-MKY…
Fetching au-Translink-MAG…
Fetching au-Translink-MIF…
Fetching au-Translink-MAL…
Fetching au-Translink-MHB…
Fetching au-Translink-MHB…
Fetching au-Translink-NSI…
Fetching au-Translink-NSI…
Fetching au-Translink-RKY…
Fetching au-Translink-TWB…
Fetching au-Translink-WAR…
Fetching au-Translink-WHT…
Fetching au-Transperth…
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in _new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 96, in create_connection
    raise err
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 86, in create_connection
    sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 704, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 399, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 239, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/usr/lib/python3.11/http/client.py", line 1282, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.11/http/client.py", line 1328, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.11/http/client.py", line 1277, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.11/http/client.py", line 1037, in _send_output
    self.send(msg)
  File "/usr/lib/python3.11/http/client.py", line 975, in send
    self.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 205, in connect
    conn = self._new_conn()
           ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 179, in _new_conn
    raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPConnection object at 0x7f5e8ccad190>, 'Connection to www.transperth.wa.gov.au timed out. (connect timeout=None)')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 489, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 788, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='www.transperth.wa.gov.au', port=80): Max retries exceeded with url: /TimetablePDFs/GoogleTransit/Production/google_transit.zip (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f5e8ccad190>, 'Connection to www.transperth.wa.gov.au timed out. (connect timeout=None)'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/woodpecker/src/github.com/public-transport/transitous/./src/fetch.py", line 191, in <module>
    fetcher.fetch(Path(metadata_file))
  File "/woodpecker/src/github.com/public-transport/transitous/./src/fetch.py", line 172, in fetch
    new_data = self.fetch_source(download_path, source)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/woodpecker/src/github.com/public-transport/transitous/./src/fetch.py", line 71, in fetch_source
    requests.head(download_url, headers=source.options.headers, allow_redirects=True).headers
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/api.py", line 100, in head
    return request("head", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 553, in send
    raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPConnectionPool(host='www.transperth.wa.gov.au', port=80): Max retries exceeded with url: /TimetablePDFs/GoogleTransit/Production/google_transit.zip (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f5e8ccad190>, 'Connection to www.transperth.wa.gov.au timed out. (connect timeout=None)'))

MOTIS: Increase feed limit

source_idx_t in nigiri/types.h limits the number of feeds we can have.

This type needs to be made larger before we reach 256 feeds.

GTFS-RT realtime update fetching

That's not done by MOTIS itself; it expects downloads to happen externally and to then be notified about new data. That makes sense given that access to those feeds isn't standardized and may e.g. require additional authorization headers.

So we also need to build that, basically https://github.com/motis-project/motis/blob/master/modules/ris/crawl_swiss_gtfs_rt.sh at scale:

  • configured by the same form of per-feed metadata as used for the static data
  • needs to run at a high frequency (~60 s), so probably has to be on the MOTIS machine itself
  • given the number of feeds, some will likely not always respond promptly, so this needs to happen in parallel and with aggressive timeouts
  • needs some form of monitoring and error reporting for persistent feed update failures
  • needs to make sure that MOTIS doesn't get confused by partial/still-ongoing downloads, e.g. by making fully downloaded files available atomically via a rename
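One crawl iteration along these lines could be sketched as follows (feed metadata fields, file names and the worker count are assumptions for illustration):

```python
# Sketch: fetch each GTFS-RT feed with an aggressive timeout, in
# parallel, and publish each download atomically via os.rename so the
# consumer (MOTIS) never sees a partially written file.
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

import requests

def fetch_rt_feed(feed_id: str, url: str, headers: dict, out_dir: str) -> bool:
    try:
        response = requests.get(url, headers=headers, timeout=10)
        response.raise_for_status()
    except requests.RequestException:
        return False  # hook for monitoring / persistent-failure reporting
    # Write to a temp file in the same directory, then rename: a rename
    # within one filesystem is atomic, so readers see either the old or
    # the new file, never a partial one.
    fd, tmp_path = tempfile.mkstemp(dir=out_dir)
    with os.fdopen(fd, "wb") as f:
        f.write(response.content)
    os.rename(tmp_path, os.path.join(out_dir, f"{feed_id}.pb"))
    return True

def crawl_once(feeds: dict, out_dir: str) -> None:
    with ThreadPoolExecutor(max_workers=16) as pool:
        for feed_id, meta in feeds.items():
            pool.submit(fetch_rt_feed, feed_id, meta["url"],
                        meta.get("headers", {}), out_dir)
```

The temp file must be created in the target directory, not in /tmp, since os.rename is only atomic within a single filesystem.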

test-import git action expected deleted feed file

In #111 I deleted the ca.json feed to split it into multiple separate feeds, but this resulted in the CI attempting to find the now-deleted file:

Traceback (most recent call last):
  File "/transitous/./src/fetch.py", line 183, in <module>
    fetcher.fetch(Path(metadata_file))
  File "/transitous/./src/fetch.py", line 155, in fetch
    region = Region(json.load(open(metadata, "r")))
                              ^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'feeds/ca.json'
Traceback (most recent call last):
  File "/transitous/ci/fetch-feeds.py", line 43, in <module>
    subprocess.check_call(["./src/fetch.py", feed])
  File "/usr/lib/python3.11/subprocess.py", line 413, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['./src/fetch.py', 'feeds/ca.json']' returned non-zero exit status 1.
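A minimal fix would be for the CI script to skip feed files that no longer exist in the working tree before handing them to fetch.py. A sketch (the surrounding loop is inferred from the traceback, not copied from ci/fetch-feeds.py):

```python
# Sketch: when iterating over changed feed files in CI, skip paths that
# were deleted in the pull request instead of passing them to fetch.py.
import subprocess
from pathlib import Path

def fetch_changed_feeds(changed_files: list[str]) -> None:
    for feed in changed_files:
        if not Path(feed).exists():
            print(f"Skipping deleted feed file {feed}")
            continue
        subprocess.check_call(["./src/fetch.py", feed])
```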

Write specialized reverse-proxy

This reverse proxy should only let through requests that:

It should also rewrite all the JSON before sending it to motis.

My preferred stack for this is Rocket / Serde.

Script upstreaming feeds to transitland-atlas

We need a few more fields than transitland-atlas has, relating to our specific setup (for example fetch-interval-days and fix), but we should be able to generate files containing the minimum needed for transitland-atlas from ours.

Generate license page

We can now generate a license summary as json using ./src/generate-attribution.py.
In order to make it easy for apps to link to it, we should set up a static site generator to make this data more readable for humans.

Make fetch failures non-fatal and spam maintainers instead

Requirements:

  • A GitHub issue should be opened containing, depending on the error, the HTTP body and status code, or the gtfstidy output.
  • There should be at most one issue per failing feed. The easiest way to do that is probably prefixing the title with the feed id.
  • It should not depend on GitHub Actions specifically; we are moving away from that.
  • The feed maintainer (as specified in the JSON) should be assigned or mentioned in the issue.

This needs to be implemented in src/fetch.py, but the GitHub specific parts can of course be in a new file.
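The "at most one issue per failing feed" logic could look roughly like this, using the GitHub REST API directly via requests (the title-prefix convention, token handling, and function name are assumptions, not existing transitous code):

```python
# Sketch: open a GitHub issue for a failing feed, unless an open issue
# with the same feed-id title prefix already exists.
import requests

API = "https://api.github.com/repos/public-transport/transitous"

def report_feed_failure(feed_id: str, log: str, maintainer: str, token: str) -> None:
    headers = {"Authorization": f"Bearer {token}"}
    title = f"Error fetching '{feed_id}'"
    # List open issues and check for one already carrying this feed's title.
    issues = requests.get(f"{API}/issues", params={"state": "open"},
                          headers=headers, timeout=30).json()
    if any(i["title"].startswith(title) for i in issues):
        return  # an issue for this feed is already open; don't spam
    requests.post(f"{API}/issues", headers=headers, timeout=30, json={
        "title": title,
        "body": f"@{maintainer}\n\n```\n{log}\n```",
    })
```

Because this only needs a token and plain HTTPS, it works the same from Woodpecker CI as from GitHub Actions.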

[Spain] Add GTFS feeds from NAP

The Spanish NAP (National Access Point) is available at https://nap.mitma.es/. It mixes feeds from operating companies and regional governments, so some preprocessing to avoid duplicates may be needed. The NAP only has feeds from the companies and governments that choose to release them, so there are gaps in coverage.

Address autocompletion

By default this is handled by the address module of MOTIS. That however gets OOM-killed when importing the European OSM dataset.

In particular this happens here: https://github.com/motis-project/address-typeahead/blob/b6b5e60faac2921f1c0b3813da9c11731a3ca31d/src/extractor.cc#L404

NodeLocationsForWays's use of osmium::index::map::FlexMem shows up in Heaptrack as the main cost when testing with the CH subset, although FlexMem still runs in "sparse" mode there (due to the smaller dataset). When explicitly forcing "dense" mode (which is expected to be used for the EU dataset), even the CH subset runs out of memory (assuming I did that correctly).

Possible scenarios/approaches:

(1) Disable address autocompletion.

Limited loss of features in the web ui, but not ideal.

(2) Bug in FlexMem use of https://github.com/motis-project/address-typeahead

It's quite possible nobody has tried to load such a large OSM subset into this, so realistically fixable/optimizable issues may have remained in that code path. This needs further investigation, but if that's the case, it would allow enabling this feature without requiring extra work or setup.

(3) External address autocompleter

Possible alternatives exist, such as Nominatim (OSM's default, supports incremental updates and proven to scale to the full planet dataset), or Photon (used by some Digitransit/OTP installations AFAIK). Integration into MOTIS would need work but doesn't appear too difficult.

Add more pull request linting

We have a few, partially unwritten rules about files. The src/fetch.py script should validate them.

  • No spaces in feed names
  • No _ in feed names
  • No file system relevant things in feed names (/, ../ etc.)
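The rules above could be written down as a small validator for src/fetch.py to call (the function name is illustrative):

```python
# Sketch: validate a proposed feed name against the lint rules above,
# returning all violations so a PR check can report them at once.
def validate_feed_name(name: str) -> list[str]:
    errors = []
    if " " in name:
        errors.append("no spaces in feed names")
    if "_" in name:
        errors.append("no _ in feed names")
    for bad in ("/", "..", "\\"):
        # File-system-relevant sequences could escape the feeds/ directory.
        if bad in name:
            errors.append(f"no {bad} in feed names")
    return errors
```

Returning a list instead of raising on the first hit lets the PR lint report every violation in one pass.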

Error fetching 'feeds/ca-bc.json'

An error occurred while fetching a feed from feeds/ca-bc.json.
If the error is not temporary, please consider replacing or removing the feed.
Thanks!

Here are the logs of the error(s):
On 2024-03-25 00:37:54 UTC:

Fetching ca-bc-Black-Ball-Ferry-Line…
Fetching ca-bc-Campbell-River-Transit…
Postprocessing ca-bc-Campbell-River-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Campbell-River-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Campbell-River-Transit.gtfs.zip'... done.

Fetching ca-bc-Campbell-River-Transit…
Fetching ca-bc-Comox-Valley-Transit…
Fetching ca-bc-Comox-Valley-Transit…
Fetching ca-bc-Cowichan-Valley-Transit…
Postprocessing ca-bc-Cowichan-Valley-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Cowichan-Valley-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Cowichan-Valley-Transit.gtfs.zip'... done.

Fetching ca-bc-Cowichan-Valley-Transit…
Fetching ca-bc-Creston-Transit…
Postprocessing ca-bc-Creston-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Creston-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Creston-Transit.gtfs.zip'... done.

Fetching ca-bc-Creston-Transit…
Fetching ca-bc-Dawson-Creek-Transit…
Postprocessing ca-bc-Dawson-Creek-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Dawson-Creek-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Dawson-Creek-Transit.gtfs.zip'... done.

Fetching ca-bc-Dawson-Creek-Transit…
Fetching ca-bc-East-Kootenay-Transit…
Postprocessing ca-bc-East-Kootenay-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_East-Kootenay-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_East-Kootenay-Transit.gtfs.zip'... done.

Fetching ca-bc-East-Kootenay-Transit…
Fetching ca-bc-Fort-St-John-Transit…
Postprocessing ca-bc-Fort-St-John-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Fort-St-John-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Fort-St-John-Transit.gtfs.zip'... done.

Fetching ca-bc-Fort-St-John-Transit…
Fetching ca-bc-Fraser-Valley-Transit…
Postprocessing ca-bc-Fraser-Valley-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Fraser-Valley-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Fraser-Valley-Transit.gtfs.zip'... done.

Fetching ca-bc-Fraser-Valley-Transit…
Fetching ca-bc-Kamloops-Transit…
Fetching ca-bc-Kamloops-Transit…
Fetching ca-bc-Nanaimo-Transit…
Fetching ca-bc-Nanaimo-Transit…
Fetching ca-bc-North-Okanagan-Transit…
Postprocessing ca-bc-North-Okanagan-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_North-Okanagan-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_North-Okanagan-Transit.gtfs.zip'... done.

Fetching ca-bc-North-Okanagan-Transit…
Fetching ca-bc-Port-Alberni-Transit…
Postprocessing ca-bc-Port-Alberni-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Port-Alberni-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Port-Alberni-Transit.gtfs.zip'... done.

Fetching ca-bc-Port-Alberni-Transit…
Fetching ca-bc-Powell-River-Transit…
Postprocessing ca-bc-Powell-River-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Powell-River-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Powell-River-Transit.gtfs.zip'... done.

Fetching ca-bc-Powell-River-Transit…
Fetching ca-bc-Prince-George-Transit…
Postprocessing ca-bc-Prince-George-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Prince-George-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Prince-George-Transit.gtfs.zip'... done.

Fetching ca-bc-Prince-George-Transit…
Fetching ca-bc-Prince-Rupert-Transit…
Postprocessing ca-bc-Prince-Rupert-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Prince-Rupert-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Prince-Rupert-Transit.gtfs.zip'... done.

Fetching ca-bc-Prince-Rupert-Transit…
Fetching ca-bc-South-Okanagan-Transit…
Postprocessing ca-bc-South-Okanagan-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_South-Okanagan-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_South-Okanagan-Transit.gtfs.zip'... done.

Fetching ca-bc-South-Okanagan-Transit…
Fetching ca-bc-Squamish-Transit…
Fetching ca-bc-Squamish-Transit…
Fetching ca-bc-Sunshine-Coast-Transit…
Postprocessing ca-bc-Sunshine-Coast-Transit with gtfstidy…
Parsing GTFS feed in '/woodpecker/src/github.com/public-transport/transitous/downloads/ca-bc_Sunshine-Coast-Transit.gtfs.zip' ... done.
Outputting GTFS feed to '/woodpecker/src/github.com/public-transport/transitous/out/ca-bc_Sunshine-Coast-Transit.gtfs.zip'... done.

Fetching ca-bc-Sunshine-Coast-Transit…
Fetching ca-bc-TransLink…
Fetching ca-bc-TransLink…
Fetching ca-bc-Victoria-Transit…
Fetching ca-bc-Victoria-Transit…
Fetching ca-bc-West-Kootenay-Transit…
Error: Could not fetch file:
Status Code: 500 Body: b''

MOTIS: Fix crash

Mär 07 19:54:20 vm-MOTIS motis[26967]: 2024-03-07T18:54:20Z [VERIFY FAIL] no first valid found: ICE 77 (dbg=ch_opentransportdataswiss.gtfs.zip/stop_times.txt:75467:75469)
Mär 07 19:54:20 vm-MOTIS motis[26967]: terminate called after throwing an instance of 'std::runtime_error'
Mär 07 19:54:20 vm-MOTIS motis[26967]:   what():  no first valid found: ICE 77 (dbg=ch_opentransportdataswiss.gtfs.zip/stop_times.txt:75467:75469)
