
openaq-api-v2's People

Contributors

bitner · caparker · chagerba · majesticio · russbiggs · sruti


openaq-api-v2's Issues

v3 providers and countries

To make the AWS open data bucket a bit easier to navigate I think there should be a way to get a list of countries by provider since the folder structure goes:

/provider={provider_name}/country={country_code}

We could provide either a query parameter or a subresource, e.g.

countries?providers_id=42

or

countries/42/providers (this option feels odd to me)

The other option would be to include an array of countries in the providers response object:

"countries": ["us", "uk", "gh"]

Which is similar to the existing bounds key but a bit more descriptive.

There will likely also be an inverse situation where a user is interested in a country but first needs to know the providers that cover it. We currently provide a providersCount key in the country response. Should we also include a providers array, or provide separate endpoint-based solutions similar to the ideas above?
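Either direction of the lookup reduces to grouping the flat provider/country prefixes from the bucket. A minimal sketch of building the proposed `countries` array per provider (the function and row shape are illustrative, not existing code):

```python
from collections import defaultdict

def countries_by_provider(rows):
    """Group country codes under each provider id.

    `rows` is assumed to be flat (providers_id, country_code) pairs,
    e.g. as derived from /provider={name}/country={code} prefixes.
    """
    grouped = defaultdict(set)
    for providers_id, country_code in rows:
        grouped[providers_id].add(country_code)
    # Sort for a stable "countries" array in the response object
    return {pid: sorted(codes) for pid, codes in grouped.items()}

rows = [(42, "us"), (42, "gh"), (42, "us"), (7, "uk")]
print(countries_by_provider(rows))  # {42: ['gh', 'us'], 7: ['uk']}
```

The same grouping, keyed by country instead, would yield the inverse `providers` array.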

Any thoughts @caparker @majesticio

/v2/measurements - order by and sort not working

The sort values ASC and DESC do not work, and order_by does not work either.

In the following example, PM2.5 results start from date_from followed by O3 results also starting from date_from.

https://api.openaq.org/v2/measurements?location_id=2211&parameter_id=2&date_from=2023-06-07T14:12:39-04:00&date_to=2023-06-08T14:12:39-04:00&limit=100&sort=desc&order_by=datetime

My Python is not great, but I can see that order_by and sort are not being used in the SQL statement in the source code (measurements.py).

Thanks

bounding box queries

It would be nice to have bounding box queries in addition to the lat,lng + radius option.
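For context, a bounding box filter reduces to a simple per-point containment test, which is cheaper than a radius computation. A sketch assuming the common (min_lng, min_lat, max_lng, max_lat) ordering (the parameter shape is an assumption, not the API's):

```python
def in_bbox(lat, lng, bbox):
    """Check whether a point falls inside a bounding box.

    `bbox` is assumed to be (min_lng, min_lat, max_lng, max_lat),
    the ordering used by GeoJSON-style bbox values.
    """
    min_lng, min_lat, max_lng, max_lat = bbox
    return min_lat <= lat <= max_lat and min_lng <= lng <= max_lng

# A sample point in the UK against a box covering western Europe
print(in_bbox(52.45, -1.73, (-10.0, 45.0, 5.0, 60.0)))  # True
```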

"msg": "none is not an allowed value", "type": "type_error.none.not_allowed"

My queries were working a few days ago but now I'm getting 422 errors on requests.

curl https://api.openaq.org/v2/latest\?limit\=100\&page\=1\&offset\=0\&sort\=desc\&coordinates\=52.45355235%2C-1.73395439\&radius\=50000\&order_by\=lastUpdated\&dumpRaw\=false

yields:

{
  "detail": [
    {"loc": ["results", "3", "city"], "msg": "none is not an allowed value", "type": "type_error.none.not_allowed"},
    {"loc": ["results", "10", "city"], "msg": "none is not an allowed value", "type": "type_error.none.not_allowed"},
    {"loc": ["results", "11", "city"], "msg": "none is not an allowed value", "type": "type_error.none.not_allowed"},
    {"loc": ["results", "14", "city"], "msg": "none is not an allowed value", "type": "type_error.none.not_allowed"},
    {"loc": ["results", "15", "city"], "msg": "none is not an allowed value", "type": "type_error.none.not_allowed"}
  ]
}

AttributeError: module 'pandas.io.json' has no attribute 'json_normalize'

Summary

decorators.py accesses json_normalize as pandas.io.json.json_normalize, but the function can no longer be accessed that way.

When I change the code to call json_normalize directly:

import pandas as pd

data = pd.json_normalize(resp)

it works.

Error Output

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
File .../decorators.py:57, in pandasize.<locals>.decorator.<locals>.decorated_function(*args, **kwargs)
     53             d.append(tmp)
     55         resp = d
---> 57 data = pd.io.json.json_normalize(resp)
     59 # If there are any datetimes, make them datetimes!
     60 for each in [i for i in data.columns if 'date' in i]:

AttributeError: module 'pandas.io.json' has no attribute 'json_normalize'

/v2/measurements returning more results than limit

Hello,

I am using the v2 api to collect historical measurements in a specific area. For some location_ids (e.g. 8567) the API returns a greater number of results than the limit. For example, when the limit is 5, 8 results are returned.

https://api.openaq.org/v2/measurements?date_from=2000-01-01T00%3A00%3A00%2B00%3A00&date_to=2022-09-29T15%3A10%3A00%2B00%3A00&limit=5&page=1&offset=0&sort=desc&parameter=2&location_id=8567&order_by=datetime

Then the second page (page=2) returns no results, even though many measurements are found:
{"meta":{"name":"openaq-api","license":"CC BY 4.0d","website":"api.openaq.org","page":2,"limit":5,"found":61293},"results":[]}

However, for other location_ids, such as 14, the API behaves as expected, returning 5 results on both page 1 and 2.

This seems to happen in both v1 and v2.
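For reference, the invariant a paginated endpoint is expected to satisfy can be sketched as follows (an illustrative helper, not API code): each page holds at most `limit` rows, and pages before the end are never empty.

```python
def expected_page_size(found, limit, page):
    """Number of results page N (1-based) should return,
    given `found` total rows and a per-page `limit`."""
    start = (page - 1) * limit
    end = min(start + limit, found)
    return max(0, end - start)

# With found=61293 and limit=5, both reported behaviors are wrong:
print(expected_page_size(61293, 5, 1))  # 5 (never 8)
print(expected_page_size(61293, 5, 2))  # 5 (not empty)
```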

Thanks

devcontainer/Docker does not build on OSX

Building the local development version fails on OSX; it builds and runs successfully on Linux. The error seems to occur while building the jq wheel during the pip install step.

#17 118.8 ERROR: Could not build wheels for jq, which is required to install pyproject.toml-based projects

Might be connected to the slim-buster base image.

include_fields averagingPeriod with json

I am currently unable to get averagingPeriod with v2 and JSON output. I believe this previously worked.

For replication, I am copying the URLs built with the "Try it out" documentation interface (https://api.openaq.org/docs). Each is designed to return one (the same) measurement. Both request averagingPeriod, but only v1 returns it.

Version 1

https://api.openaq.org/v1/measurements?format=json&date_from=2022-10-01T00%3A00%3A00%2B00%3A00&date_to=2022-10-03T18%3A03%3A00%2B00%3A00&limit=1&page=1&offset=0&sort=desc&parameter=pm25&radius=1000&country_id=US&order_by=datetime&isMobile=false&isAnalysis=false&entity=government&sensorType=reference%20grade&include_fields=averagingPeriod

{
  "meta": {
    "name": "openaq-api",
    "license": "CC BY 4.0d",
    "website": "api.openaq.org",
    "page": 1,
    "limit": 1,
    "found": 73204
  },
  "results": [
    {
      "location": "St. Maries",
      "parameter": "pm25",
      "value": 10,
      "date": {
        "utc": "2022-10-03T16:00:00Z",
        "local": "2022-10-03T09:00:00-07:00"
      },
      "unit": "µg/m³",
      "coordinates": {
        "latitude": 47.3167,
        "longitude": -116.570297
      },
      "country": "US",
      "city": "BENEWAH",
      "averagingPeriod": {
        "unit": "seconds",
        "value": 3600
      }
    }
  ]
}

Version 2

https://api.openaq.org/v2/measurements?format=json&date_from=2022-10-01T00%3A00%3A00%2B00%3A00&date_to=2022-10-03T18%3A03%3A00%2B00%3A00&limit=1&page=1&offset=0&sort=desc&parameter=pm25&radius=1000&country_id=US&order_by=datetime&isMobile=false&isAnalysis=false&entity=government&sensorType=reference%20grade&include_fields=averagingPeriod

{
  "meta": {
    "name": "openaq-api",
    "license": "CC BY 4.0d",
    "website": "api.openaq.org",
    "page": 1,
    "limit": 1,
    "found": 73204
  },
  "results": [
    {
      "locationId": 1774,
      "location": "St. Maries",
      "parameter": "pm25",
      "value": 10,
      "date": {
        "utc": "2022-10-03T16:00:00+00:00",
        "local": "2022-10-03T09:00:00-07:00"
      },
      "unit": "µg/m³",
      "coordinates": {
        "latitude": 47.3167,
        "longitude": -116.570297
      },
      "country": "US",
      "city": "BENEWAH",
      "isMobile": false,
      "isAnalysis": false,
      "entity": "government",
      "sensorType": "reference grade"
    }
  ]
}

Thanks for any advice.

Averages response's date field is wrong

Current behavior

The date field (as seen in the beta version) is no longer there.

Instead you see year, month, or day, depending on the temporal query parameter.

And the value of that field is always in the format YYYY-MM-DD.

Part of the response:

{
    "id": 58364,
    "name": "MN",
    "unit": "µg/m³",
    "year": "2021-01-01",
    "average": 91.8671,
    "subtitle": "Mongolia",
    "parameter": "pm25",
    "displayName": "PM2.5",
    "parameterId": 2,
    "measurement_count": 277960
}

Expected behavior

The date field in the response should stay as date, regardless of temporal parameter.

The date format should correspond to the temporal parameter. For example:

  • For year it should be "date": "2021"
  • For month it should be "date": "2021-01"
  • For day it should be "date": "2021-01-01"
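The expected behavior above could be implemented with a simple format map; a sketch (the mapping is inferred from the examples above, not taken from the API's code):

```python
from datetime import date

# Hypothetical mapping from the `temporal` parameter to the
# expected `date` string format (illustrative only).
TEMPORAL_FORMATS = {"year": "%Y", "month": "%Y-%m", "day": "%Y-%m-%d"}

def format_average_date(d: date, temporal: str) -> str:
    """Render the averages `date` field per the temporal parameter."""
    return d.strftime(TEMPORAL_FORMATS[temporal])

print(format_average_date(date(2021, 1, 1), "year"))   # 2021
print(format_average_date(date(2021, 1, 1), "month"))  # 2021-01
print(format_average_date(date(2021, 1, 1), "day"))    # 2021-01-01
```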

How to reproduce

Temporal parameter: Year
https://api.openaq.org/v2/averages?spatial=country&temporal=year&country=MN&parameter=pm25

Temporal parameter: Month
https://api.openaq.org/v2/averages?spatial=country&temporal=month&country=MN&parameter=pm25

Temporal parameter: Day
https://api.openaq.org/v2/averages?spatial=country&temporal=day&country=MN&parameter=pm25

:bug: Cannot copy staticfiles in docker-compose installation of API

Environment

  • Docker version 20.10.8

I added additional environment variables to docker-compose.yml in order for the build to pass:

      - DATABASE_READ_USER=postgres
      - DATABASE_READ_PASSWORD=postgres
      - DATABASE_WRITE_USER=postgres
      - DATABASE_WRITE_PASSWORD=postgres
      - DATABASE_HOST=172.17.0.2
      - DATABASE_PORT=5432
      - DATABASE_DB=postgres
      - DOMAIN_NAME="api.openaq.org"

Summary of identified tasks related to the issue

I am trying to build the db and the openaqapi locally via the instructions provided in the "Historic Method" section of the readme.

After running the following:

cd .devcontainer
docker-compose build
docker-compose up

The API docker container keeps exiting with:

RuntimeError: Directory '/usr/local/lib/python3.9/site-packages/openaq_fastapi/static' does not exist

I have tried the following:

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

logger.info("relative path of static folder: %s", static_dir)

this returns:
relative path of static folder: /usr/local/lib/python3.9/site-packages/openaq_fastapi/static

After commenting out line 159 in openaq-api-v2/openaq_fastapi/openaq_fastapi/main.py, I get the following traceback:

api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 691, in _create_ssl_connection
api_1  |     tr, pr = await loop.create_connection(
api_1  |   File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1041, in create_connection
api_1  |     sock = await self._connect_sock(
api_1  |   File "/usr/local/lib/python3.9/asyncio/base_events.py", line 955, in _connect_sock
api_1  |     await self.sock_connect(sock, address)
api_1  |   File "/usr/local/lib/python3.9/asyncio/selector_events.py", line 502, in sock_connect
api_1  |     return await fut
api_1  | asyncio.exceptions.CancelledError
api_1  | 
api_1  | During handling of the above exception, another exception occurred:
api_1  | 
api_1  | Traceback (most recent call last):
api_1  |   File "/usr/local/lib/python3.9/asyncio/tasks.py", line 492, in wait_for
api_1  |     fut.result()
api_1  | asyncio.exceptions.CancelledError
api_1  | 
api_1  | The above exception was the direct cause of the following exception:
api_1  | 
api_1  | Traceback (most recent call last):
api_1  |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 635, in lifespan
api_1  |     async with self.lifespan_context(app):
api_1  |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 530, in __aenter__
api_1  |     await self._router.startup()
api_1  |   File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 612, in startup
api_1  |     await handler()
api_1  |   File "/usr/local/lib/python3.9/site-packages/openaq_fastapi/main.py", line 113, in startup_event
api_1  |     app.state.pool = await db_pool(None)
api_1  |   File "/usr/local/lib/python3.9/site-packages/openaq_fastapi/db.py", line 47, in db_pool
api_1  |     pool = await asyncpg.create_pool(
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 413, in _async__init__
api_1  |     await self._initialize()
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 441, in _initialize
api_1  |     await first_ch.connect()
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 133, in connect
api_1  |     self._con = await self._pool._get_new_connection()
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 511, in _get_new_connection
api_1  |     con = await connection.connect(
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/connection.py", line 2085, in connect
api_1  |     return await connect_utils._connect(
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 895, in _connect
api_1  |     raise last_error
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 881, in _connect
api_1  |     return await _connect_addr(
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 781, in _connect_addr
api_1  |     return await __connect_addr(params, timeout, True, *args)
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 825, in __connect_addr
api_1  |     tr, pr = await compat.wait_for(connector, timeout=timeout)
api_1  |   File "/usr/local/lib/python3.9/site-packages/asyncpg/compat.py", line 66, in wait_for
api_1  |     return await asyncio.wait_for(fut, timeout)
api_1  |   File "/usr/local/lib/python3.9/asyncio/tasks.py", line 494, in wait_for
api_1  |     raise exceptions.TimeoutError() from exc
api_1  | asyncio.exceptions.TimeoutError
api_1  | 
api_1  | ERROR:    Application startup failed. Exiting.

Not sure what is going on with the COPY of the static files. Thanks!

API returns negative counts

Some API queries are returning negative numbers in the found field of the meta object.

e.g.

/v2/measurements?date_from=2020-10-20T00%3A00%3A00%2B00%3A00&date_to=2022-11-21T16%3A49%3A00%2B00%3A00&limit=2000&page=1&offset=0&sort=desc&radius=1000&city=Chicago&order_by=datetime

returns:

{
  "meta": {
    "name": "openaq-api",
    "license": "CC BY 4.0d",
    "website": "test.openaq.org",
    "page": 1,
    "limit": 2000,
    "found": -8371
  },
  "results": []
}

Not all data is shown

When running, for example, the following query on the API: https://api.openaq.org/v2/measurements?country=LU&city=Luxemburg&location=LU0102A&parameter=o3&date_to=2021-05-10&date_from=2021-05-08, only 21 data points are returned, but according to the meta section there should be 107:

"meta": {
    "name": "openaq-api",
    "license": "CC BY 4.0d",
    "website": "https://u50g7n0cbj.execute-api.us-east-1.amazonaws.com/",
    "page": 1,
    "limit": 100,
    "found": 107
}

Not sure where the 107 comes from; this seems faulty as well, since in my opinion there should be one data point per hour, so the result should contain 48 data points.

The same happens for almost every other day: some of the hourly data points are not returned.

What am I missing here?

Documentation seems to not specify format of returned data

It seems that the API docs located at https://docs.openaq.org/ don't include any schemas for the returned data.
Each response links to a relevant schema for successful returns, but these seem to include only the meta fields; the results field is either left blank or shown as an array with no details of its contents.
Am I missing something here, or is something broken?

V3 license fields

I like the schema used by OpenAddresses for licensing info here:

https://github.com/openaddresses/openaddresses/blob/master/CONTRIBUTING.md#optional-address-tags

I think we could return a JSON object very similar to this from the providers endpoint. From the database side, I'm thinking we may want the option to support multiple licenses per provider (or project). This could be relevant when a license changes over time, e.g. Provider A had license CC BY 4.0 from 2012-2016 and CC BY 1.0 from 2016 on. I would think a separate DB table, licenses, with columns mapping to those listed on openaddresses, would facilitate this. @caparker

Unclear documentation on creating environment variables

It would help to mention, for each environment variable, what value should be substituted and how to create or find it when setting the project up locally, and whether any other repo is needed to set up openaq-api-v2 locally. This would help beginners.

v2/locations/tiles calls and cors

I know it's quite a new API and probably far from finished, but I was playing with it and the tile calls end up being rejected because of the CORS policy.
The rest of the calls seem to work fine.

Pydantic for query parameters

Using Pydantic to validate and build the query parameter objects has been causing some issues, mostly around @validator steps. Raising a ValueError results in an HTTP 500 instead of an HTTP 422. This has been encountered by other users, e.g.:

fastapi/fastapi#2180

One solution is to raise a FastAPI HTTPException, as discussed in the issue, but this seems to have some reusability issues.

This issue:

fastapi/fastapi#1474 (comment)

seems to present some other solutions that may be worth exploring.
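A minimal sketch of the HTTPException-style workaround: wrap each validator so a ValueError surfaces as a 422 rather than bubbling up as a 500. The exception class below is a stand-in that mimics FastAPI's, not the real import, and `check_limit` is a hypothetical validator:

```python
class HTTPException(Exception):
    """Stand-in mimicking fastapi.HTTPException (illustrative only)."""
    def __init__(self, status_code, detail):
        self.status_code = status_code
        self.detail = detail

def as_422(validator):
    """Wrap a validator so ValueError becomes a 422 response, not a 500."""
    def wrapped(value):
        try:
            return validator(value)
        except ValueError as exc:
            raise HTTPException(status_code=422, detail=str(exc))
    return wrapped

@as_422
def check_limit(value):
    # Hypothetical query-parameter validator
    if value < 1:
        raise ValueError("limit must be >= 1")
    return value

try:
    check_limit(0)
except HTTPException as exc:
    print(exc.status_code, exc.detail)  # 422 limit must be >= 1
```

The reusability concern is that every validator needs the wrapper (or every route the exception handler), which is what the linked issues discuss alternatives to.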

nodes.py unused

The entire nodes.py file seems to be unused. Let's double-check usage and possible future utility; if it's not relevant we can remove it.

/v1/cities - new schema retrofit

Update the /v1/cities endpoint to match the v1 schema and fully deprecate the name property:

{
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Generated schema for Root",
    "type": "object",
    "properties": {
      "country": {
        "type": "string"
      },
      "city": {
        "type": "string"
      },
      "count": {
        "type": "number"
      },
      "locations": {
        "type": "number"
      }
    },
    "required": [
      "country",
      "city",
      "count",
      "locations"
    ],
    "additionalProperties": false
  }
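A quick client-side conformance check against that schema: since all four properties are required and additionalProperties is false, key-set equality plus type checks is enough (an illustrative helper, not part of the API):

```python
REQUIRED_TYPES = {
    "country": str,
    "city": str,
    "count": (int, float),
    "locations": (int, float),
}

def matches_v1_city_schema(obj):
    """Rough check of a /v1/cities row against the schema above:
    exactly the required keys, each with the right primitive type."""
    if set(obj) != set(REQUIRED_TYPES):
        return False
    return all(isinstance(obj[k], REQUIRED_TYPES[k]) for k in REQUIRED_TYPES)

ok = {"country": "US", "city": "BENEWAH", "count": 100, "locations": 2}
stale = {**ok, "name": "BENEWAH"}  # deprecated property must be gone
print(matches_v1_city_schema(ok), matches_v1_city_schema(stale))  # True False
```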

Month and annual rollups seem to be taking averages of averages

This may not be an issue, depending on what you are going for, but I thought I would point it out. Based on what I can see in the file with the rollup methods, it looks like the daily rollups are done directly on the measurements table, but the monthly and annual rollups are done on the daily and monthly values respectively. This would give you:

  • daily average of measurements (not sure the interval)
  • monthly averages of daily averages and
  • annual averages of the monthly averages

But not the daily, monthly and annual averages of the measurements.
Which is fine, of course, if that is what you were after, but I couldn't find anything stating what the aggregate values were expected to be, so I wanted to point that out.
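A small numeric example of why the two definitions diverge whenever days have different sample counts:

```python
def mean(xs):
    return sum(xs) / len(xs)

# Two days with different sample counts: 1 measurement vs 3.
day1 = [10]
day2 = [40, 40, 40]

# Average of daily averages weights each day equally...
avg_of_avgs = mean([mean(day1), mean(day2)])  # (10 + 40) / 2

# ...while the average over all raw measurements weights each sample equally.
avg_of_all = mean(day1 + day2)                # 130 / 4

print(avg_of_avgs, avg_of_all)  # 25.0 32.5
```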

API Docs need to be updated

Since moving to this new version of the API, some of the information that existed on the old API docs did not make it over.

We need to update the current docs to make sure the API is easy to use:

  • Remove default section
  • Add introduction (what's in the old docs is a good starting point, though some information needs to be updated)
  • Document all v2 endpoints (some are new or have new functionality). Same as above, test it out!
  • Document all v1 endpoints, with same or updated information from the old docs . Test to make sure the API works as the docs say. Log a bug in a separate issue if not.
  • Cleanup and clarify API parameters. When you 'Try it out' sometimes it's unclear what the parameters are or what the input should be
  • Remove Schemas section or provide more documentation about it

The cool thing about the Swagger docs is that they are created directly from the API code. For example, to modify the cities endpoint description you would start at the city router code. Also check out the Swagger documentation for more info.

v2/averages is not working

I am receiving "Internal Server Error" when trying to get averages using country code "DE".
Is it a bug? Are there any plans to fix it?

pairing locations with averages

The v2/averages API is extremely useful. I am able to ask for monthly location average to get a historic view of a bunch of locations. I am using the temporal=month and spatial=location. So, I can get a year of location monthly averages for a parameter quickly (about 40s) -- this is awesome!

Once I have the averages, I want to get the coordinates and metadata of the locations. The name field seems to correspond to the locations.id, which generally makes sense. Right now, I use the v2/location API to find all locations that measure the parameter. Then, I can join averages based on averages.name == locations.id. Getting the location data takes about 6 minutes, which I suspect is limited by the fact that it returns the latest observations.

It seems odd that I can get the averages for a year faster than I can get the coordinates. Any recommendations on how to get the coordinates and metadata faster?

p.s., In addition to the speed, I suspect I am putting unnecessary pressure on the server.
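The client-side join described above can be sketched as follows. Field names follow the report (averages "name" matching locations "id" when spatial=location); `attach_coordinates` is an illustrative helper, not an API feature:

```python
def attach_coordinates(averages, locations):
    """Join v2/averages rows to v2/locations rows on
    averages["name"] == locations["id"] (spatial=location)."""
    by_id = {loc["id"]: loc for loc in locations}
    out = []
    for row in averages:
        loc = by_id.get(int(row["name"]))
        if loc is not None:
            out.append({**row, "coordinates": loc["coordinates"]})
    return out

averages = [{"name": "58364", "average": 91.9}]
locations = [{"id": 58364, "coordinates": {"latitude": 47.3, "longitude": -106.9}}]
print(attach_coordinates(averages, locations))
```

The slow part remains fetching the locations themselves, so this only helps once both responses are in hand.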

api error for a specific record

I am running into an issue where the averages interface works for most requests, but fails for a very specific record. I started by paging with a limit of 1000 per page. The second page failed, but the rest did not. I then changed the page size (500, 250, 50, etc). I am only able to reproduce the problem with page sizes of 3 or more. The URL below shows the problem with just 3 records.

https://api.openaq.org/v2/averages?date_from=2020-01-01&date_to=2020-12-31&parameter=pm25&limit=3&page=594&offset=0&sort=desc&spatial=location&temporal=month&group=false

Returns

{"message":"internal server error"}

Note that the same URL with page=593 or page=595 works. The version below reproduces the problem with 1000 items per page.

https://api.openaq.org/v2/averages?date_from=2020-01-01&date_to=2020-12-31&parameter=pm25&limit=1000&page=2&offset=0&sort=desc&spatial=location&temporal=month&group=false

Returns

{"message":"internal server error"}

Note that the same URL with page=1 or page=3 works just fine. I don't know if this is related to a corrupt record or to the API itself.

Bug: Pagination not working or found meta parameter incorrect

If I understand it correctly, a response like this shouldn't be possible:

{
    "meta": {
        "name": "openaq-api",
        "license": "CC BY 4.0d",
        "website": "api.openaq.org",
        "page": 2,
        "limit": 10000,
        "found": 2216436
    },
    "results": []
}

This result is produced by following this link https://api.openaq.org/v2/measurements?coordinates=42.698029%2C23.322718&radius=10000&page=2&limit=10000&parameter=pm25&order_by=datetime&sort=asc

I don't know whether the found parameter is wrong or pagination is broken.
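Given the meta block, the highest page that should still return results can be derived directly, and page 2 is far below it (an illustrative helper, not API code):

```python
import math

def last_nonempty_page(found, limit):
    """Highest 1-based page that should still return results,
    if the `found` count in the meta block is accurate."""
    return max(1, math.ceil(found / limit))

# With found=2216436 and limit=10000, pages up to 222 should be non-empty,
# so an empty page 2 means either `found` or pagination is wrong.
print(last_nonempty_page(2216436, 10000))  # 222
```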

/measurements API not returning more than 10,000 results for reference-grade data

When trying to use /measurements to get more than ~10,000 results like so:
(v1) https://api.openaq.org/v1/measurements?limit=6000&page=8
(v2) https://api.openaq.org/v2/measurements?limit=6000&page=4&sensorType=reference%20grade

the API returns the following error:

message: "Service Unavailable"

When an API request with just the limit parameter is used like so
https://api.openaq.org/v2/measurements?sensorType=reference%20grade&limit=10000
it works fine, but not when the limit is increased to 20000. Then we just get

detail : ""

Perhaps the DB is timing out?

v2/latest/{location_id} - Parameter ID and Parameter not working

In my case, I can't get 'parameter_id' or 'parameter' to work. I am trying to get PM2.5 but am also getting O3. I was OK with that initially, but the order in which they appear varies: sometimes PM2.5 is first, other times O3.

Here are my queries

https://api.openaq.org/v2/latest/2211?limit=1&page=1&offset=0&sort=desc&parameter_id=2&radius=1000&order_by=lastUpdated

https://api.openaq.org/v2/latest/2211?limit=1&page=1&offset=0&sort=desc&parameter_id=2&parameter=pm25&radius=1000&order_by=lastUpdated

https://api.openaq.org/v2/latest/2211?limit=1&page=1&offset=0&sort=desc&parameter=pm25&radius=1000&order_by=lastUpdated

I have a similar issue with /v2/measurements which I will log separately.

Thanks.

`order_by=distance` removed and `coordinates` no longer works

Not sure if the former part is actually a bug, but I couldn't find any information on it. Was support for order_by=distance removed?

Also, I can no longer get any query involving coordinates to work. I've tried using the UI here too: https://docs.openaq.org/#/v1/locationsv1_get_v1_locations_get.

Sample URL which used to work:

https://api.openaq.org/v1/locations?coordinates=35.8001%2C-78.7119&limit=5&order_by=distance

If you remove order_by=distance, it just gives an unspecified server error.
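As a client-side workaround while order_by=distance is unavailable, results can be sorted by great-circle distance locally; a sketch using the haversine formula (`haversine_km` is an illustrative helper):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

origin = (35.8001, -78.7119)  # coordinates from the sample URL
locations = [
    {"name": "far", "coordinates": (36.5, -79.5)},
    {"name": "near", "coordinates": (35.81, -78.72)},
]
locations.sort(key=lambda l: haversine_km(*origin, *l["coordinates"]))
print([l["name"] for l in locations])  # ['near', 'far']
```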
