
datacube-ows's Introduction

datacube-ows


Datacube Open Web Services

Datacube-OWS provides a way to serve data indexed in an Open Data Cube as visualisations, through open web services (OGC WMS, WMTS and WCS).

Features

  • Leverages the power of the Open Data Cube, including support for COGs on S3.
  • Fully supports WMS v1.3.0. Partial support (GetMap requests only) for v1.1.1.
  • Supports WMTS 1.0.0.
  • Supports WCS versions 1.0.0, 2.0.0 and 2.1.0.
  • Richly featured styling engine for serving data visualisations via WMS and WMTS.
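All three services are driven by plain HTTP GET requests with query parameters. As a sketch, a WMS GetCapabilities URL can be constructed like this (the hostname is a placeholder, not a real endpoint):

```python
from urllib.parse import urlencode

# Build a WMS 1.3.0 GetCapabilities URL. The hostname is a placeholder.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetCapabilities",
}
url = "http://ows.example.com/?" + urlencode(params)
print(url)  # http://ows.example.com/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetCapabilities
```

The same pattern applies to WMTS and WCS requests, with the SERVICE and REQUEST parameters changed accordingly.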

Community

This project welcomes community participation.

Join the ODC Slack if you need help setting up or using this project, or the Open Data Cube more generally. Conversation about datacube-ows is mostly concentrated in the Slack channel #wms.

Please help us to keep the Open Data Cube community open and inclusive by reading and following our Code of Conduct.

Setup

Datacube_ows (and datacube_core itself) has many complex dependencies on particular versions of geospatial libraries. Dependency conflicts are almost unavoidable in environments that also contain other large complex geospatial software packages. We therefore strongly recommend some kind of containerised solution and we supply scripts for building appropriate Docker containers.

Linting

flake8 . --exclude Dockerfile --ignore=E501 --select=F401,E201,E202,E203,E502,E241,E225,E306,E231,E226,E123,F811
isort --check --diff **/*.py
autopep8  -r  --diff . --select F401,E201,E202,E203,E502,E241,E225,E306,E231,E226,E123,F811

Configuration and Environment

The configuration file format for OWS is fully documented here.

An example configuration file, datacube_ows/ows_cfg_example.py, is also provided, but may not be as up-to-date as the formal documentation.

Environment variables that directly or indirectly affect the running of OWS are documented here.

Docker-Compose

Set up the environment with exports

We use docker-compose to make development and testing of the containerised OWS images easier.

Set up your environment by creating a .env file (see below).

To start OWS with flask connected to a pre-existing database on your local machine:

docker-compose up

The first time you run docker-compose, you will need to add the --build option:

docker-compose up --build

To start OWS with a pre-indexed database:

docker-compose -f docker-compose.yaml -f docker-compose.db.yaml up

To start OWS with the database and gunicorn instead of flask (production):

docker-compose -f docker-compose.yaml -f docker-compose.db.yaml -f docker-compose.prod.yaml up

The default environment variables (in the .env file) can be overridden by setting local environment variables:

# Enable pydev for pycharm (needs rebuild to install python libs)
# hot reload is not supported, so we need to set FLASK_DEV to production
export PYDEV_DEBUG=yes
export FLASK_DEV=production
docker-compose -f docker-compose.yaml -f docker-compose.db.yaml up --build

Set up the environment with a .env file

cp .env_simple .env # for a single ows config file setup
cp .env_ows_root .env # for multi-file ows config with ows_root_cfg.py
docker-compose up

Docker

To run the standard Docker image, create a Docker volume containing your OWS config files and use something like:

docker build --tag=name_of_built_container .

docker run --rm \
      -e DATACUBE_OWS_CFG=datacube_ows.config.test_cfg.ows_cfg \  # Location of config object
      -e AWS_NO_SIGN_REQUEST=yes \                               # Allow access to public AWS S3 buckets
      -e AWS_DEFAULT_REGION=ap-southeast-2 \                     # AWS default region (supply even if NOT accessing files on S3! See Issue #151)
      -e SENTRY_DSN=https://[email protected]/projid \            # Key for Sentry logging (optional)
      -e DB_HOSTNAME=172.17.0.1 -e DB_PORT=5432 \                # Hostname/IP address and port of ODC postgres database
      -e DB_DATABASE=datacube \                                  # Name of ODC postgres database
      -e DB_USERNAME=cube -e DB_PASSWORD=DataCube \              # Username and password for ODC postgres database
      -e PYTHONPATH=/code \                                      # The default PATH is under env; change this to target /code
      -p 8080:8000 \                                             # Publish the gunicorn port (8000) on the Docker
      \                                                          # container at port 8080 on the host machine.
      --mount source=test_cfg,target=/code/datacube_ows/config \ # Mount the docker volume where the config lives
      name_of_built_container

The image is based on the standard ODC container.

Installation with Conda

The following instructions are for installing on a clean Linux system.

  • Create a Python 3.8 conda environment and activate it:

    conda create -n ows -c conda-forge python=3.8 datacube pre_commit postgis
    conda activate ows
  • Install the latest release using pip:

    pip install datacube-ows[all]
  • Set up a database:

    pgdata=$(pwd)/.dbdata
    initdb -D ${pgdata} --auth-host=md5 --encoding=UTF8 --username=ubuntu
    pg_ctl -D ${pgdata} -l "${pgdata}/pg.log" start # if this step fails, check log in ${pgdata}/pg.log
    
    createdb ows -U ubuntu
  • Enable the postgis extension:

    psql -d ows
    create extension postgis;
    \q
  • Initialise the datacube and OWS schemas:

    export DATACUBE_DB_URL=postgresql:///ows
    datacube system init
    
    # Create the schema, tables and materialised views used by datacube-ows:
    
    export DATACUBE_OWS_CFG=datacube_ows.ows_cfg_example.ows_cfg
    datacube-ows-update --role ubuntu --schema
  • Create a configuration file for your service, and all data products you wish to publish in it. Detailed documentation of the configuration format can be found here.
  • Set environment variables as required. Environment variables that directly or indirectly affect the running of OWS are documented here.
  • Run datacube-ows-update (in the Datacube virtual environment).
  • When additional datasets are added to the datacube, the following steps will need to be run:

    datacube-ows-update --views
    datacube-ows-update
  • If you are accessing data on AWS S3 and running datacube_ows on Ubuntu, you may encounter errors with GetMap similar to: Unexpected server error: '/vsis3/bucket/path/image.tif' not recognized as a supported file format. If this occurs, run the following commands:

    mkdir -p /etc/pki/tls/certs
    ln -s /etc/ssl/certs/ca-certificates.crt /etc/pki/tls/certs/ca-bundle.crt
  • Launch the flask app using your favourite WSGI server. We recommend using Gunicorn with either nginx or a load balancer.

The following approaches have also been tested:

Flask Dev Server

  • Good for initial dev work and testing. Not (remotely) suitable for production deployments.
  • cd to the directory containing this README file.
  • Set the FLASK_APP environment variable:

    export FLASK_APP=datacube_ows/ogc.py
  • Run the Flask dev server:

    flask run
  • If you want the dev server to listen to external requests (i.e. requests from other computers), use the --host option:

    flask run --host=0.0.0.0

Local Postgres database

  1. Create an empty database and database user.
  2. Run datacube system init after creating a datacube config file.
  3. Add a product to your datacube (datacube product add <url>); some examples are here: https://github.com/GeoscienceAustralia/dea-config/tree/master/products
  4. Index datasets into your product; for an example, refer to https://datacube-ows.readthedocs.io/en/latest/usage.html

    aws s3 ls s3://deafrica-data/jaxa/alos_palsar_mosaic/2017/ --recursive \
    | grep yaml | awk '{print $4}' \
    | xargs -n1 -I {} datacube dataset add s3://deafrica-data/{}
  5. Write an ows config file to identify the products you want available in ows, see example here: https://github.com/opendatacube/datacube-ows/blob/master/datacube_ows/ows_cfg_example.py
  6. Run datacube-ows-update --schema --role <db_read_role> to create the OWS-specific tables.
  7. Run datacube-ows-update to generate ows extents.

Apache2 mod_wsgi

Getting things working with Apache2 mod_wsgi is not trivial and probably not the best approach in most circumstances, but it may make sense for you.

If you use the pip install approach described above, your OS's pre-packaged python3 apache2-mod-wsgi package should suffice.

  • Activate the wsgi module:
cd /etc/apache2/mods-enabled
ln -s ../mods-available/wsgi.load .
ln -s ../mods-available/wsgi.conf .
  • Add the following to your Apache config (inside the appropriate VirtualHost section):

    WSGIDaemonProcess datacube_ows processes=20 threads=1 user=uuu group=ggg maximum-requests=10000
    WSGIScriptAlias /datacube_ows /path/to/source_code/datacube-ows/datacube_ows/wsgi.py
    <Location /datacube_ows>
            WSGIProcessGroup datacube_ows
    </Location>
    <Directory /path/to/source_code/datacube-ows/datacube_ows>
            <Files wsgi.py>
                    AllowOverride None
                    Require all granted
            </Files>
    </Directory>

    Note that uuu and ggg above are the user and group of the owner of the Conda virtual environment.

  • Copy datacube_ows/wsgi.py to datacube_ows/local_wsgi.py and edit to suit your system.
  • Update the URL in the configuration.

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.


datacube-ows's Issues

GetFeatureInfo pixel value is not consistent with GetMap representation for WOFLs

  • datacube-wms version: 0.9.0

Description

When there are overlapping WOFLs datasets for a single day, GetFeatureInfo can return a different pixel value than the one represented in the GetMap response. I suspect this is because pixel fusing rules are not observed by GetFeatureInfo.

What I Did

To observe this behaviour:
https://nationalmap.gov.au/#share=s-1quhEDbI8X4Pw4OdcTxo2190uIk
Set the GSKY "DEA Water Observation Feature Layer" to 30% opacity to see where the two services are resolving pixels differently, then click on one of the differing pixels.
OWS will return a value in GetFeatureInfo consistent with the GSKY GetMap, not the OWS GetMap.
Let me know if that is insufficient information.


Datasets on or near the international dateline (e.g., Fiji) are not handled properly

  • datacube-wms version: latest Docker build on 21 May 2019
  • Python version: Python3
  • Operating System: Linux

Description

No data for Fiji is rendered.

WMS GetFeatureInfo returns a list of dates where data is available, and this is different to the list of dates that GetCapabilities returns.

What I Did

  • Indexed USGS Landsat 8 Surface Reflectance for Fiji
  • Set up OWS on top of it
  • Noted that I can't render data for Fiji from OWS in Terria

WCS Multi-time NetCDF requests are broken

NetCDF can nominally store multiple time values in the one file, and it was originally intended that multi-time value WCS requests would be supported for the NetCDF format.

Such requests are currently broken. (See #32 for the exact error message)

The problem appears to be the group_datasets(... "by_solar_day") call. This results in datasets being passed to read_data in a DataArray of DataArrays instead of a DataArray of Datasets, which causes an error to be thrown deep inside ODC.

WCS GetCoverage requests fail with format=netCDF

Description

GetCoverage requests with the netCDF format return an error.

What I Did

Issued GetCoverage requests from a web browser, with the format set to netCDF, and received ServiceExceptionReport errors. Example request:

https://geomedian.services.dea.ga.gov.au/?SERVICE=WCS&VERSION=1.0.0&REQUEST=GetCoverage&COVERAGE=ls5_nbart_geomedian_annual&CRS=EPSG:4326&BBOX=133,-28,134,-27&FORMAT=netCDF&WIDTH=1000&HEIGHT=1000

The following ServiceExceptionReport is returned:

<?xml version='1.0' encoding="UTF-8"?>
<ServiceExceptionReport version="1.2.0"
xmlns="http://www.opengis.net/ogc"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wcs/1.0.0/OGC-exception.xsd">
    <ServiceException>Unexpected server error: &#39;DataArray&#39; object has no attribute &#39;uri_scheme&#39;</ServiceException>
    <ServiceException>
        <![CDATA[ <FrameSummary file /code/datacube_wms/ogc.py, line 59 in ogc_impl> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/wcs.py, line 23 in handle_wcs> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/wcs.py, line 105 in get_coverage> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/wcs_utils.py, line 344 in get_coverage_data> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/data.py, line 317 in data> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/data.py, line 157 in read_data> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/api/core.py, line 525 in load_data> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/api/core.py, line 440 in create_storage> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/api/core.py, line 440 in <listcomp>> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/api/core.py, line 501 in data_func> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/api/core.py, line 634 in _fuse_measurement> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/api/core.py, line 634 in <listcomp>> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/drivers/readers.py, line 92 in new_datasource> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/datacube/drivers/readers.py, line 71 in choose_datasource> ]]>
        <![CDATA[ <FrameSummary file /usr/local/lib/python3.6/dist-packages/xarray/core/common.py, line 176 in __getattr__> ]]>
    </ServiceException>
</ServiceExceptionReport>

Same requests with format set to GeoTIFF are successful, e.g.

https://geomedian.services.dea.ga.gov.au/?SERVICE=WCS&VERSION=1.0.0&REQUEST=GetCoverage&COVERAGE=ls5_nbart_geomedian_annual&CRS=EPSG:4326&BBOX=133,-28,134,-27&FORMAT=GeoTIFF&WIDTH=1000&HEIGHT=1000

AWS environment variables required by a purely local OWS instance

An OWS instance that accesses purely local data (nothing from AWS) fails on the first data read if the AWS_DEFAULT_REGION environment variable is not set.

Possible solutions:

  1. Document current behaviour.
  2. Allow config option to say "this is a local data only OWS instance", in which case the AWS setting is not required. This may require extensive changes to the way datacube initialisation is done by OWS.

WMS GetMap fails at larger scales when WIDTH and HEIGHT unequal

  • datacube-wms version: Testing carried out using WMS endpoint http://wms.datacube.org.au
  • Python version: N/A
  • Operating System: N/A

Description

At scales above ~1:800,000 (when viewing in QGIS), GetMap requests to the service fail with a 500 status code if the WIDTH and HEIGHT URL parameters are not equal. The issue does not occur if the WIDTH and HEIGHT URL parameters are equal.

What I Did

Testing was carried out using an in-house conformance testing script, QGIS to invoke GetMap requests on the WMS, and hand-entering GetMap requests into a browser.

Example of successful GetMap request with non-equal WIDTH and HEIGHT URL parameters (1:800,000 scale) -
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&BBOX=-1247885.831948718987,-2188065.894372453447,-1054337.831751300022,-2053615.227568647126&CRS=EPSG:3577&WIDTH=1143&HEIGHT=794&LAYERS=ls8_level1_usgs__111&STYLES=&FORMAT=image/png&DPI=120&MAP_RESOLUTION=120&FORMAT_OPTIONS=dpi:120&TRANSPARENT=TRUE

Example of unsuccessful GetMap request with non-equal WIDTH and HEIGHT URL parameters (1:1,000,000 scale) –
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&BBOX=-1272079.331973396242,-2204872.227722929325,-1030144.331726622768,-2036808.894218171481&CRS=EPSG:3577&WIDTH=1143&HEIGHT=794&LAYERS=ls8_level1_usgs__111&STYLES=&FORMAT=image/png&DPI=120&MAP_RESOLUTION=120&FORMAT_OPTIONS=dpi:120&TRANSPARENT=TRUE

Above GetMap request modified with equal WIDTH and HEIGHT URL parameters, successfully returns an image (albeit with a distorted aspect ratio) –
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&BBOX=-1272079.331973396242,-2204872.227722929325,-1030144.331726622768,-2036808.894218171481&CRS=EPSG:3577&WIDTH=800&HEIGHT=800&LAYERS=ls8_level1_usgs__111&STYLES=&FORMAT=image/png&DPI=120&MAP_RESOLUTION=120&FORMAT_OPTIONS=dpi:120&TRANSPARENT=TRUE

WMS fails to relate alias band names in configs to formal band names in the datacube

  • datacube-wms version: Testing carried out with a local installation running in VDI
  • Python version: 3.6.5

Description

The WMS server failed for the ls8_nbar_scene product, being unable to identify alias band names. Of course, this depends on how the config files are specified. Alias name resolution is required in a number of places. From what I have come across, both DataStacker and perhaps the base class StyleDefBase would require this knowledge.

Solution

I have simply added a dictionary mapping the alias band names to the corresponding formal name in both DataStacker and StyleDefBase, but this may not be the ideal solution. Perhaps datacube-core could provide band-matching capability.
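The dictionary-based workaround described above can be sketched as a simple alias-resolution table (the alias and formal band names below are illustrative, not taken from any real product definition):

```python
# Map configured alias band names to the formal band names known to the
# datacube. These entries are illustrative placeholders only.
BAND_ALIASES = {
    "red": "band_4",
    "green": "band_3",
    "blue": "band_2",
    "nir": "band_5",
}

def formal_band_name(name: str) -> str:
    """Resolve a (possibly aliased) band name to its formal name.

    Names that are not aliases are assumed to already be formal names
    and are returned unchanged.
    """
    return BAND_ALIASES.get(name, name)
```

Both DataStacker and StyleDefBase would consult such a table wherever a configured band name needs to be matched against the datacube's formal band names.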

WCS GetCapabilities validation issues

Description

XML Schema validation of the WCS 1.0.0 GetCapabilities document failed.

What I Did

Validated the WCS 1.0.0 GetCapabilities document using the Oxygen XML editor with the Saxon EE XML Schema validation engine. Validation was carried out against the XSD referenced in the schema location attribute of the GetCaps document. The following issues were found:

  1. /WCS_Capabilities/Service/description element in wrong location.

    Currently:

<Service>
    <name>WCS</name>
    <label>Digital Earth Australia Surface Reflectance 25m Geomedian</label>
    <description>
        Data is only visible …..
    </description>

Should be:

<Service>
    <description>
        Data is only visible …..
    </description>
    <name>WCS</name>
    <label>Digital Earth Australia Surface Reflectance 25m Geomedian</label>
  2. /WCS_Capabilities/Service/responsibleParty/contactInfo/phone element in wrong location.

    Currently:

        <contactInfo>
            <address>
                <deliveryPoint>GPO Box 378</deliveryPoint>
                <city>Canberra</city>
                <administrativeArea>ACT</administrativeArea>
                <postalCode>2906</postalCode>
                <country>Australia</country>
            </address>
            <phone>
                <voice>+61 2 6249 9111</voice>
            </phone>

Should be:

        <contactInfo>
            <phone>
                <voice>+61 2 6249 9111</voice>
            </phone>
            <address>
                <deliveryPoint>GPO Box 378</deliveryPoint>
                <city>Canberra</city>
                <administrativeArea>ACT</administrativeArea>
                <postalCode>2906</postalCode>
                <country>Australia</country>
            </address>
  3. /WCS_Capabilities/Service/responsibleParty/contactInfo/address/electronicMailAddress element in wrong location.

    Currently:

<contactInfo>
            <address>
                <deliveryPoint>GPO Box 378</deliveryPoint>
                <city>Canberra</city>
                <administrativeArea>ACT</administrativeArea>
                <postalCode>2906</postalCode>
                <country>Australia</country>
            </address>
            <phone>
                <voice>+61 2 6249 9111</voice>
            </phone>
            <address>
                <electronicMailAddress>[email protected]</electronicMailAddress>
            </address>

Should be:

        <contactInfo>
            <phone>
                <voice>+61 2 6249 9111</voice>
            </phone>
            <address>
                <deliveryPoint>GPO Box 378</deliveryPoint>
                <city>Canberra</city>
                <administrativeArea>ACT</administrativeArea>
                <postalCode>2906</postalCode>
                <country>Australia</country>
                <electronicMailAddress>[email protected]</electronicMailAddress>
            </address>
  4. /WCS_Capabilities/ContentMetadata/CoverageOfferingBrief/description elements in wrong location.
    For example:
<ContentMetadata>
    <CoverageOfferingBrief>
        <name>ls8_nbart_geomedian_annual</name>
        <label>Surface Reflectance 25m Annual Geomedian (Landsat 8)</label>
        <description>Surface Reflectance…</description>

Should be:

<ContentMetadata>
    <CoverageOfferingBrief>
        <description>Surface Reflectance …</description>
        <name>ls8_nbart_geomedian_annual</name>
        <label>Surface Reflectance 25m Annual Geomedian (Landsat 8)</label>

URIs in WMS 1.3.0 GetCapabilities document missing http:// protocol

  • datacube-wms version: Testing carried out using WMS endpoint http://wms.datacube.org.au
  • Python version: N/A
  • Operating System: N/A

Description

URIs provided in the WMS 1.3.0 GetCapabilities document are missing the http:// protocol, e.g.:
xlink:href="alb-wms-dev-1627146768.us-west-2.elb.amazonaws.com/?"
should be
xlink:href="http://alb-wms-dev-1627146768.us-west-2.elb.amazonaws.com/?"

The following GetCapabilities elements contain xlink:href attributes with URIs that have the missing http:// protocol:

/WMS_Capabilities/Service/OnlineResource
/WMS_Capabilities/Capability/Request/GetCapabilities/DCPType/HTTP/Get/OnlineResource
/WMS_Capabilities/Capability/Request/GetMap/DCPType/HTTP/Get/OnlineResource
/WMS_Capabilities/Capability/Request/GetFeatureInfo/DCPType/HTTP/Get/OnlineResource

This is causing desktop GIS applications including QGIS and ArcMap to fail to make GetMap and GetFeature requests to the WMS.

What I Did

QGIS by default inspects the GetCapabilities statement for the GetMap and GetFeatureInfo URIs in the respective OnlineResource elements. Unless QGIS has the "Ignore GetMap URI reported in capabilities" and "Ignore GetFeatureInfo URI reported in capabilities" options set for the WMS, it attempts to use the URIs provided in the getcaps for GetMap and GetFeatureInfo requests which fail because the http:// protocol is missing.

I believe the same issue occurs in ArcMap, although it does not appear to have the option of ignoring the URIs provided in the getcaps.
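Until this is fixed server-side, a client can defensively prepend a scheme before using the advertised URIs. A minimal sketch (the helper name is ours, not part of any library):

```python
def ensure_scheme(uri: str, default: str = "http") -> str:
    """Prepend a scheme if a GetCapabilities URI lacks one.

    Only http/https are considered, matching the OnlineResource URIs
    described in this issue.
    """
    if uri.startswith(("http://", "https://")):
        return uri
    return f"{default}://{uri}"
```

For example, ensure_scheme("alb-wms-dev-1627146768.us-west-2.elb.amazonaws.com/?") yields a usable http:// URL, while URIs that already carry a scheme pass through unchanged.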

Small scale GetMap requests timeout with a 504 error

Description

When zoomed out beyond scales of ~1:500,000 (e.g. using QGIS), up to the zoom threshold where the imagery is replaced by the satellite data footprint, GetMap requests time out with a 504 error.

What I Did

Example GetMap request:
https://nrt-au.dea.ga.gov.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&BBOX=-12.31756108428356988,129.8093713543225647,-11.18898478344616088,131.4880219982633491&CRS=EPSG:4326&WIDTH=1181&HEIGHT=794&LAYERS=s2b_nrt_granule_nbar_t&STYLES=simple_rgb&FORMAT=image/png&DPI=120&MAP_RESOLUTION=120&FORMAT_OPTIONS=dpi:120&TRANSPARENT=TRUE

GetFeatureInfo requests fail with CRS=EPSG:3577

Description

GetFeatureInfo requests using CRS EPSG:3577 fail (the other two supported CRSs are ok).

What I Did

Example GetFeatureInfo request (generated from QGIS):

http://nrt-au.dea.ga.gov.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=-485606.44694445608183742,-2702488.34226180706173182,-457091.30575169820804149,-2676283.45556383300572634&CRS=EPSG:3577&WIDTH=864&HEIGHT=794&LAYERS=s2b_nrt_granule_nbar_t&STYLES=&FORMAT=image/png&QUERY_LAYERS=s2b_nrt_granule_nbar_t&INFO_FORMAT=application/json&I=266&J=218&FEATURE_COUNT=10

Error response:

<?xml version='1.0' encoding="UTF-8"?>
<ServiceExceptionReport version="1.3.0"
xmlns="http://www.opengis.net/ogc"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.opengis.net/ogc
http://schemas.opengis.net/wms/1.3.0/exceptions_1_3_0.xsd">
    <ServiceException>Unexpected server error: &#39;easting&#39;</ServiceException>
    <ServiceException>
        <![CDATA[ <FrameSummary file /code/datacube_wms/wms.py, line 47 in wms_impl> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/data.py, line 506 in feature_info> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/data.py, line 387 in feature_info> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/wms_utils.py, line 111 in img_coords_to_geopoint> ]]>
    </ServiceException>
</ServiceExceptionReport>

Consistency of dates between services

A partner agency is attempting to integrate multiple WMS services into a portal, but is struggling with the list of times as reported by GetCapabilities.

datacube-wms gives dates in the format 2001-12-27, but other services (such as GSKY) give dates in the format 1989-01-01T00:00:00.000Z

The WMS spec Annex D specifies that both formats are valid, but it has been requested that we look into what we can do to make integration simpler.
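Rendering the date-only values in the extended format used by GSKY is a one-line formatting change. A sketch of the conversion (in datacube-ows this would presumably be a change where the GetCapabilities time dimension is serialised):

```python
from datetime import date, datetime, timezone

def to_extended_format(d: date) -> str:
    """Render a date as the ISO 8601 datetime form used by GSKY,
    e.g. 1989-01-01T00:00:00.000Z."""
    dt = datetime(d.year, d.month, d.day, tzinfo=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.000Z")

print(to_extended_format(date(2001, 12, 27)))  # 2001-12-27T00:00:00.000Z
```

Since Annex D allows both forms, emitting the longer form would not break conforming clients while matching what GSKY produces.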

WMS GetFeatureInfo requests fail for EPSG 3577 and EPSG 3857

  • datacube-wms version: Testing carried out using WMS endpoint http://wms.datacube.org.au
  • Python version: N/A
  • Operating System: N/A

Description

GetFeatureInfo requests using EPSG 3577 and EPSG 3857 intermittently fail with “502 Bad Gateway”.
Successful requests are most likely with smaller BBOX extents (i.e. zoomed in further).

Note that this issue was originally described in issue #6. It has been moved here because the original submission was identified as being two issues.

What I Did

Testing was carried out using an in-house conformance testing script, QGIS to invoke GetFeatureInfo requests on the WMS, and hand entering GetFeatureInfo requests into a browser.

Example GetFeatureInfo request with EPSG 3857 (larger scale) - successful:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=13552947.02654318138957024,-2475760.35569502366706729,13639464.79104014113545418,-2392995.16893528262153268&CRS=EPSG:3857&WIDTH=830&HEIGHT=794&LAYERS=ls8_level1_usgs__110&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_level1_usgs__110&INFO_FORMAT=application/json&I=401&J=393&FEATURE_COUNT=10

Example GetFeatureInfo request with EPSG 3857 (smaller scale) - fail:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=13522196.7367520947009325,-2527149.8230408076196909,13695232.26574601046741009,-2361619.44952132552862167&CRS=EPSG:3857&WIDTH=830&HEIGHT=794&LAYERS=ls8_level1_usgs__110&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_level1_usgs__110&INFO_FORMAT=application/json&I=216&J=336&FEATURE_COUNT=10

WMS GetFeatureInfo requests fail

Description

GetFeatureInfo requests using EPSG 4326 fail with error “Unexpected server error: 'x'”

GetFeatureInfo requests using EPSG 3577 and EPSG 3857 intermittently fail with “502 Bad Gateway”.
Successful requests are most likely with smaller BBOX extents (i.e. zoomed in further).

What I Did

Testing was carried out using an in-house conformance testing script, QGIS to invoke GetFeatureInfo requests on the WMS, and hand entering GetFeatureInfo requests into a browser.

Example GetFeatureInfo request with EPSG 4326 - fail:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=121.79,-23.50,122.49,-22.83&CRS=EPSG:4326&WIDTH=830&HEIGHT=794&LAYERS=ls8_level1_usgs__110&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_level1_usgs__110&INFO_FORMAT=application/json&I=430&J=389&FEATURE_COUNT=10

Example GetFeatureInfo request with EPSG 3857 (larger scale) - successful:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=13552947.02654318138957024,-2475760.35569502366706729,13639464.79104014113545418,-2392995.16893528262153268&CRS=EPSG:3857&WIDTH=830&HEIGHT=794&LAYERS=ls8_level1_usgs__110&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_level1_usgs__110&INFO_FORMAT=application/json&I=401&J=393&FEATURE_COUNT=10

Example GetFeatureInfo request with EPSG 3857 (smaller scale) - fail:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=13522196.7367520947009325,-2527149.8230408076196909,13695232.26574601046741009,-2361619.44952132552862167&CRS=EPSG:3857&WIDTH=830&HEIGHT=794&LAYERS=ls8_level1_usgs__110&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_level1_usgs__110&INFO_FORMAT=application/json&I=216&J=336&FEATURE_COUNT=10

ODC-WMS Geomedian Layers not returning NDVI or NDBI in GetFeatureInfo

Description

As a user interested in NDVI or NDBI, I would like to see the calculated NDVI or NDBI in addition to the other 6 bands in response to a GetFeatureInfo request (without having to calculate it in my head).

What I Did

Navigate to http://terria-cube.terria.io/
Add the Surface Reflectance Annual Geomedian Landsat 8 layer from the ows.services.devkube.dea.ga.gov.au folder.
Click anywhere where there is data
Observe the GetFeatureInfo panel displaying the 6 Landsat bands (blue, green, red, nir, swir1, swir2) but not NDVI or NDBI.
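The indices themselves are simple band arithmetic over values already present in the GetFeatureInfo response; a sketch of what the server could compute and include (function names are ours):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index from nir and red bands."""
    return (nir - red) / (nir + red)

def ndbi(swir1: float, nir: float) -> float:
    """Normalised Difference Built-up Index from swir1 and nir bands."""
    return (swir1 - nir) / (swir1 + nir)
```

Given the six bands already returned (blue, green, red, nir, swir1, swir2), both values could be appended to the GetFeatureInfo payload with no extra data reads.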

Incorrect CRS axis order

  • datacube-wms version: Testing carried out using WMS endpoint http://wms.datacube.org.au
  • Python version: N/A
  • Operating System: N/A

Description

Service does not adhere to EPSG coordinate axis order specifications, a requirement introduced in WMS 1.3.0 (most geographic CRSs in the EPSG database specify an axis order of y,x, rather than the x,y order used for all CRSs in WMS versions prior to 1.3.0).

Incorrect coordinate axis order in WMS 1.3.0 GetCapabilities document BoundingBox elements with CRS=”EPSG:4326”, e.g:
<BoundingBox CRS="EPSG:4326" minx="129.28" maxx="136.42" miny="-32.81" maxy="-10.51" />
should be
<BoundingBox CRS="EPSG:4326" minx="-32.81" maxx="-10.51" miny="129.28" maxy="136.42" />

BBOX URL parameter in WMS 1.3.0 GetMap requests for EPSG:4326 should be of form:
BBOX=<min_lat>,<min_lon>,<max_lat>,<max_lon>
Although this service only accepts requests with the following (incorrect) form:
BBOX=<min_lon>,<min_lat>,<max_lon>,<max_lat>

What I Did

Issue was identified using an in-house conformance testing script, and by physically inspecting the GetCapabilities statement at http://wms.datacube.org.au/?service=WMS&request=GetCapabilities. The issue was confirmed by attempting to make GetMap requests against the service, both manually in a browser and using QGIS.

Example GetMap request - incorrect BBOX axis order, image returned:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&LAYERS=ls8_level1_usgs__110&STYLES=&WIDTH=1000&HEIGHT=1000&FORMAT=image/png&CRS=EPSG:4326&DPI=120&MAP_RESOLUTION=120&FORMAT_OPTIONS=dpi:120&TRANSPARENT=TRUE&BBOX=121.8,-24.8,122.48,-24.18

Same request with correct BBOX axis order, no image returned:
http://wms.datacube.org.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&LAYERS=ls8_level1_usgs__110&STYLES=&WIDTH=1000&HEIGHT=1000&FORMAT=image/png&CRS=EPSG:4326&DPI=120&MAP_RESOLUTION=120&FORMAT_OPTIONS=dpi:120&TRANSPARENT=TRUE&BBOX=-24.8,121.8,-24.18,122.48

Legend generation with Matplotlib is not thread-safe

Automatic legend generation for the WMS GetLegendGraphic method uses the Matplotlib library. According to Matplotlib's website:

Matplotlib is not thread-safe: in fact, there are known race conditions that affect certain artists. Hence, if you work with threads, it is your responsibility to set up the proper locks to serialize access to Matplotlib artists.

GetLegendGraphic is called relatively rarely in normal TerriaJS/NationalMap usage and with CloudFront caching it would be very difficult to deliberately force an issue in the wild.

Nevertheless, in the short term we should implement a locking framework to serialise access to Matplotlib. In the longer term we should research switching to a thread-safe plotting library.
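The short-term locking approach could look something like this (a hypothetical sketch; names like make_legend are not from the codebase, and the render callable stands in for the actual Matplotlib drawing code):

```python
import threading

# Module-level lock serialising all Matplotlib access (hypothetical
# names; not the actual datacube-ows implementation).
_mpl_lock = threading.Lock()

def make_legend(render):
    """Run a Matplotlib-touching callable while holding the lock.

    `render` stands in for the real legend-drawing code; the lock
    ensures only one thread touches Matplotlib artists at a time.
    """
    with _mpl_lock:
        return render()

# Tiny usage demo with a stand-in renderer.
calls = []
def _fake_render():
    calls.append("legend")
    return b"png-bytes"

legend_bytes = make_legend(_fake_render)
```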

Add support for image-file legends at the style level

Description

Users of WOfS would like to see the custom-built legend graphic for the rainbow colour ramp for the "Filtered Water Summary" style, but still use the generated legend for the "Water Summary (Blue)" style.

The current implementation allows for a legend graphic URL to be set for the whole layer, but not for individual styles.

What I Would Like To Do

Use the config:

                "styles": [
                    {
                        "title": "Filtered Water Summary",
                        ...
                        "color_ramp": [...],
                        "legend": {
                            "url": "http://example.com/some_file.png"
                        }
                    },
                    {
                        "title": "Water Summary (Blue)",
                        ...
                        "color_ramp": [...],
                        "legend": {
                            "units": "%",
                            "radix_point": 0,
                            "scale_by": 100.0,
                            "major_ticks": 0.1
                        }
                    },
                ],

There's no way to request to ignore lineage for the indexing system

  • datacube-wms version: current
  • Python version: 3.6
  • Operating System: all

Description

When loading a dataset with lineage into a system that doesn't contain that lineage, datacube raises an error saying 'no matching product...'.

The fix is apparently to add the --ignore-lineage flag to dc-index-from-tar here: https://github.com/opendatacube/datacube-ows/blob/master/docker/auxiliary/index-k/assets/update_ranges.sh#L68 (thanks @tom-butler)

Breaking changes in upcoming datacube-core release

  • datacube-wms version: latest
  • datacube-core version: develop 21/02/19
  • new_datasource()

Unexpected server error: new_datasource() takes 1 positional argument but 2 were given

as per: https://datacube-core.readthedocs.io/en/latest/_modules/datacube/api/core.html?highlight=new_datasource

new_datasource() now takes only one argument, of type datacube.storage.BandInfo

  • RasterDatasetDataSource

Unexpected server error: 'RasterDatasetDataSource' object has no attribute '_dataset'

After fixing the new_datasource call above, it's become apparent that the returned object has changed; it no longer has _dataset.id as an available attribute.

datasources = sorted(datasources, key=lambda x: x._dataset.id)

These issues will need to be fixed when the WMS is updated to use the next version of core.

ODC-WMS GetFeatureInfo doesn't always return a result

Description

Using National Map or Terria, I want to use the TerriaJS "Feature Information" tool to query a single pixel of data.

What I Did

  1. Open http://terria-cube.terria.io/
  2. Add Water Observations from Space Clear Count layer (from ows.services.devkube.dea.ga.gov.au folder)
  3. Click anywhere on the map that has data
  4. Observe "n/a" for the "count_clear" field

WMS GetMap and GetFeatureInfo requests erroneously require SERVICE parameter

Description

GetMap and GetFeatureInfo requests to the datacube-wms service fail if the SERVICE parameter is not provided. The OGC WMS 1.3.0 specification does not include the SERVICE parameter in the list of parameters to be used for GetMap and GetFeatureInfo requests (see tables 8 and 9 in OGC document 06-042 "OpenGIS® Web Map Server Implementation Specification").

An external DEA client has reported that MapInfo software is unable to access the DEA WMS due to this issue. Other OGC client software, including QGIS, ArcMap and ArcGIS Pro do not adhere as strictly to the OGC spec, and are able to access the WMS.
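One possible fix is to treat SERVICE as optional for the requests where the spec omits it. The sketch below is a hypothetical helper, not the actual datacube-ows dispatch code:

```python
# Hypothetical sketch: default SERVICE=WMS for requests where the
# WMS 1.3.0 spec does not require the parameter. Not the actual
# datacube-ows dispatch code.

# Requests listed in OGC 06-042 without a mandatory SERVICE parameter.
SERVICE_OPTIONAL = {"GetMap", "GetFeatureInfo"}

def effective_service(args):
    """Return the service name, defaulting to WMS where the spec allows."""
    service = args.get("SERVICE")
    if service is None and args.get("REQUEST") in SERVICE_OPTIONAL:
        return "WMS"
    return service
```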

What I Did

GetMap

The following GetMap request, which does not include the SERVICE parameter, returns an error, although it should successfully return a map image:

https://ows.services.dea.ga.gov.au/?VERSION=1.3.0&REQUEST=GetMap&LAYERS=ls8_barest_earth_mosaic&STYLES=&CRS=EPSG:4326&BBOX=-40.898798,111.492400120054,-12.263662,155.066563941708&WIDTH=512&HEIGHT=336&FORMAT=image/png&TRANSPARENT=TRUE&BGCOLOR=0xFFFFFF&EXCEPTIONS=XML

The following non-OGC compliant GetMap request includes the SERVICE parameter and successfully returns a map image:

https://ows.services.dea.ga.gov.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&LAYERS=ls8_barest_earth_mosaic&STYLES=&CRS=EPSG:4326&BBOX=-40.898798,111.492400120054,-12.263662,155.066563941708&WIDTH=512&HEIGHT=336&FORMAT=image/png&TRANSPARENT=TRUE&BGCOLOR=0xFFFFFF&EXCEPTIONS=XML

GetFeatureInfo

The following GetFeatureInfo request, which does not include the SERVICE parameter, returns a SERVICE-parameter-related error, although it should successfully return a GetFeatureInfo response:

https://ows.services.dea.ga.gov.au/?VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=-31.17650422142536115,116.14336705755128776,-29.08774195157069187,119.8674488075862854&CRS=EPSG:4326&WIDTH=1273&HEIGHT=714&LAYERS=ls8_barest_earth_mosaic&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_barest_earth_mosaic&INFO_FORMAT=application/json&I=652&J=331&FEATURE_COUNT=10

The following non-OGC compliant GetFeatureInfo request includes the SERVICE parameter and does not return a SERVICE parameter related error (a different error is returned unrelated to the SERVICE parameter issue):

https://ows.services.dea.ga.gov.au/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&BBOX=-31.17650422142536115,116.14336705755128776,-29.08774195157069187,119.8674488075862854&CRS=EPSG:4326&WIDTH=1273&HEIGHT=714&LAYERS=ls8_barest_earth_mosaic&STYLES=&FORMAT=image/png&QUERY_LAYERS=ls8_barest_earth_mosaic&INFO_FORMAT=application/json&I=652&J=331&FEATURE_COUNT=10

A product with broken temporal ranges shouldn't break the wms

  • datacube-wms version: latest
  • Python version: 3.6
  • Operating System: docker opendatacube/wms:latest

Description

If a single product has missing time data the entire WMS is broken as the getCaps cannot be generated.

<FrameSummary file /code/datacube_wms/ogc.py, line 184 in ogc_svc_impl>
<FrameSummary file /code/datacube_wms/utils.py, line 14 in log_wrapper>
<FrameSummary file /code/datacube_wms/wms.py, line 27 in handle_wms>
<FrameSummary file /code/datacube_wms/utils.py, line 14 in log_wrapper>
<FrameSummary file /code/datacube_wms/wms.py, line 51 in get_capabilities>
<FrameSummary file /code/datacube_wms/wms_layers.py, line 362 in get_layers>
<FrameSummary file /code/datacube_wms/wms_layers.py, line 346 in __init__>
<FrameSummary file /code/datacube_wms/wms_layers.py, line 320 in __init__>
<FrameSummary file /code/datacube_wms/wms_layers.py, line 158 in __init__>
<FrameSummary file /code/datacube_wms/product_ranges.py, line 559 in get_ranges>

Instead the get_ranges function should drop the broken product to reduce the impact of a single issue on the entire web service.

This issue caused us ~15 minutes of outage this morning: https://status.dea.ga.gov.au/incidents/qY7WS5Xbvlz2
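The suggested behaviour can be sketched as a per-product try/except (hypothetical names; the real get_ranges and layer-loading code are more involved):

```python
import logging

log = logging.getLogger(__name__)

def load_layers(products, get_ranges):
    """Return {product: ranges}, dropping any product whose ranges fail.

    Hypothetical sketch: `get_ranges` stands in for the real range
    lookup used while building the GetCapabilities document.
    """
    layers = {}
    for product in products:
        try:
            layers[product] = get_ranges(product)
        except Exception as exc:
            # Drop the broken product rather than failing the whole service.
            log.warning("Skipping product %s: %s", product, exc)
    return layers

# Usage demo with a stand-in range lookup.
def _fake_ranges(product):
    if product == "broken_product":
        raise ValueError("missing time data")
    return {"times": ["2018-01-01"]}

layers = load_layers(["good_product", "broken_product"], _fake_ranges)
```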

GetFeatureInfo sending 500 error when no imagery at location on date specified

Hey datacube-ows team

Description

A GetFeatureInfo request returns a 500 error response when a date is specified in conjunction with a bbox for which there is no matching imagery (e.g. there is no imagery at that location on the specified date).

What I Did

  • A call to GetFeatureInfo which works

  • A call to GetFeatureInfo which doesn't work

  • The only difference between the two requests is the bbox and the x, y params.

The error message generated by the second request is below. The error references the time parameter that is sent as part of the request.

<ServiceExceptionReport version="1.3.0" ...>
    <ServiceException>Unexpected server error: datetime.date(2019, 9, 17)</ServiceException>
    <ServiceException>
        <![CDATA[ <FrameSummary file /code/datacube_wms/ogc.py, line 188 in ogc_svc_impl> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/utils.py, line 14 in log_wrapper> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/wms.py, line 31 in handle_wms> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/utils.py, line 14 in log_wrapper> ]]>
        <![CDATA[ <FrameSummary file /code/datacube_wms/data.py, line 464 in feature_info> ]]>
    </ServiceException>
</ServiceExceptionReport>

The behaviour can be replicated on the map by clicking 'Filter by location' and then clicking anywhere that is not blue (indicating the extent of the available imagery for the selected day).
https://nationalmap.gov.au/#share=s-scPxptiqGPiUC81u8Xoi6D1PkfE

Thanks,
Rowan

Multiproducts cannot query separate products as pq_data

Description

When attempting to perform a GetMap on a multiproduct whose sub-products have different pq products, the request fails with an error similar to Unexpected server error: '<pq band>'

The DataStacker::datasets function has a branch which will cause the mask argument to be ignored when attempting to query the pq datasets:

if self._product.multi_product:
which results in the primary product names being queried instead of the pq product names for those products

WMTS ArcGIS support

  • datacube-wms version: latest
  • Python version: 3.6.7
  • Operating System: Ubuntu (opendatacube/wms:0.8.5-unstable.8.g116163a)

Description

WMTS doesn't support the tile matrix set used by ArcGIS Online, 'WholeWorld_WebMercator'.

What I Did

  1. Loaded https://ows.services.devkube.dea.ga.gov.au/wmts into ArcGIS Online
  2. Selected ls8_barest_earth_mosaic layer

Inspecting the network requests shows that ArcGIS uses the following request:

https://ows.services.devkube.dea.ga.gov.au/wmts?SERVICE=WMTS&VERSION=1.0.0&REQUEST=GetTile&LAYER=ls8_barest_earth_mosaic&STYLE=simple_rgb&FORMAT=image/png&TILEMATRIXSET=WholeWorld_WebMercator&TILEMATRIX=8&TILEROW=144&TILECOL=231

This fails with Service Exception: Invalid Tile Matrix Set: WholeWorld_WebMercator

For comparison a working request from nationalmap is

https://ows.services.devkube.dea.ga.gov.au/wmts?tilematrix=9&layer=ls8_barest_earth_mosaic&style=simple_rgb&tilerow=301&tilecol=450&tilematrixset=urn%3Aogc%3Adef%3Awkss%3AOGC%3A1.0%3AGoogleMapsCompatible&format=image%2Fpng&service=WMTS&version=1.0.0&request=GetTile

It'd be cool if WMTS worked out of the box with ArcGIS Online (for story maps etc.)

WCS DescribeCoverage validation issues

Description

XML Schema validation of the WCS 1.0.0 DescribeCoverage document failed.

What I Did

Validated the WCS 1.0.0 DescribeCoverage document for coverage ls8_nbart_geomedian_annual using the Oxygen XML editor with the Saxon EE XML Schema validation engine. Validation was carried out against the XSD referenced in the schemaLocation attribute of the DescribeCoverage document. The following issues were found:

  1. /CoverageDescription/CoverageOffering child elements in the wrong order.

    Currently:

    <CoverageOffering>
        <name>ls8_nbart_geomedian_annual</name>
        <label>Surface Reflectance 25m Annual Geomedian (Landsat 8)</label>
        <description>Surface Reflectance Geometric …</description>

Should be:

    <CoverageOffering>
        <description>Surface Reflectance Geometric …</description>
        <name>ls8_nbart_geomedian_annual</name>
        <label>Surface Reflectance 25m Annual Geomedian (Landsat 8)</label>

WMS GetMap not supporting various types of output format

  • datacube-wms version: Most recent version
  • Python version: 3.6.7
  • Operating System: Ubuntu 18.04

I've managed to set up datacube-ows on localhost so that a GetCapabilities request for a couple of different products from the Mongolian Data Cube shows up correctly, based on a wms_cfg_local.py file.

Additionally, when I run a GetMap request to try and show an image in the browser, the following works, but only if I set the format to image/png:

http://localhost:5000/?service=WMS
&request=GetMap
&version=1.3.0
&layers=10DayIndices
&styles=ndvi
&crs=EPSG:32648
&bbox=399600,5299260,499560,5399220
&width=4998
&height=4998
&format=image/png
&time=2019-06-20

If I try to set the format to image/geotiff, I get the following result:
<ServiceExceptionReport version="1.3.0" xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wms/1.3.0/exceptions_1_3_0.xsd"><ServiceException code="InvalidFormat" locator="format parameter">image format image/geotiff is not supported</ServiceException></ServiceExceptionReport>

Having looked through the wms_cfg_local.py file again, I see there is a supported formats section (in service_cfg) for WCS (which I currently have disabled) but not for WMS. Is it possible to add GeoTIFF as a supported format for WMS in the config file, or does this need to be enabled elsewhere? Any help with this issue would be much appreciated :)

Use Rasterio session instead of editing _creds directly

  • datacube-wms version: 0.7.1
  • Python version: 3.6
  • Operating System: Ubuntu (Docker Image)

Background

The current version of datacube-wms modifies _creds inside the rasterio environment to insert STS access keys. This is unsupported and will most likely break in a later version of rasterio. https://github.com/opendatacube/datacube-wms/blob/master/datacube_wms/data.py#L61

To minimise the chance of a session expiring, we are refreshing the keys each time we read a file. As rasterio can take some time to create a new environment, it would be more efficient to create one rasterio environment per dask worker, using the dask client.run() function to initialise the worker.

Unfortunately it looks like the current version of rasterio doesn't support refreshing credentials on expiration. So we would need to use Access keys (instead of roles) to create the rasterio session.

Recommendation

  • use client.run to create the rasterio environment
  • inject keys via env variables
  • update client.run to use the rasterio session

PR incoming soon.

GetFeatureInfo failing on different region datasets

  • datacube-wms version:latest
  • Python version:
  • Operating System:

Description

Currently GetFeatureInfo looks up the S3 datasets to fetch the info for data links, and the region is hard-coded to ap-southeast-2 in the regex match.

What I Did

Testing GetFeatureInfo for DEAfrica datasets, where the region is us-west: once the data is loaded and GetFeatureInfo is requested, no errors are returned but the request fails.


QGIS TimeManager support for WMS

The QGIS Time Manager plugin uses a range of dates covering the time period set in "Time frame size".

Description

The QGIS time plugin uses the start/end style of dates for the time range, e.g. ?TIME=2008-01-01/2009-01-01
In the case where there is only one time in that range, we should display it.
In the case where there are multiple times for the layer, we could try to do something more complicated, or just default to the first time?
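The suggested handling can be sketched as follows (a hypothetical parser; resolve_time and the "earliest match" policy are illustrative assumptions, not the plugin's or OWS's actual behaviour):

```python
from datetime import date

def resolve_time(time_param, available):
    """Resolve a TIME value that may be a single date or a start/end range.

    Hypothetical helper: with exactly one available date in the range
    we use it; with several, we default to the earliest (one possible
    policy among others).
    """
    if "/" in time_param:
        start_s, end_s = time_param.split("/")[:2]
        start, end = date.fromisoformat(start_s), date.fromisoformat(end_s)
        matches = sorted(d for d in available if start <= d <= end)
        return matches[0] if matches else None
    return date.fromisoformat(time_param)
```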

What I Did

  • Add a WMS layer
  • Plugins -> TimeManager -> Toggle visibility
  • TimeManager window -> Add raster -> select raster, enter "2008-01-01" and "2018-01-01" as times
  • TimeManager window -> Time frame size -> 1 years
  • Log Message Panel -> WMS -> displays:

2018-11-05T11:06:19 1 Map request failed [error:Error downloading https://geomedian.services.dea.ga.gov.au/?TIME=2008-01-01/2009-01-01&&SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&BBOX=-38.28853216524900915,110.7088478924427761,-8.956027408621759278,157.0991156010999816&CRS=EPSG:4326&WIDTH=658&HEIGHT=416&LAYERS=ls8_nbart_geomedian_annual&STYLES=simple_rgb&FORMAT=image/png&DPI=95&MAP_RESOLUTION=95&FORMAT_OPTIONS=dpi:95&TRANSPARENT=TRUE - server replied: Bad Request url:https://geomedian.services.dea.ga.gov.au/?TIME=2008-01-01/2009-01-01&&SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&BBOX=-38.28853216524900915,110.7088478924427761,-8.956027408621759278,157.0991156010999816&CRS=EPSG:4326&WIDTH=658&HEIGHT=416&LAYERS=ls8_nbart_geomedian_annual&STYLES=simple_rgb&FORMAT=image/png&DPI=95&MAP_RESOLUTION=95&FORMAT_OPTIONS=dpi:95&TRANSPARENT=TRUE]

TerriaJS WCS GeoTIFF into QGIS

Description

Using TerriaJS WCS clip and ship functionality, get a GeoTIFF from ODC WCS and load the GeoTIFF into QGIS successfully.

Acceptance Criteria

Works with QGIS versions 2.14 and 3.2

ODC-WMS WCS returns upside down raster from wcs-clip-and-ship.terria.io

  • datacube-wms version: latest in dev (checked Tuesday evening 25th September)
  • Python version: whatever ows.services.devkube.dea.ga.gov.au is using
  • Operating System: whatever ows.services.devkube.dea.ga.gov.au is using

Description

When I execute a clip and ship via Terria, the Geotiff returned is upside down when I load it in QGIS.

What I Did

Steps to reproduce:
Access http://wcs-clip-and-ship.terria.io/
Add this terria config: https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/dev/terria/dea.json
Add the development WOfS Summary layer (it should be backed by ows.services.devkube.dea.ga.gov.au)
Define a modest area of interest (large requests will fail at the moment)
Open the resulting GeoTIFF in QGIS
This was executed from the VDI using their versions of Firefox and QGIS

Health Check URL

Description

I'd like a /healthz interface that checks that gunicorn can receive new requests and that the database connection is healthy. This would be used by our load balancer to avoid pods that are overwhelmed, and by a black-box monitor that checks whether our web services are OK.

What I Did

We're currently using get_capabilities as a method of validating service health, but it's pretty heavy-handed (averaging 3455.02 ms per response) and uses more CPU than I'd like for a simple health check.

There is a benefit to this approach: it also tests whether there is a misconfiguration between the database and the config. This check could potentially be included, but without the expensive temporal queries.
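A lightweight check along these lines might verify only the database connection (a hypothetical sketch; health_check and the route wiring into gunicorn/Flask are assumptions, demonstrated here with sqlite3 for brevity):

```python
import sqlite3

def health_check(conn):
    """Return (http_status, body) for a /healthz-style endpoint.

    Hypothetical helper: checks only that the database answers a
    trivial query, avoiding the expensive GetCapabilities path.
    """
    try:
        row = conn.execute("SELECT 1").fetchone()
        ok = row is not None and row[0] == 1
    except Exception:
        ok = False
    return (200, "ok") if ok else (503, "database unavailable")

# Usage demo: sqlite3 stands in for the real Postgres connection.
status, body = health_check(sqlite3.connect(":memory:"))
bad_status, _ = health_check(None)  # no connection at all
```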

Documentation suggestions

  • datacube-wms version: Most recent version
  • Python version: 3.7
  • Operating System: Ubuntu 18.04

Description

I've been running through an install of datacube-ows and have noticed a number of changes/omissions in the initial documentation and also have a number of suggested changes which could help make the initial set up and documentation of the process easier.

  • The actual GitHub link to download datacube-ows is not in the front-page set-up steps, and is only in the installation section of the documentation. As someone coming in who isn't massively aware of what's in each of the various ODC repos, I thought perhaps the ows code was somewhere in dea-proto (which, it appears, is no longer a requirement for datacube-ows).
  • Perhaps having the GitHub download link in the set up steps or merging the installation & set up steps could help reduce some of this confusion?
  • I needed to separately install the colour module in order to run update_ranges.py
  • The update_ranges.py script needed some additional parameters in order to get things set up:
./update_ranges.py --role postgres --schema
./update_ranges.py --product *product_name* --no-calculate-extent

This seemed to work for me; I don't know if there is a different way to do this that works.

  • In order to get flask to work, I needed to do: pip install flask-log-request-id

Hope this feedback helps! I can help with documentation updates if that would be helpful if people have any comments regarding the points made above.

Multi-Product products cannot be inserted into the multiproduct_ranges table in database

If a multiproduct product does not already have an entry in wms.multiproduct_ranges, it cannot be added using update_ranges.py.
For example, python3 update_ranges.py --multiproduct s2_nrt_granule_nbar_t --merge-only will result in:

ERROR:datacube_wms.wms_layers:Could not load layer: Could not find ranges for s2a_nrt_granule in database
WARNING:root:Native CRS for product s2b_nrt_granule (None) not in published CRSs
WARNING:root:Native CRS for product s2a_nrt_granule (None) not in published CRSs
Traceback (most recent call last):
  File "update_ranges.py", line 164, in <module>
    main()
  File "/usr/local/lib/python3.6/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.6/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.6/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "update_ranges.py", line 56, in main
    p, u, i = update_range(dc, multiproduct, multi=True, follow_dependencies=not merge_only)
  File "/code/datacube_wms/product_ranges.py", line 382, in update_range
    raise Exception("Requested product not found.")
Exception: Requested product not found.

This seems to be because:

product = get_layers().product_index.get(product)

get_layers() will result in creating a ProductLayerDef for the multiproduct product, which attempts to look up the range of the product:

self.ranges = get_ranges(dc, self)
which will fail unless the product already exists in the table:
results = conn.execute("""
    SELECT *
    FROM wms.multiproduct_ranges
    WHERE wms_product_name=%s""",
    product.name
)
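One way to make the merge tolerant of a missing row is an insert-or-update instead of a bare lookup. This is illustrative only: sqlite3 and a simplified table stand in for the real wms.multiproduct_ranges table in Postgres, and the function names are assumptions.

```python
import sqlite3

# Illustrative only: sqlite3 and a simplified schema stand in for the
# real wms.multiproduct_ranges table in Postgres.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE multiproduct_ranges "
             "(wms_product_name TEXT PRIMARY KEY, ranges TEXT)")

def upsert_ranges(conn, name, ranges):
    """Insert the row if absent, replace it otherwise, instead of
    raising 'Requested product not found.' on a missing entry."""
    conn.execute(
        "INSERT OR REPLACE INTO multiproduct_ranges "
        "(wms_product_name, ranges) VALUES (?, ?)",
        (name, ranges))

upsert_ranges(conn, "s2_nrt_granule_nbar_t", "ranges-v1")  # first run: insert
upsert_ranges(conn, "s2_nrt_granule_nbar_t", "ranges-v2")  # merge: update
```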

ODC-WMS WOfS Summary layers return the wrong fields for GetFeatureInfo

  • datacube-wms version: Whatever is deployed to ows.services.devkube.dea.ga.gov.au (checked morning of 28th of September)
  • Python version: Whatever is deployed to ows.services.devkube.dea.ga.gov.au (checked morning of 28th of September)
  • Operating System: Whatever is deployed to ows.services.devkube.dea.ga.gov.au (checked morning of 28th of September)

Description

Now that we have split the five WOfS summary measurements/bands into their own layers:

  1. Wet Count
  2. Clear Count
  3. Confidence
  4. Unfiltered Summary
  5. Filtered Summary

I would expect the GetFeatureInfo to only return the summary information for the current band.

What I Did

  1. Navigate to http://terria-cube.terria.io/
  2. Add the Water Observations from Space Clear Count layer from the ows.services.devkube.dea.ga.gov.au folder.
  3. Click anywhere there is data.
  4. Observe the GetFeatureInfo panel displaying the additional count_wet and frequency bands in addition to the expected count_clear.

We will need to confirm this behaviour with the product owner.

Large tile request never returns?

Description

When using TerriaJS/Cesium to view one of the DEA Geomedian layers, one of the requests that can happen is for a quarter of the world. This request seems to never return, while others do. Perhaps there is an edge case that causes it to error and die?

This is the URL:

https://geomedian.services.dea.ga.gov.au/?time=2017-01-01&srs=EPSG%3A3857&transparent=true&format=image%2Fpng&exceptions=application%2Fvnd.ogc.se_xml&styles=&tiled=true&feature_count=101&service=WMS&version=1.1.1&request=GetMap&layers=ls8_nbart_geomedian_annual&bbox=0%2C-20037508.342789244%2C20037508.342789244%2C0&width=256&height=256

GetMap and GetCoverage not displaying any data

  • datacube-wms version: master branch
  • Python version: 3.6.5
  • Operating System: Ubuntu 16.04

Description

Trying to host a WMS and WCS server and connecting with QGIS to view data with a GetCoverage request.

WCS issue

With QGIS3

2018-07-24T14:09:01     CRITICAL    Invalid Layer : WCS provider Cannot get test dataset.
             Raster layer Provider is not valid (provider: wcs, URI: cache=PreferNetwork&crs=EPSG:4326&dpiMode=7&format=GeoTIFF&identifier=s2a_level1c_granule&time=2017-12-17&url=http://localhost:5000/
2018-07-24T14:09:03     CRITICAL    Invalid Layer : WCS provider Cannot get test dataset.
             Raster layer Provider is not valid (provider: wcs, URI: cache=PreferNetwork&crs=EPSG:4326&dpiMode=7&format=netCDF&identifier=s2a_level1c_granule&time=2017-12-17&url=http://localhost:5000/

With a HTTP request in browser
http://localhost:5000/?SERVICE=WCS&REQUEST=GetCoverage&CRS=EPSG:4326&TIME=2017-08-26&COVERAGE=s2a_level1c_granule&VERSION=1.0.0&FORMAT=GeoTIFF&BBOX=47.75,15.85,48.65,16.65&WIDTH=1000&HEIGHT=1000

<ServiceExceptionReport xmlns="http://www.opengis.net/ogc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.2.0" xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wcs/1.0.0/OGC-exception.xsd">
<ServiceException>
Unexpected server error: 'ProductLayerDef' object has no attribute 'max_datasets_wcs'
</ServiceException>
<ServiceException>
<![CDATA[
<FrameSummary file /home/ubuntu/Datacube/datacube-wms/datacube_wms/ogc.py, line 44 in ogc_impl>
]]>
<![CDATA[
<FrameSummary file /home/ubuntu/Datacube/datacube-wms/datacube_wms/wcs.py, line 24 in handle_wcs>
]]>
<![CDATA[
<FrameSummary file /home/ubuntu/Datacube/datacube-wms/datacube_wms/wcs.py, line 107 in get_coverage>
]]>
<![CDATA[
<FrameSummary file /home/ubuntu/Datacube/datacube-wms/datacube_wms/wcs_utils.py, line 324 in get_coverage_data>
]]>
</ServiceException>
</ServiceExceptionReport>

WMS issue

The WMS GetMap request works as a URL passed with an HTTP request, but once imported into QGIS the layer is not visible. What is also weird is that when a large tile size is passed (e.g. 1000) it zooms to the correct extent, but when a smaller one is passed (e.g. 10 or 100) it zooms to an extent with inverted coordinates. Not sure if that is on your part. I might try another QGIS version to be sure as well.

QGIS direct access to ODC-WCS

  • datacube-wms version:
  • Python version:
  • Operating System:

Acceptance Criteria

Works with QGIS versions 2.14 and 3.2

WMS parameter values should not be case sensitive

  • datacube-wms version: all

Description

Parameter values in this WMS implementation are case sensitive, and I don't think they should be.

The standard says:

6.4.1. Parameter Ordering and Case
Parameter names shall not be case sensitive, but parameter values shall be case sensitive.

So the implementation is technically correct...

But GeoServer, for example, is NOT case sensitive for most things, including parameter values, where it doesn't have to be.

What I Did

Try to get capabilities at ?service=wms&request=getcapabilities

Note that it only works with ?service=WMS&request=GetCapabilities.
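A lenient implementation could canonicalise known parameter values before dispatch (hypothetical helper; the spec technically allows values to be case sensitive, so this is deliberate GeoServer-style leniency, not spec-mandated behaviour):

```python
# Hypothetical sketch: map enumerated parameter values to their
# canonical spellings, ignoring case. Unknown values pass through
# untouched so later validation can reject them normally.
KNOWN_REQUESTS = {
    "getcapabilities": "GetCapabilities",
    "getmap": "GetMap",
    "getfeatureinfo": "GetFeatureInfo",
}

def canonical_request(value):
    """Return the canonical spelling of a request name, ignoring case."""
    return KNOWN_REQUESTS.get(value.lower(), value)
```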

Incorrect axis order in GetCaps BoundingBox

Description

This is a re-submission of the axis order problem reported in issue #4, whereby the coordinate axis order in WMS 1.3.0 GetCapabilities document BoundingBox elements with CRS="EPSG:4326" is incorrect, e.g:
<BoundingBox CRS="EPSG:4326" minx="129.28" maxx="136.42" miny="-32.81" maxy="-10.51" />
should be
<BoundingBox CRS="EPSG:4326" minx="-32.81" maxx="-10.51" miny="129.28" maxy="136.42" />

What I Did

I inspected the GetCapabilities statement obtained from https://nrt-au.wms.gadevs.ga/?service=WMS&request=GetCapabilities (I understand that this implementation of the WMS has the latest code updates, including changes made when issue #4 was closed).

I believe this is causing the WMS layers to fail to load in QGIS, as it appears to use the BoundingBox elements to determine how to make the GetMap requests.

QGIS not providing VERSION for GetCapabilites requests

  • datacube-wms version: current production?

Description

Trying to load a WMS layer into QGIS 2.14.8-Essen fails when no VERSION is supplied

What I Did

Add WMS/WMTS Layer -> New
URL: https://ows.services.dea.ga.gov.au/

Failed to download capabilities:
Download of capabilities failed: Error downloading https://ows.services.dea.ga.gov.au/?
SERVICE=WMS&REQUEST=GetCapabilities - server replied: Bad Request

Changing the URL to: https://ows.services.dea.ga.gov.au/?VERSION=1.3.0 worked.

WMS 1.3.0 GetCapabilities document fails XML Schema validation

  • datacube-wms version: Testing carried out using WMS endpoint http://wms.datacube.org.au
  • Python version: N/A
  • Operating System: N/A

Description

WMS 1.3.0 GetCapabilities document fails XML Schema validation.

The xsi:schemaLocation attribute in the root element has an incorrect URL for the http://www.opengis.net/wms namespace -

http://schemas.opengis.net/wms/1.3.0/capabilities_1_2_0.xsd

should be

http://schemas.opengis.net/wms/1.3.0/capabilities_1_3_0.xsd

The document passes validation when the xsi:schemaLocation attribute is corrected.

What I Did

Performed XML Schema validation using oXygen XML Editor on the WMS 1.3.0 GetCapabilities document at http://wms.datacube.org.au/?service=WMS&request=GetCapabilities.
