
Comments (20)

lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi @SpacemanPaul, I got back to opendatacube last week after a long stretch on other projects; I hope to have enough time to work on ODC this month and next. I'm beginning to see something in datacube-ows. For the time being I'm able to view a map created from a simple NetCDF file containing a single variable.

Now I'll try with a netcdf file containing more variables.

(screenshot: "Schermata da 2023-06-08 12-12-04")

Thanks for all your help.

from datacube-ows.

SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

The error refers to lines 170-175 or so: the syntax of the pq_masks section has changed; see https://datacube-ows.readthedocs.io/en/latest/cfg_styling.html#bit-flag-masks-pq-masks. Please note the disclaimer at the top of ows_example_cfg.py:

# The file was originally the only documentation for the configuration file format.
# Detailed and up-to-date formal documentation is now available and this file
# is no longer actively maintained and may contain errors or obsolete elements.
#
# https://datacube-ows.readthedocs.io/en/latest/configuration.html

In any case, neither of the styles you have defined for the co3 layer (style_rgb and style_rgb_cloudmask) will work, as both use the bands red, green and blue, but the co3 layer only has the bands c_o3_01, c_o3_02, c_o3_03 and c_o3_04.


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Thanks Paul, today I'll read the docs and try to define a suitable style.


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi @SpacemanPaul ,

I redefined the styles associated with the co3 layer and left the pq_masks section commented out for the time being.

Restarting the server with flask gives me this:

 flask run --host=0.0.0.0
[2022-11-17 16:58:41,155] [WARNING] Environment variable $AWS_DEFAULT_REGION not set.  (This warning can be ignored if all data is stored locally.)
[2022-11-17 16:58:41,905] [WARNING] get_ranges failed for layer cO3: (psycopg2.errors.UndefinedTable) relation "wms.product_ranges" does not exist
LINE 3:                 FROM wms.product_ranges
                             ^

[SQL: 
                SELECT *
                FROM wms.product_ranges
                WHERE id=%s]
[parameters: (1,)]
(Background on this error at: https://sqlalche.me/e/14/f405)
 * Serving Flask app 'datacube_ows/ogc.py'
 * Debug mode: off
[2022-11-17 16:58:41,911] [INFO] WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5000
 * Running on http://192.168.88.86:5000
[2022-11-17 16:58:41,912] [INFO] Press CTRL+C to quit
[2022-11-17 16:59:00,212] [INFO] 192.168.88.87 - - [17/Nov/2022 16:59:00] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 16:59:58,144] [INFO] 192.168.88.87 - - [17/Nov/2022 16:59:58] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:10,867] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:10] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:45,425] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:45] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:46,113] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:46] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:46,745] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:46] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:00:47,249] [INFO] 192.168.88.87 - - [17/Nov/2022 17:00:47] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -
[2022-11-17 17:01:27,415] [INFO] 192.168.88.87 - - [17/Nov/2022 17:01:27] "GET /?SERVICE=WMS&REQUEST=GetCapabilities HTTP/1.1" 200 -

This is OK for the GetCapabilities requests, but it still displays a pair of warnings. I can ignore the first, as my data are all local, but I do not know why my database is missing the product_ranges relation; anyway, it seems to be only a warning.

I then tried a GetMap request this way:

http://192.168.88.86:5000/wms?request=GetMap&service=WMS&version=1.3.0&crs=EPSG:32632&layers=cO3&bbox=457000,4943000,697000,5175000&width=800&height=600&format=image/png

I was not able to find docs about how this request should be formed and which KVP parameters it accepts, but by trial and error I finally got this XML response:

<ServiceExceptionReport xmlns="http://www.opengis.net/ogc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.3.0" xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wms/1.3.0/exceptions_1_3_0.xsd">
<ServiceException> Unexpected server error: 'NoneType' object is not subscriptable </ServiceException>
<ServiceException>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/ogc.py, line 152 in ogc_svc_impl> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms.py, line 29 in handle_wms> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/data.py, line 404 in get_map> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 301 in __init__> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in get_times> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in <listcomp>> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 205 in parse_time_item> ]]>
<![CDATA[ <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 169 in get_times_for_product> ]]>
</ServiceException>
</ServiceExceptionReport>

saying that I'm getting:

Unexpected server error: 'NoneType' object is not subscriptable

I checked my datacube's product and dataset and reread the OWS config file many times, but I cannot tell whether I'm missing some setup parameter's value.
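As an aside, the GetMap KVP parameters can also be assembled programmatically, which helps avoid typos while experimenting. A small sketch (the host, layer and bbox are simply the values from the request above):

```python
from urllib.parse import urlencode

def getmap_url(base: str, **params) -> str:
    """Build a WMS GetMap request URL from KVP keyword parameters."""
    kvp = {"service": "WMS", "version": "1.3.0", "request": "GetMap", **params}
    return base + "?" + urlencode(kvp)

url = getmap_url(
    "http://192.168.88.86:5000/wms",
    crs="EPSG:32632",
    layers="cO3",
    bbox="457000,4943000,697000,5175000",
    width=800,
    height=600,
    format="image/png",
)
```

Note that urlencode percent-encodes the commas in the bbox and the slash in image/png; those are decoded server-side before the KVP values are interpreted.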

I'm trying to index a NetCDF file with 24 time values, a 61x59 x-y grid and a single variable (c_O3). This is the output of the ncdump command displaying the NetCDF header information:

ncdump -h ../../data/netcdf/test_no_zeta.nc
netcdf test_no_zeta {
dimensions:
	time = 24 ;
	y = 59 ;
	x = 61 ;
variables:
	double time(time) ;
		time:units = "hours since 1-1-1 00:00:0.0" ;
		time:delta_t = "0000-00-00 01:00:00.00 +00:00" ;
	double y(y) ;
		y:units = "metre" ;
		y:standard_name = "projection_y_coordinate" ;
		y:long_name = "Northing" ;
	double x(x) ;
		x:long_name = "Easting" ;
		x:standard_name = "projection_x_coordinate" ;
		x:units = "metre" ;
	float c_O3(time, y, x) ;
		c_O3:coordinates = "x y" ;
		c_O3:grid_mapping = "CRS" ;
		c_O3:units = "ppb" ;
		c_O3:_FillValue = -9.96921e+36f ;
		c_O3:actual_range = 6.292308e-06f, 93.20821f ;
	int CRS ;
		CRS:grid_mapping_name = "transverse_mercator" ;
		CRS:semi_major_axis = 6378137 ;
		CRS:inverse_flattening = 298.257223563 ;
		CRS:longitude_of_prime_meridian = 0. ;
		CRS:latitude_of_projection_origin = 0. ;
		CRS:longitude_of_central_meridian = 9. ;
		CRS:scale_factor_at_central_meridian = 0.9996 ;
		CRS:false_easting = 500000. ;
		CRS:false_northing = 0. ;
		CRS:unit = "metre" ;
		CRS:crs_wkt = "PROJCS[\"WGS 84 / UTM zone 32N\",GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",9],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32632\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]" ;
		CRS:spatial_ref = "PROJCS[\"WGS 84 / UTM zone 32N\",GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",9],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32632\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]" ;
		CRS:GeoTransform = "4000.0, 0.0, 455000.0, 0.0, -4000.0, 5177000.0" ;

// global attributes:
		:Conventions = "COARDS" ;
		:lib_ver = 20000 ;
		:creation_time = "11  4 2015  H 11.09.06.997 (system local time)" ;
		:description = "" ;
		:model = "FARM" ;
		:NCO = "netCDF Operators version 4.7.5 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)" ;
}
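One thing worth flagging in the header above is the non-standard epoch in time:units = "hours since 1-1-1 00:00:0.0". A sketch of decoding such values with Python's datetime, which uses the proleptic Gregorian calendar (an assumption worth stating: many netCDF tools default to a mixed Julian/Gregorian calendar, which shifts dates decoded from a year-1 epoch by about two days, so a proper reader such as the netCDF4/cftime stack is preferable for real use):

```python
from datetime import datetime, timedelta

# Epoch taken from the file's time:units attribute; proleptic Gregorian assumed.
EPOCH = datetime(1, 1, 1)

def decode_hours(hours: float) -> datetime:
    """Decode a value of 'hours since 1-1-1 00:00:0.0' to a datetime."""
    return EPOCH + timedelta(hours=hours)

def encode_hours(dt: datetime) -> float:
    """Inverse: hours elapsed since the year-1 epoch."""
    return (dt - EPOCH).total_seconds() / 3600.0
```

A round trip (encode then decode a 2015 timestamp) returns the same instant, which is a quick sanity check on the stored time values.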

What am I missing?

I will then try a simpler use case with only one time value to see if I will finally view an image about my test data.

Thank you for any answer.


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi all,
the error at line 169 of the wms_utils.py file is on the last line of this Python function:

def get_times_for_product(product):
    ranges = product.ranges
    return ranges['times']

Can this be related to the missing wms.product_ranges relation in the database?
If that is the case, the warning I get about wms.product_ranges cannot be ignored: my database is not OK.
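A minimal reproduction of that failure mode, using a hypothetical Product stand-in (in datacube-ows the real object's ranges attribute is populated from the wms.product_ranges table, so with that table missing it plausibly comes back as None):

```python
class Product:
    """Hypothetical stand-in for the real layer/product object."""
    def __init__(self, ranges=None):
        self.ranges = ranges  # None when no range row exists

def get_times_for_product(product):
    ranges = product.ranges
    return ranges['times']  # raises TypeError when ranges is None

try:
    get_times_for_product(Product())  # no ranges loaded
except TypeError as exc:
    print(exc)  # 'NoneType' object is not subscriptable
```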

So I checked both the core and OWS setup: the database tables, materialized views and so on had not been created. I had to set up my own user to have local access to postgres via UNIX socket. In case I need to use different user/password credentials, where do these credentials have to be stored to be used by the system? After that I re-initialized the datacube, added products and a dataset, and the datacube-ows-update --role myrole --schema command created the needed tables. The datacube database is still without any relation; is this correct?

I have an agdc namespace where I have some relations. Is this correct?

The ows database contains the postgis views and three materialized views. Is this correct?

Starting the flask server for datacube-ows again now gives me other warnings. I can ignore the first, but the others are still range related:

$ flask run --host=0.0.0.0
[2022-11-24 14:29:17,923] [WARNING] Environment variable $AWS_DEFAULT_REGION not set.  (This warning can be ignored if all data is stored locally.)
[2022-11-24 14:29:18,721] [WARNING] get_ranges failed for layer cO3: Null product range
[2022-11-24 14:29:18,723] [WARNING] get_ranges failed for layer cO3new: Null product range
 * Serving Flask app 'datacube_ows/ogc.py'
 * Debug mode: off
[2022-11-24 14:29:18,728] [INFO] WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5000
 * Running on http://192.168.88.86:5000
[2022-11-24 14:29:18,728] [INFO] Press CTRL+C to quit

About the warnings saying "Null product range": where do I have to define product ranges? In the product metadata YAML file? In my ows_cfg.py file? The warnings mention the get_ranges function failing.

The GetMap request returns the same error in the get_times_for_product function at line 169 of wms_utils.py.

Thank you for any answer.


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

It looks like you have not run datacube-ows-update --schema against your database.

Please refer to the documentation here: https://datacube-ows.readthedocs.io/en/latest/database.html


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi, sorry for the delay in answering. You're right; I ran the datacube-ows-update script like this:

datacube-ows-update --schema --role myrole

but this was run inside an initialization script and I missed the error logs.

The error I found was in the postgresql setup in my conda environment: the pg cluster was not able to start even though the postgresql service was active, because the .s.PGSQL.5432 local UNIX socket had not been created. Furthermore, this UNIX socket was set up in /var/run/postgresql, while the datacube-ows scripts seem to search for it in /tmp/. Changing the postgresql.conf and pg_hba.conf files accordingly did the trick: the cluster started, the UNIX socket file was created, and datacube-ows-update then completed with this log:
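For reference, the directory where postgres creates the socket is controlled by the unix_socket_directories setting; illustrative fragments matching the change described (the file locations, role name and auth method are installation-specific assumptions):

```
# postgresql.conf: directory where the server creates .s.PGSQL.<port>
unix_socket_directories = '/tmp'

# pg_hba.conf: allow local (UNIX-socket) connections for the matching OS user
local   all   buck   peer
```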

(odc) buck@odcdev:~/dev/odc/datacube-ows$ datacube-ows-update --schema --role buck
Single flag bands not in a list is deprecated. Please refer to the documentation for the new format (layer ls8_nbart_albers)
Could not parse layer (ls8_nbart_albers): Could not import python object: datacube_ows.ogc_utils.feature_info_url_template
Single flag bands not in a list is deprecated. Please refer to the documentation for the new format (layer ls8_level1_pds)
wcs section contains a 'default_bands' list.  WCS default_bands list is no longer supported. Functionally, the default behaviour is now to return all available bands (as mandated by the WCS2.x spec). 
wcs section contains a 'default_bands' list.  WCS default_bands list is no longer supported. Functionally, the default behaviour is now to return all available bands (as mandated by the WCS2.x spec). 
Single flag bands not in a list is deprecated. Please refer to the documentation for the new format (layer sentinel2_nrt)
wcs section contains a 'default_bands' list.  WCS default_bands list is no longer supported. Functionally, the default behaviour is now to return all available bands (as mandated by the WCS2.x spec). 
Could not parse layer (mangrove_cover): Required product names entry missing in named layer mangrove_cover
Could not load layer Level 1 USGS Landsat-8 Public Data Set: Could not find product ls8_level1_usgs in datacube for layer ls8_level1_pds
Could not load layer WOfS Summary: Could not find product wofs_summary in datacube for layer wofs_summary
Could not load layer Near Real-Time images from Sentinel-2 Satellites: Could not find product s2a_nrt_granule in datacube for layer sentinel2_nrt
Checking schema....
Creating or replacing WMS database schema...
 Creating/replacing wms schema

 Creating/replacing product ranges table

 Creating/replacing sub-product ranges table

 Creating/replacing multi-product ranges table

 Granting usage on schema

Creating or replacing materialised views...
 Installing Postgis extensions on public schema

 Setting default timezone to UTC

 Creating NEW TIME Materialised View (start of hard work)

 Creating NEW SPACE Materialised View (Slowest step!)

 Creating NEW combined SPACE-TIME Materialised View

 Creating NEW Materialised View Index 1/4

 Creating NEW Materialised View Index 2/4

 Creating NEW Materialised View Index 3/4

 Creating NEW Materialised View Index 4/4

 Renaming old spacetime view (OWS down)

 Renaming new view to space_time_view (OWS back up)

Dropping OLD spacetime view (and indexes)

 Dropping OLD time view

 Dropping OLD space view

 Renaming NEW space_view

 Renaming NEW time_view

 Renaming new Materialised View Index 1/4

 Renaming new Materialised View Index 2/4

 Renaming new Materialised View Index 3/4

 Renaming new Materialised View Index 4/4

 Granting read permission to public

Done

So, apart from the warnings about the missing layers at the start and the WCS-related ones (I didn't activate WCS in my config file), it seems to have updated the OWS postgresql views and tables.

Thanks for your help.

Now, querying the WMS service to get a map of my data, I'm facing another issue. The query I make is:

http://192.168.88.86:5000/wms?request=GetMap&crs=EPSG:32632&layers=cO3&bbox=457000,4943000,697000,5175000&width=800&height=600&format=png&service=WMS&version=1.3.0

the WMS answers with this exception:

<ServiceExceptionReport version="1.3.0" xsi:schemaLocation="http://www.opengis.net/ogc http://schemas.opengis.net/wms/1.3.0/exceptions_1_3_0.xsd">
  <ServiceException>
    Unexpected server error: 'NoneType' object is not subscriptable
  </ServiceException>
  <ServiceException>
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/ogc.py, line 152 in ogc_svc_impl> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms.py, line 29 in handle_wms> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/utils.py, line 24 in log_wrapper> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/data.py, line 404 in get_map> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 301 in __init__> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in get_times> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 177 in <listcomp>> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 205 in parse_time_item> 
    <FrameSummary file /home/buck/dev/odc/datacube-ows/datacube_ows/wms_utils.py, line 169 in get_times_for_product>
  </ServiceException>
</ServiceExceptionReport>

which appears to be related to the way the time variable is stored/declared in my NetCDF files, causing, I think, an undefined range.

Thanks again, have a nice day.


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

Also, please check the time_resolution for your layer in config. The documentation is here:

https://datacube-ows.readthedocs.io/en/latest/cfg_layers.html#time-resolution-time-resolution

If unsure, run this SQL and post the results here:

select * from space_time_view stv
where stv.dataset_type_ref = (
  select id 
  from agdc.dataset_type
  where name='cO3');

(Substitute the product name as needed.)


lucapaganotti avatar lucapaganotti commented on June 25, 2024


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

It's hard to say, because I don't know where you added the print statement. What do the contents of the space_time_view for datasets in the product look like (SQL in my comment above)?


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi Paul,

after some trial and error I now have 3 products:

datacube=# select id, name from agdc.dataset_type;
 id |      name       
----+-----------------
  1 | c_o3_no_z
  2 | c_o3_pm
  3 | sentinel_5p_no2
(3 rows)

I changed something in the metadata files for the product and dataset, mainly the names. If necessary I can attach the metadata files and the NetCDF files I'm currently using.

So the query you suggested should be written this way for layer 2:

datacube=# select * from space_time_view stv
where stv.dataset_type_ref = (
  select id 
  from agdc.dataset_type
  where name='c_o3_pm');
                  id                  | dataset_type_ref | spatial_extent | temporal_extent
--------------------------------------+------------------+----------------+-----------------
 6a5b0bf9-df03-4115-b89e-7184c74fe66b |                2 | 0103000020E61000000100000005000000F69537F75B272740D9EA74378A5947400E903EA939F7264084BC32A4824E46403321489466EA2040C975BC14CD5146409FB11658E0DF204007CD9AE9135D4740F69537F75B272740D9EA74378A594740 | ["2015-04-10 17:54:01+02","2015-04-10 17:54:01+02"]
(1 row)

with this spatial extent:

datacube=# select st_astext(spatial_extent) from space_time_view stv
where stv.dataset_type_ref = (
  select id 
  from agdc.dataset_type
  where name='c_o3_pm');
                                st_astext
------------------------------------------------------------------------
 POLYGON((11.576873517547 46.6995305367361,11.482861794364 44.6133618591057,8.4578138673829 44.63907107546,8.43725848462128 46.7271701818336,11.576873517547 46.6995305367361))
(1 row)

and this way for layer 1:

datacube=# select * from space_time_view stv
where stv.dataset_type_ref = (
  select id 
  from agdc.dataset_type
  where name='c_o3_no_z');
                  id                  | dataset_type_ref | spatial_extent | temporal_extent
--------------------------------------+------------------+----------------+-----------------
 94cf3426-6200-44c8-b1b2-f4e37e850791 |                1 | 0103000020E61000000100000005000000F69537F75B272740D9EA74378A5947400E903EA939F7264084BC32A4824E46403321489466EA2040C975BC14CD5146409FB11658E0DF204007CD9AE9135D4740F69537F75B272740D9EA74378A594740 | ["2015-04-10 04:00:00+02","2015-04-11 03:00:00+02"]
(1 row)

with this spatial extent:

datacube=# select st_astext(spatial_extent) from space_time_view stv
where stv.dataset_type_ref = (
  select id 
  from agdc.dataset_type
  where name='c_o3_no_z');
                                st_astext
------------------------------------------------------------------------
 POLYGON((11.576873517547 46.6995305367361,11.482861794364 44.6133618591057,8.4578138673829 44.63907107546,8.43725848462128 46.7271701818336,11.576873517547 46.6995305367361))
(1 row)

Both spatial and temporal extents seem to have valid values.

I put some simple print statements in three files (wms_utils.py, resource_limits.py and ogc_utils.py) and have now removed them. The one producing that output was in the resource_limits.py file:

print('geobox:', geobox)

at line 54, as the first line of the function _standardise_geobox(...):

53     def _standardise_geobox(self, geobox: GeoBox) -> GeoBox:
54         print('geobox:', geobox)
55         if geobox.crs == 'EPSG:3857':
56             return geobox
57         bbox = geobox.extent.to_crs('EPSG:3857').boundingbox
58         return create_geobox(CRS('EPSG:3857'),
59                              bbox.left, bbox.bottom,
60                              bbox.right, bbox.top,
61                              width=geobox.width, height=geobox.height
62                              )

Thank you for your answer.

Have a nice day.


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

OK, so what's in wms.product_ranges then?

select * from wms.product_ranges where id = 2

If that comes back empty, you will need to run datacube-ows-update with no options. (Sorry, I probably should have spotted that sooner.)


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi Paul,

this is psql output for product ranges of product #2:

datacube=# select * from wms.product_ranges where id = 2;
 id |     lat_min      |     lat_max      |     lon_min      |     lon_max     |     dates      | bboxes
----+------------------+------------------+------------------+-----------------+----------------+--------
  2 | 44.6133618591057 | 46.7271701818336 | 8.43725848462128 | 11.576873517547 | ["2015-04-10"] | {"EPSG:3857": {"top": 5897654.449820093, "left": 939231.3181992678, "right": 1288731.6649514658, "bottom": 5560857.219441153}, "EPSG:4326": {"top": 46.7271701818336, "left": 8.43725848462128, "right": 11.576873517547, "bottom": 44.6133618591057}, "EPSG:32632": {"top": 5178071.156306333, "left": 455350.0773172115, "right": 704459.3313960176, "bottom": 4940088.327406288}}
(1 row)

Once the wms schema was created in the right database, I checked this table and got suitable results querying for product ranges.

Thanks for your answer, have a good day.


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

Add &ows_stats=y to the end of your GetMap query. You should get a short JSON document back; post it here.


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Hi @SpacemanPaul,
thank you for your answer and the hint, which was very useful, because it seems that no dataset is returned for my request. As you suggested, adding the ows_stats GET parameter gives me this raw JSON response:

{
  "profile": 
  {
    "query": 0.04235363006591797, 
    "count-datasets": 0.012956857681274414,
    "write": 0.019739866256713867
  }, 
  "info": 
  {
    "n_dates": 1, 
    "zoom_factor": 15.799040853901776,
    "n_datasets": 0, 
    "zoom_level_base": 4.370864517607939,
    "zoom_level_adjusted": 9.756455172359999,
    "datasets": {
      "Query bands {'c_pm', 'c_o3'} from products [Product(name='c_o3_pm', id_=2)]": []
    },
    "write_action": "No datasets: Write Empty"
  }
}

from which I gather that:

  • no dataset was found for my request
  • the datasets entry for the queried bands is also empty
  • finally, the write_action value tells me about an empty write, though I do not know its exact meaning.
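A quick programmatic reading of such a response (the JSON literal below is abridged from the stats above to the fields being discussed):

```python
import json

resp = json.loads("""
{
  "info": {
    "n_dates": 1,
    "n_datasets": 0,
    "write_action": "No datasets: Write Empty"
  }
}
""")

info = resp["info"]
# n_datasets == 0 is the tell-tale: the spatio-temporal query matched no
# indexed dataset, so the server wrote an empty (blank) image -- which is
# what "No datasets: Write Empty" reports.
print(info["n_datasets"], "-", info["write_action"])
```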

I then checked the datacube database to browse for datasets:

datacube=# select id, metadata_type_ref, dataset_type_ref from agdc.dataset;
                  id                  | metadata_type_ref | dataset_type_ref 
--------------------------------------+-------------------+------------------
 94cf3426-6200-44c8-b1b2-f4e37e850791 |                 1 |                1
 6a5b0bf9-df03-4115-b89e-7184c74fe66b |                 1 |                2
 50000000-0000-0000-0000-202205051041 |                 1 |                3
(3 rows)

the metadata column for dataset 6a5b0bf9-df03-4115-b89e-7184c74fe66b contains this json value:

{
  "id": "6a5b0bf9-df03-4115-b89e-7184c74fe66b",
  "crs": "epsg:32632",
  "grids": {
    "default": {
      "lat": {
        "type": "double-range",
        "max_offset": [[232000, 5175000, "end"]],
        "min_offset": [[232000, 4943000, "begin"]],
        "BoundingBox": [457000, 4943000, 697000, 5175000],
        "description": "Latitude range"
      },
      "lon": {
        "type": "double-range", 
        "max_offset": [[240000, 697000,"end"]], 
        "min_offset": [[240000, 457000, "begin"]], 
        "description": "Longitude range"
      },
      "shape": [59, 61],
      "transform": [4000.0, 0.0, 455000.0, 0.0, -4000.0, 5177000.0, 0.0, 0.0, 1.0], 
      "spatial_reference": "PROJCS[\"WGS 84 / UTM zone 32N\",GEOGCS[\"WGS 84\" ,DATUM[\"WGS_1984\",SPHEROID[\"WGS84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",9],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32632\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]"
      }
    },
    "extent": {
      "lat": {
        "end": 46.72717018183362, 
        "begin": 44.61336185910571
      },
      "lon": {
        "end": 11.57687351754701,
       "begin": 8.437258484621278
      }
    },
    "$schema": "https://schemas.opendatacube.org/dataset",
    "lineage": {
      "source_datasets":{}
    },
    "product": {"name": "c_o3_pm"},
    "geometry": {
      "type": "Polygon",
      "coordinates": [[[697000.0, 5175000.0], [697000.0, 4943000.0], [457000.0, 4943000.0], [457000.0, 5175000.0], [697000.0, 5175000.0]]]
    },
    "properties": {
      "datetime": "2015-04-10 15:54:01+00:00",
      "eo:platform": "arpa",
      "eo:instrument": "WAISALA", 
      "odc:file_format": "NetCDF",
      "odc:processing_datetime": "2022-05-12T18:02:03.926659"
    },
    "grid_spatial": {
      "projection": {
        "valid_data": {
          "type": "Polygon", 
          "coordinates": [[[697000.0, 5175000.0], [697000.0, 4943000.0], [457000.0, 4943000.0], [457000.0, 5175000.0], [697000.0, 5175000.0]]]
        },
        "geo_ref_points": {
          "ll": {
            "x": 455000.0,
            "y": 4941000.0
          },
          "lr": {
            "x": 699000.0,
            "y": 4941000.0
          },
         "ul":{
           "x": 455000.0,
           "y": 5177000.0
         },
         "ur": {
           "x": 699000.0,
           "y": 5177000.0
        }
      },
      "spatial_reference": "epsg:32632"
    }
  },
  "measurements": {
    "c_o3": {
      "path": "file:///home/buck/dev/odc/data/netcdf/twovars_time_1.nc",
      "layer": "c_O3"
    },
    "c_pm":{
      "path": "file:///home/buck/dev/odc/data/netcdf/twovars_time_1.nc",
      "layer": "c_PM"
    }
  }
}

It seems that 3 datasets are present in the database. As per the WMS request, the relevant dataset should be the second one (6a5b0bf9-df03-4115-b89e-7184c74fe66b | 1 | 2), so I issued datacube dataset info for dataset 6a5b0bf9-df03-4115-b89e-7184c74fe66b, which gave me this:

(odc) buck@odcdev:~/dev/odc/datacube-ows$ datacube dataset info 6a5b0bf9-df03-4115-b89e-7184c74fe66b
 buck None 5432 datacube-dataset-info agdc-1.8.8 True False
id: 6a5b0bf9-df03-4115-b89e-7184c74fe66b
product: c_o3_pm
status: active
indexed: 2022-12-17 19:55:16.175248+01:00
locations:
- file:///home/buck/dev/odc/data/netcdf/twovars_dataset.odc-metadata.yaml
fields:
    cloud_cover: null
    creation_time: 2022-05-12 18:02:03.926659
    dataset_maturity: null
    format: NetCDF
    instrument: WAISALA
    label: null
    lat: {begin: 44.61336185910571, end: 46.72717018183362}
    lon: {begin: 8.437258484621278, end: 11.57687351754701}
    platform: arpa
    product_family: null
    region_code: null
    time: {begin: '2015-04-10T15:54:01+00:00', end: '2015-04-10T15:54:01+00:00'}
(odc) buck@odcdev:~/dev/odc/datacube-ows$ 

From this I gather that the dataset is active, it points to the correct YAML file (I must then check whether this file is really correct ...), the bounding box seems OK, and the time extent (1 value) seems to match the NetCDF contents. I do not know whether some of the null values in this datacube command output are meaningful.

I guess that my dataset is not well indexed ... either the datacube is not able to find what I think should be the data bands, or my ows_cfg.py does not define the bands correctly.

Thanks again for your answer, have a nice day.


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

OK:
From your metadata:

 "geo_ref_points": {
          "ll": {
            "x": 455000.0,
            "y": 4941000.0
          },
          "lr": {
            "x": 699000.0,
            "y": 4941000.0
          },
         "ul":{
           "x": 455000.0,
           "y": 5177000.0
         },
         "ur": {
           "x": 699000.0,
           "y": 5177000.0
        }
      },
      "spatial_reference": "epsg:32632"

So the coordinate ranges are:

X: 455000.0 - 699000.0,
Y: 4941000.0 - 5177000.0

Your WMS bbox query (from above) was:

bbox=5178071.156306333,%20455350.0773172115,%20704459.3313960176,%204940088.327406288

Removing the spaces (%20) and expanding as minx, miny, maxx, maxy we have the query:

X: 5178071. - 704459.0
Y: 455350 - 4940088

which does look like it should (just) overlap.
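The overlap test can be sketched as a plain axis-aligned box intersection, using the dataset corners quoted above and the GetMap bbox from the earlier request (both taken as (minx, miny, maxx, maxy) in EPSG:32632):

```python
def bboxes_overlap(a, b):
    """True if two axis-aligned (minx, miny, maxx, maxy) boxes intersect."""
    aminx, aminy, amaxx, amaxy = a
    bminx, bminy, bmaxx, bmaxy = b
    return aminx <= bmaxx and bminx <= amaxx and aminy <= bmaxy and bminy <= amaxy

dataset = (455000.0, 4941000.0, 699000.0, 5177000.0)  # from geo_ref_points
query = (457000.0, 4943000.0, 697000.0, 5175000.0)    # GetMap bbox

print(bboxes_overlap(dataset, query))  # True
```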


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

But there's something going wrong with the time stamps.

Your metadata has:

"datetime": "2015-04-10 15:54:01+00:00",

But strangely you say space_time_view has:

["2015-04-10 17:54:01+02","2015-04-10 17:54:01+02"]

As far as I can see it should at least be ["2015-04-10 15:54:01+00:00", "2015-04-10 15:54:01+00:00"], although that shouldn't stop it working.

You still haven't advised what time_resolution configuration you are using for the layer; this may be the problem.


lucapaganotti avatar lucapaganotti commented on June 25, 2024


SpacemanPaul avatar SpacemanPaul commented on June 25, 2024

Great to hear Luca. I'll close this ticket, if it's OK with you. Feel free to open another if you get stuck again.


lucapaganotti avatar lucapaganotti commented on June 25, 2024

Yes, I agree. Now I need to "clean" my environment so as to have a good starting point for further exploring the datacube services.
Thanks again for your support.

