
Comments (7)

dnerini commented on June 9, 2024

Hi @5fcgdaeb

I'm not completely sure that I've properly understood your request, so please correct me if I'm mistaken.

pysteps currently supports the data formats of three national radar composites (Switzerland, Finland and Australia) plus the ODIM format used by the OPERA composite.

The main point is that these formats clearly specify their metadata (for example, the timestamp and geolocation of the grid) while allowing us to read the data in physical units (typically rain intensity, rain depth, or radar reflectivity). So it is not just about the file format itself (GIF, netCDF, or HDF5), but mainly about the data information model: we need to know how to interpret each pixel in the composite image (what, when, and where).
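To make the what/when/where idea concrete, here is a sketch of what a pysteps-style importer is expected to hand back: the field as a 2D array in physical units, plus a metadata dictionary describing it. The key names follow the pysteps importer documentation, but the concrete values (grid size, PROJ string, units) are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: the array holds the data in physical units,
# the metadata dictionary tells pysteps how to interpret each pixel.
precip = np.zeros((640, 710))  # e.g. rain rate in mm/h

metadata = {
    "unit": "mm/h",                  # what: physical units of each pixel
    "accutime": 5.0,                 # when: minutes between frames
    "projection": "+proj=stere +lat_0=90 +lon_0=25 +ellps=WGS84",  # where (placeholder PROJ string)
    "x1": 0.0, "y1": 0.0,            # lower-left corner of the grid (m)
    "x2": 710000.0, "y2": 640000.0,  # upper-right corner of the grid (m)
    "xpixelsize": 1000.0, "ypixelsize": 1000.0,
    "yorigin": "upper",              # row 0 of the array is the top edge
    "zerovalue": 0.0,                # value assigned to dry pixels
    "threshold": 0.1,                # lowest measurable rain rate
}
```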

The images that you have provided in your post do not represent data that we can use directly, but rather just a visualization of it. In other words, we could read, for example, the RGB value of each pixel, but we wouldn't know how to convert it into millimeters of rainfall, nor would we know which time and location it refers to. Moreover, the last two examples you provided also include a basemap and other geographical information that mask and mix with the actual rainfall data.

I hope this addresses your issue.

from pysteps.

5fcgdaeb commented on June 9, 2024

Hi @dnerini ,

You are spot on, that is what I was asking! Thanks for the detailed info.

I now understand that if I want to nowcast using my images, I will need to provide my own importer. In other words, I need to provide the appropriate metadata of my images. The metadata is well defined in your documentation at: https://pysteps.github.io/pysteps/refmanual/io.html#pysteps-io-importers

The follow-up questions would be:

  • Is there anywhere I can find more details about the metadata fields listed in the link above? I lack the meteorological background to fully understand them. Are all the fields mandatory, or are some of them used only for documentation/display purposes?
  • It is very clear to me that the library needs to know the physical units (is it mm/h or dBZ on the image?) and the color mapping (which color corresponds to which intensity level). I am just curious why the library needs the time and coordinate information. Does the model work differently in different regions of the world? Or does the advection/flow depend on the hour of the day?

Thanks!


dnerini commented on June 9, 2024

Exactly, what pysteps really needs is the relevant metadata associated with your own data.
I would suggest using one of the existing importers as a template for your own. In this sense, the most straightforward is probably the mch_gif importer, since most of its metadata is simply hardcoded in the importer.
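In that spirit, here is a minimal sketch of what a custom importer with hardcoded metadata could look like. Everything in it is an assumption for illustration: the file format (a plain .npy array of rain rates in mm/h), the function name, and all metadata values.

```python
import numpy as np

def import_my_radar(filename):
    """Minimal sketch of a custom pysteps-style importer.

    The file format (a .npy array of rain rates in mm/h) and the grid
    metadata below are invented; as in the mch_gif example, the grid
    description is simply hardcoded.
    """
    precip = np.load(filename).astype(float)

    metadata = {
        "projection": "+proj=stere +lat_0=90 +lon_0=25 +ellps=WGS84",  # placeholder
        "x1": 0.0, "y1": 0.0,
        "x2": 710000.0, "y2": 640000.0,
        "xpixelsize": 1000.0, "ypixelsize": 1000.0,
        "yorigin": "upper",
        "unit": "mm/h",
        "transform": None,
        "accutime": 5.0,   # minutes between consecutive frames
        "zerovalue": 0.0,  # dry pixels are stored as 0 in this format
    }
    # Lowest measurable rate actually present in this file.
    wet = precip[precip > 0]
    metadata["threshold"] = float(wet.min()) if wet.size else 0.0

    quality = None  # no quality field available in this format
    return precip, quality, metadata
```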

You make a very good point: strictly speaking, none of the actual prediction methods in pysteps needs metadata in order to run. Coordinates and timestamps are needed mainly for visualization (and some data manipulation). The units are important for the pre- and post-processing steps that normalize the precipitation data before you can start the nowcast. But apart from that, the optical flow and advection routines can basically work with any sequence of images.

So, to come back to your question, the metadata that you'll need to specify depends very much on what you want to do and, consequently, on the methods that you are going to use.


dnerini commented on June 9, 2024

I'm closing this issue, @5fcgdaeb. Please feel free to re-open it in case you have any more questions concerning the import of radar data and metadata.


5fcgdaeb commented on June 9, 2024

Sounds great @dnerini, thanks a lot for your support. Let me give the custom importer a shot and I can re-open accordingly.


g0lemXIV commented on June 9, 2024

Hello, I have the same problem. I want to implement pysteps in a nowcasting system, but I don't fully understand the conventions in the metadata dictionary. That's why I have a few questions:

  1. How to interpret x1, y1, x2, y2 in the metadata?
  2. How to interpret yorigin?
  3. How can I specify the 'projection' in the dictionary?
  4. What are the threshold and zerovalue?

In my raw radar data images I have a header with:
Lat/Long, Azimuth, Elevation
and in observed data:
Rain, Zhh, V and Quality Information

Thanks a lot for help


dnerini commented on June 9, 2024

Hi @g0lemXIV and welcome to pysteps!

You can find a description of the metadata in the pysteps reference documentation.
Please let us know if you find the information in there not clear enough or incomplete so that we can improve it!

To answer your questions:

  1. How to interpret x1, y1, x2, y2 in the metadata?

x1, y1, x2, y2 define the bounding box of your domain in data coordinates (meters):

  • (x1, y1) is the lower left corner
  • (x2, y2) is the upper right corner

So, for example, if you need to compute the x-coordinates of your grid, that would be
xcoord = np.arange(x1, x2, xpixelsize) + xpixelsize/2.
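As a small self-contained sketch of that convention (grid values invented for illustration), here are the cell-center coordinates for both axes:

```python
import numpy as np

# Invented example grid: 4 x 3 pixels of 1000 m, bounding box in meters.
x1, y1 = 0.0, 0.0          # lower-left corner
x2, y2 = 4000.0, 3000.0    # upper-right corner
xpixelsize, ypixelsize = 1000.0, 1000.0

# Cell centers: start at the corner, step by one pixel,
# then shift by half a pixel into the middle of each cell.
xcoord = np.arange(x1, x2, xpixelsize) + xpixelsize / 2
ycoord = np.arange(y1, y2, ypixelsize) + ypixelsize / 2

print(xcoord)  # [ 500. 1500. 2500. 3500.]
print(ycoord)  # [ 500. 1500. 2500.]
```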

  2. How to interpret yorigin?

This is the same as the "origin" parameter in matplotlib.pyplot.imshow: it specifies whether the [0, 0] index of the 2D array is placed in the upper-left or lower-left corner of the axes.
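A numpy-only sketch of what the convention implies (the toy values are mine): if the data are stored with yorigin = "upper", row 0 is the northernmost row, so converting to a "lower" origin is just a vertical flip.

```python
import numpy as np

# Toy 3x2 field; row 0 is the top (northern) edge, i.e. yorigin = "upper".
field_upper = np.array([[1, 2],
                        [3, 4],
                        [5, 6]])

# With yorigin = "lower", index [0, 0] must sit at the lower-left corner,
# which amounts to flipping the array upside down.
field_lower = np.flipud(field_upper)

print(field_lower[0])  # [5 6] -> the former bottom row is now row 0
```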

  3. How can I specify the 'projection' in the dictionary?

"projection" is a string that follows the PROJ (formely PROJ4) conventions for coordinate transformation.

In your case, it looks like you are using radar data in polar coordinates. In order to use the pysteps routines, you first need to process your data into a 2D Cartesian QPE (quantitative precipitation estimate) at the surface. This can be done with open-source libraries for radar data processing and QPE such as wradlib.

  4. What are the threshold and zerovalue?

The threshold is the lowest measurable precipitation rate/depth or dBZ value included in your dataset. Any value below the threshold is considered dry.
You can compute it as threshold = np.nanmin(precip[precip > np.nanmin(precip)]).

The zerovalue is the value assigned to non-rainy pixels. Most naturally, this would be 0 if you work in mm of rainfall, but for logarithmic units (e.g., dBZ) you need to define this value explicitly.
In practice, you can extract it from your data simply as zerovalue = np.nanmin(precip).
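Putting the two together, here is a small sketch with invented dBZ values, where dry pixels are encoded with a sentinel well below any measurable echo:

```python
import numpy as np

# Invented reflectivity field in dBZ; -32.0 marks the dry pixels.
precip = np.array([[-32.0, -32.0, 12.5],
                   [  7.0,  18.0, -32.0]])

zerovalue = np.nanmin(precip)                      # value of non-rainy pixels
threshold = np.nanmin(precip[precip > zerovalue])  # lowest measurable value

print(zerovalue)  # -32.0
print(threshold)  # 7.0
```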

As already mentioned in one of my answers above, to implement a pysteps-compatible data reader, it is probably a good idea to have a look at the existing importers.

I hope this helps. Don't hesitate to ask more questions on the subject if needed. For a more immediate exchange, I've just sent you an invitation to our Slack channel, where you can ask for support if you like.

Have fun with pysteps!

