
taylor13 commented on August 21, 2024

Some questions for clarification:

  1. If you ignore the warning, are you already able to do what you want?
  2. If yes, then I guess you're just asking whether we can suppress the warning (for this particular user-defined attribute), right?
  3. If you add attributes to a regular variable, can CMOR handle it? (It should be able to without raising a warning.)
  4. Is there a need to handle compression for vector coordinate variables, or just 2-d grids?

thanks,
Karl

from cmor.

durack1 commented on August 21, 2024

@wachsylon I'd also add the question: what extra deflation/performance gain does this give you? My guess is it would be very marginal. I also wonder whether software packages reading netCDF can handle this out of the box (if it's all managed by the netCDF library), or whether this is another aspect that needs to be considered?


wachsylon commented on August 21, 2024

Hi,

First of all: for my specific case, I found out that the compression of the 2d longitudes and latitudes can actually already be set at this stage, in the grids.json table, because these grid lons and lats are listed under variable_entry.

If you ignore the warning, are you already able to do what you want?

No. For variables, it is possible to define the compression in the CMOR tables, which are read by CMOR and applied to the output. However, this is not possible for coordinate variables, which are listed under axis_entries instead of variable_entries in the tables.
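For the grid case mentioned above, a variable_entry in grids.json can carry the compression settings. A purely illustrative sketch follows; the compression key names and value formats shown here are assumptions and should be checked against the actual CMOR table documentation:

```json
{
    "variable_entry": {
        "longitude": {
            "standard_name": "longitude",
            "units": "degrees_east",
            "dimensions": "grid_latitude grid_longitude",
            "out_name": "longitude",
            "type": "real",
            "deflate": "1",
            "deflate_level": "4",
            "shuffle": "1"
        }
    }
}
```

For axis_entries, no equivalent keys are honored, which is what this issue is about.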

If you add attributes to a regular variable, can CMOR handle it? (It should be able to without raising a warning.)

Yes, that works.

Is there a need to handle compression for vector coordinate variables or just 2-d grids?

I hope not, but maybe on a small scale. I fear that sometimes data can only be saved in chunks covering a short period (e.g. monthly data for only one year), where the 2d grids can make up a big part of each file. If you have 30 years, this of course ends up in a long list of files which nobody should really want - but maybe sometimes data providers have to save it like that?

Anyway, in my small use case - a dataset, 12 time steps, 2d lons and lats, 1d original grid (e.g. rlon and rlat for a rotated pole projection) - the data size can be reduced from 8 MB to 2 MB, all because of the grid compression.
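As a rough stdlib sketch of why 2d coordinate fields deflate so well (without using netCDF itself): the grid below is hypothetical, with each row repeated, as happens when 2d lon/lat are broadcast from 1d axes. Real rotated-pole coordinates vary smoothly rather than repeating exactly, so the ratio there would be less extreme.

```python
import struct
import zlib

# Hypothetical 2d longitude field on a 200x200 grid, built by repeating
# one 1d row (as when 2d coordinates are broadcast from 1d axes).
nlat, nlon = 200, 200
row = [-180.0 + 360.0 * j / nlon for j in range(nlon)]
raw = struct.pack(f"{nlon}d", *row) * nlat  # float64 bytes, row repeated

packed = zlib.compress(raw, 4)  # deflate level 4, as netCDF might apply
print(f"{len(raw)} -> {len(packed)} bytes "
      f"({len(packed) / len(raw):.1%} of original)")
```

Highly redundant coordinate arrays like this shrink by an order of magnitude or more under deflate, which is why the grid's share of a small file can collapse.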

Btw, new high-res ICON and IFS raw output is saved without the grid at all. Is that something CMIP7 should also think about - providing external grid descriptions?

I also wonder whether software packages reading netcdf can handle this out of the box (if it's all managed by the netcdf library) or whether this is another aspect that needs to be considered?

Good point. I can check for some cases and a python workflow.


taylor13 commented on August 21, 2024

I want to be sure I understand your "small use case". Is the following correct:

  - the variable of interest: 12 2d fields
  - the latitude of each grid cell: 1 2d field
  - the longitude of each grid cell: 1 2d field
  - the latitude bounds (presuming each cell has 4 corners): 4 2d fields
  - the longitude bounds (presuming each cell has 4 corners): 4 2d fields
  - some smaller 1d coordinate axes (which aren't to be compressed)

You say the size reduction is attributable to the grid compression alone, but how can that be, since the 12 2d fields of the variable must occupy more space than the 10 2d fields associated with the grid? The best you could hope for is reducing the size by 8 MB x 10/22 ≈ 3.6 MB.
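The field counting above can be checked quickly (assuming, as in the comment, that all 2d fields are the same size and the 1d axes are negligible): even if the grid's share compressed to nothing, roughly 4.4 MB of variable data would remain.

```python
var_fields = 12               # the variable of interest: 12 monthly 2d fields
grid_fields = 1 + 1 + 4 + 4   # lat, lon, and their 4-corner bounds
total = var_fields + grid_fields   # 22 equal-size 2d fields

file_mb = 8.0
grid_share = file_mb * grid_fields / total   # portion held by the grid
var_share = file_mb * var_fields / total     # portion held by the variable
print(round(grid_share, 1), round(var_share, 1))  # -> 3.6 4.4
```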

I'm obviously missing something obvious (or misunderstanding). Could you explain how you get down to 2 MB by compressing the grids alone?


wachsylon commented on August 21, 2024

I'm obviously missing something

I cannot reproduce my numbers, and I guess I just was not precise.

Your calculation fits perfectly with what I find in the data now, and I do not really understand why. The compression of data depends on its values, doesn't it? So how can there be a formula at all?

What I would understand: since I only compress 10 of 22 fields, you assume that I can only reduce that part of the file size, correct? That would mean I reduce the size from 8 MB to 8 MB minus 8 MB x 10/22, correct?

But the variable of interest is already compressed in both variants, i.e. also in the 8 MB file. And since its fields can likely be compressed better (think of a constant field, or one with a lot of NaNs), I assumed that a high percentage of the 8 MB could be just the grid. So all in all, now I am a bit lost.
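The point about compression depending on the values can be made concrete with a stdlib sketch: deflate has no fixed formula; a constant field shrinks to almost nothing, while incompressible noise does not shrink at all.

```python
import os
import zlib

n = 100_000
constant = b"\x00" * n   # e.g. a fully masked or constant field
noisy = os.urandom(n)    # worst case: pure noise, incompressible

for name, data in [("constant", constant), ("noisy", noisy)]:
    packed = zlib.compress(data, 4)
    print(f"{name}: {len(data)} -> {len(packed)} bytes")
```

So the 10/22 split only bounds which bytes are *eligible* for further compression; how much each part actually shrinks depends entirely on its content.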

Nevertheless, it is a significant reduction in size, and therefore it can be worth it. Also, I think the change in code is really small, as the function for checking compression already exists and only has to be called at another point.


durack1 commented on August 21, 2024

@wachsylon, if I were going to build a test suite for this, I would choose 3 variables: tas (global coverage, no mask), tos (global coverage, ~30% masked by land), and siconc (global coverage, ~30% masked by land, with ~90% of the remaining cells being missing values).
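A stdlib sketch of such a test suite, with synthetic stand-ins for the three variables (the masking fractions are taken from the comment above; for siconc, 30% land plus 90% of the remaining 70% gives roughly 93% fill values overall):

```python
import random
import struct
import zlib

random.seed(0)

def field(n, missing_frac, fill=1.0e20):
    """Synthetic field flattened to n float32 values: a fraction of
    cells holds the fill value, the rest pseudo-random data."""
    vals = [fill if random.random() < missing_frac else random.random()
            for _ in range(n)]
    return struct.pack(f"{n}f", *vals)

n = 50_000
# Rough stand-ins for the suggested cases (fractions are illustrative):
cases = {"tas (no mask)": 0.0,
         "tos (~30% masked)": 0.30,
         "siconc (~93% fill)": 0.93}
for name, frac in cases.items():
    raw = field(n, frac)
    packed = zlib.compress(raw, 4)
    print(f"{name}: deflate ratio {len(packed) / len(raw):.2f}")
```

The expected pattern is that the deflation gain grows with the fraction of repeated fill values, which is exactly what the real tas/tos/siconc comparison would probe.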

If you find that across those examples you get a really good deflation improvement, AND generic software libraries can read these compressed data with no problems, then it would make sense to add this to the CMOR4 planning items.


durack1 commented on August 21, 2024

@wachsylon we have a couple of updates planned for CMOR 3.9.0, including exposing netCDF quantize (a new function in netcdf-c ~4.9.x). I'm not sure compressing coordinates makes sense, but we can revisit as the release is being prepared.
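For context on quantize: the feature zeroes out insignificant mantissa bits before the deflate filter runs, so lossless compression becomes much more effective on float data. A crude stdlib illustration of the idea (truncation rather than the proper rounding, and not the actual netcdf-c algorithm):

```python
import random
import struct
import zlib

def bitround(x, keepbits):
    """Zero all but `keepbits` explicit mantissa bits of a float32.
    Crude illustration of the idea behind netCDF quantization."""
    (i,) = struct.unpack("I", struct.pack("f", x))
    mask = (0xFFFFFFFF << (23 - keepbits)) & 0xFFFFFFFF
    (y,) = struct.unpack("f", struct.pack("I", i & mask))
    return y

random.seed(1)
vals = [random.uniform(250.0, 310.0) for _ in range(20_000)]  # e.g. tas in K
full = struct.pack("20000f", *vals)
quantized = struct.pack("20000f", *(bitround(v, 7) for v in vals))

for name, data in [("full precision", full), ("7 mantissa bits", quantized)]:
    print(f"{name}: deflated to {len(zlib.compress(data, 4))} bytes")
```

The zeroed low bits create long runs of identical bytes that deflate handles very well, at the cost of a bounded relative error (about 2^-7 here).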


cofinoa commented on August 21, 2024

This relates to #601 and #733.

Activating chunking on coordinates can improve the compression ratio, especially for large 2D coordinates.

Just enabling chunking and adding deflate (and/or other filters from the standard netcdf-c library) doesn't affect the netCDF4-classic compliance of the files, and for software reading the produced files, the netCDF library makes this transparent.


durack1 commented on August 21, 2024

Some of the chatter in #725 is starting to suggest that compression for coordinates is going to cause problems, so this request will likely be closed with that justification?

@taylor13 @cofinoa @wachsylon any comments before this is done?

