
mptrac's Introduction

Massive-Parallel Trajectory Calculations

Massive-Parallel Trajectory Calculations (MPTRAC) is a Lagrangian particle dispersion model for the analysis of atmospheric transport processes in the free troposphere and stratosphere.


Features

  • MPTRAC calculates air parcel trajectories by solving the kinematic equation of motion using given horizontal wind and vertical velocity fields from global reanalyses or forecasts.
  • Mesoscale diffusion and subgrid-scale wind fluctuations are simulated using the Langevin equation, which adds stochastic perturbations to the trajectories (see the sketch after this list). A new inter-parcel exchange module represents the mixing of air.
  • Additional modules are implemented to simulate convection, sedimentation, exponential decay, gas and aqueous phase chemistry, and wet and dry deposition.
  • Meteorological data pre-processing code provides estimates of the boundary layer, convective available potential energy, geopotential heights, potential vorticity, and tropopause data.
  • Various output methods for particle, grid, ensemble, profile, sample, and station data. Gnuplot and ParaView interfaces for visualization.
  • MPI-OpenMP-OpenACC hybrid parallelization and distinct code optimizations for efficient use from single workstations to HPC and GPU systems.
  • Distributed open source under the terms of the GNU GPL.
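For illustration, here is a minimal C sketch of a single trajectory time step combining the first two features (an explicit Euler step of the kinematic equation of motion plus a Gaussian wind perturbation). All names are hypothetical, and the sketch deliberately simplifies the actual MPTRAC modules:

#include <math.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Hypothetical air parcel state (not the actual MPTRAC data structures). */
typedef struct {
  double lon;   /* longitude [deg] */
  double lat;   /* latitude [deg] */
  double p;     /* pressure [hPa] */
} parcel_t;

/* Standard normal random number via the Box-Muller transform. */
static double randn(void) {
  double u1 = (rand() + 1.0) / ((double) RAND_MAX + 2.0);
  double u2 = (rand() + 1.0) / ((double) RAND_MAX + 2.0);
  return sqrt(-2.0 * log(u1)) * cos(2.0 * M_PI * u2);
}

/* Advance a parcel by dt [s], given horizontal winds u, v [m/s] and a
   vertical velocity omega [hPa/s] interpolated to the parcel position;
   sigma [m/s] is the assumed subgrid-scale wind standard deviation. */
void step_parcel(parcel_t *x, double u, double v, double omega,
                 double sigma, double dt) {
  const double RE = 6371e3;  /* Earth radius [m] */
  double up = u + sigma * randn();  /* perturbed zonal wind */
  double vp = v + sigma * randn();  /* perturbed meridional wind */
  x->lon += dt * up * 180. / (M_PI * RE * cos(x->lat * M_PI / 180.));
  x->lat += dt * vp * 180. / (M_PI * RE);
  x->p += dt * omega;
}

The actual model is more elaborate (e.g., temporally correlated perturbations and a higher-order integration scheme); the sketch only shows the overall structure of a time step.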

Getting started

Prerequisites

This README file describes how to install MPTRAC on a Linux system.

The following software dependencies are required to compile MPTRAC: the GSL, HDF5, netCDF, and zlib libraries.

The following optional software is required to enable additional features of MPTRAC: an MPI library such as OpenMPI (for the MPI parallelization), the NVIDIA HPC SDK (for GPU support), and the thrustsort, zfp, zstd, and KPP libraries.

Some of this software is provided along with the MPTRAC repository; please see the next section.

Installation

Start by downloading the latest or one of the previous MPTRAC releases. Unzip the release file:

unzip mptrac-x.y.zip

Alternatively, you can get the development version of the software from the GitHub repository:

git clone https://github.com/slcs-jsc/mptrac.git

Several libraries shipped along with MPTRAC can be compiled and installed by running a build script:

cd [mptrac_directory]/libs
./build.sh -a

Then change to the source directory and edit the Makefile according to your needs:

cd [mptrac_directory]/src
emacs Makefile

In particular, you may want to check:

  • Edit the LIBDIR and INCDIR paths to point to the directories where the GSL, netCDF, and other libraries are located on your system.

  • By default, the MPTRAC binaries are linked statically, i.e., they can be copied and used on other machines. However, sometimes static compilation causes problems, e.g., in combination with dynamically compiled GSL and netCDF libraries or when using MPI or OpenACC. In this case, disable the STATIC flag and remember to set the LD_LIBRARY_PATH to include the paths to the shared libraries.

  • To make use of the MPI parallelization of MPTRAC, the MPI flag must be enabled. Further steps will require an MPI library such as OpenMPI to be available on your system. To make use of the OpenACC parallelization, the GPU flag must be enabled. The NVIDIA HPC SDK is required to compile the GPU code. MPTRAC's OpenMP parallelization is always enabled.

Next, try compiling the code:

make [-j]

To run the test cases to check the installation, use

make check

This will run a series of tests sequentially. It will stop if any of the tests fail. Please check the log messages.

Run the example

An example is provided to illustrate how to simulate the dispersion of volcanic ash from the eruption of the Puyehue-Cordón Caulle volcano, Chile, in June 2011.

The example can be found in the projects/example/ subdirectory. The projects/ subdirectory can also be used to store the results of your own simulation experiments with MPTRAC.

The example simulation is controlled by a shell script:

cd mptrac/projects/example
./run.sh

See the run.sh script for how to invoke MPTRAC programs such as atm_init and atm_split to initialize the trajectory seeds and trac to compute the trajectories.

The script generates simulation output in the projects/example/data subdirectory. The corresponding reference data can be found in projects/example/data.ref.

A set of plots of the simulation output at different time steps after the eruption, generated by means of the gnuplot plotting tool, can be found in projects/example/plots. The plots should look similar to the output provided in projects/example/plots.ref.

As an example, the plots show the particle positions and grid output on 6 and 8 June 2011.

Further information

These are the main scientific publications that provide information about MPTRAC:

  • Hoffmann, L., Baumeister, P. F., Cai, Z., Clemens, J., Griessbach, S., Günther, G., Heng, Y., Liu, M., Haghighi Mood, K., Stein, O., Thomas, N., Vogel, B., Wu, X., and Zou, L.: Massive-Parallel Trajectory Calculations version 2.2 (MPTRAC-2.2): Lagrangian transport simulations on graphics processing units (GPUs), Geosci. Model Dev., 15, 2731–2762, https://doi.org/10.5194/gmd-15-2731-2022, 2022.

  • Hoffmann, L., Rößler, T., Griessbach, S., Heng, Y., and Stein, O.: Lagrangian transport simulations of volcanic sulfur dioxide emissions: Impact of meteorological data products, J. Geophys. Res. Atmos., 121, 4651–4673, https://doi.org/10.1002/2015JD023749, 2016.

Additional references are collected on the references web page.

More detailed information for users of MPTRAC is provided in the user manual.

Information for developers of MPTRAC can be found in the doxygen manual.

Contributing

We are interested in supporting operational and research applications with MPTRAC.

You can submit bug reports or feature requests on the issue tracker.

Proposed code changes and fixes can be submitted as pull requests.

Please do not hesitate to contact us if you have any questions or need assistance.

License

MPTRAC is being developed at the Jülich Supercomputing Centre, Forschungszentrum Jülich, Germany.

MPTRAC is distributed under the terms of the GNU General Public License v3.0.

Please see the citation file for more information about citing the MPTRAC model in scientific publications.

Contact

Dr. Lars Hoffmann

Jülich Supercomputing Centre, Forschungszentrum Jülich

e-mail: [email protected]


mptrac's Issues

update of meteo binary data files

Is your feature request related to a problem?

  • Some additional variables (ul, vl, wl, o3c, ...) have recently been added to the meteo pre-processing code and meteo data structs of MPTRAC. These data should be added to the meteo binary files as they would otherwise be missing during the simulations.

Describe the solution you'd like

  • Double check whether all variables of the meteo data struct met_t in libtrac.h are considered for input/output in read_met() and write_met() in libtrac.c. Add the missing variables.

Additional context

  • As the binary file format will change, a new version number should be assigned (see the sketch below).
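For illustration, a minimal sketch of the kind of version check this implies on the reading side; names and file layout are hypothetical and do not reflect the actual read_met() code:

#include <stdio.h>

/* Hypothetical format version after adding the new variables. */
#define MET_BIN_VERSION 101

/* Read the version header of a binary meteo file and reject files
   written with a different format version. */
int read_met_bin_header(FILE *in) {
  int version;
  if (fread(&version, sizeof(int), 1, in) != 1) {
    fprintf(stderr, "Cannot read file format version!\n");
    return 0;
  }
  if (version != MET_BIN_VERSION) {
    fprintf(stderr, "Unsupported binary file version: %d\n", version);
    return 0;
  }
  return 1;
}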

visualization of particle data

Is your feature request related to a problem?

  • looking for different methods for visualization of MPTRAC particle data

Describe the solution you'd like

  • identify visualization tools suitable for displaying the particle data in more sophisticated ways (3-D plots, animations, ...)
  • possibly implement new I/O functions in MPTRAC for writing/converting particle data to the file formats best suited for these visualization tools

Describe alternatives you've considered

  • particle data output is currently provided in the form of ASCII tables for use with gnuplot
  • particle data can also be written as netCDF files for use with other visualization tools, python, ...

bug in netCDF grid output?

Hi Mingzhao,

there is a bug in the netCDF grid output, I think.

When you use strcat() to build the netCDF variable names, it appends the suffix in place and thereby overwrites the original data of the ctl struct. So the code crashes here with a netCDF error message:

NC_PUT_DOUBLE(strcat(ctl->qnt_name[iq], "_mean"), help, 0);

You need to create a new string to make this work, e.g.

/* Build the variable name in a separate buffer so that ctl->qnt_name stays untouched. */
char varname[LEN];
sprintf(varname, "%s_mean", ctl->qnt_name[...]);
NC_PUT_DOUBLE(varname, help, 0);

Can you please take a look?

Thanks
Lars

Extend the possible interpolation methods for different grids (mostly ICON grid data)

We already have interpolation functions and data structures for Cartesian grids. Could we extend the interpolation functions to deal with icosahedral grids?

  • Literature research: Can libraries be used for it?
  • What routines can be shared with LaMETTA, YAC, ICON?
  • Implement an icosahedral met grid type.
  • Write a reading function for ico met type.
  • Implement index search functions for the ICON grid.
  • Implement a horizontal interpolation from the ICON grid to air parcel positions (see the sketch after this list)...
  • Implement a horizontal interpolation from the ICON grid to the lat-lon support grid, which is required for parameterizations.
  • Formulate a hybrid coordinate with potential temperature in the stratosphere and a non-hydrostatic coordinate in the troposphere.
  • Generalize the vertical coordinate system in MPTRAC.
  • Implement exact interpolations for winds on pressure levels that consider the tilt of the levels against the eta coordinate.
  • Implement the vertical interpolation function and combine it with the horizontal interpolations.
  • Implement a control flow to select between Cartesian, icosahedral, or any other potential grid geometry.
  • Test the procedures with ICON output data...
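As a starting point for the horizontal interpolation task (see the list above), here is a minimal sketch of barycentric interpolation within a single triangular grid cell. It assumes the cell search has already been done and that the coordinates are given in a local planar projection; handling on the sphere would need an additional projection step:

/* Interpolate a scalar field f, given at the three vertices of a
   triangular grid cell, to a point (x, y) inside the cell using
   barycentric weights. */
double interp_triangle(const double xv[3], const double yv[3],
                       const double fv[3], double x, double y) {
  double det = (yv[1] - yv[2]) * (xv[0] - xv[2])
             + (xv[2] - xv[1]) * (yv[0] - yv[2]);
  double w0 = ((yv[1] - yv[2]) * (x - xv[2])
             + (xv[2] - xv[1]) * (y - yv[2])) / det;
  double w1 = ((yv[2] - yv[0]) * (x - xv[2])
             + (xv[0] - xv[2]) * (y - yv[2])) / det;
  double w2 = 1.0 - w0 - w1;
  return w0 * fv[0] + w1 * fv[1] + w2 * fv[2];
}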

modifications of library build scripts

Suggestions to modify the library build script:

  • Limit the set of libraries built with the default option/selection to those that are mandatory for building the model (i.e., GSL, HDF5, netCDF, and zlib). Optional libraries (thrustsort, zfp, zstd, KPP) should only be compiled if the user selects them. This avoids initial problems installing MPTRAC if any of the library builds fail. (suggested by @laomangio)

  • Consider integrating the libraries as git submodules rather than adding the source packages to the MPTRAC git repository. (suggested by @hydrogencl)

list of case studies for MPTRAC

Using the nc2 build of the libraries, make -j4 fails at the link stage because the HDF5 libraries cannot be found:

atm_dist.c:47:3: optimized: Inlining printf/47 into main/185 (always_inline).
/usr/bin/ld: cannot find -lhdf5_hl
/usr/bin/ld: cannot find -lhdf5
collect2: error: ld returned 1 exit status
make: *** [Makefile:142: atm_select] Error 1
/usr/bin/ld: cannot find -lhdf5_hl
/usr/bin/ld: cannot find -lhdf5
collect2: error: ld returned 1 exit status
make: *** [Makefile:142: atm_init] Error 1
/usr/bin/ld: cannot find -lhdf5_hl
/usr/bin/ld: cannot find -lhdf5
collect2: error: ld returned 1 exit status
make: *** [Makefile:142: atm_dist] Error 1

Testing with different compilers and compiler versions

MPTRAC test cases are not fully reproducible with different compilers and compiler versions.

compiler | version | libraries            | operating system | test system    | result | note
gcc      | 13.2.0  | provided by OS       | Ubuntu 24.04     | GitHub Actions | FAILED | (a)
gcc      | 12.3.0  | shipped with MPTRAC  | Stage 2024       | JWC login      | OK     |
gcc      | 12.3.0  | shipped with MPTRAC  | Stage 2024       | JWC compute    | OK     |
gcc      | 12.3.0  | shipped with MPTRAC  | Stage 2024       | JRC login      | OK     |
gcc      | 12.3.0  | shipped with MPTRAC  | Stage 2024       | JRC compute    | OK     |
gcc      | 11.4.0  | shipped with MPTRAC  | Ubuntu 22.04     | Notebook       | OK     |
gcc      | 11.4.0  | provided by OS       | Ubuntu 22.04     | GitHub Actions | OK     |
nvc      | 24.1.0  | shipped with MPTRAC  | Ubuntu 22.04     | Notebook       | FAILED | (a)
nvc      | 23.7.0  | shipped with MPTRAC  | Stage 2024       | JWB GPU        | OK     | (b)
clang    | 14.0.0  | shipped with MPTRAC  | Ubuntu 22.04     | Notebook       | FAILED | (a)

(a) Small numerical differences in the GSL statistics functions (stddev, skew, kurt), possibly because GSL results are not reproducible when GSL is compiled with different versions of gcc; there are also other small differences.

(b) The GPU test is conducted with make gpu_test rather than make check.

improve documentation for new users

Is your feature request related to a problem?

  • new users need more detailed documentation to get started with MPTRAC

Describe the solution you'd like

  • check/revise the README file
  • revise and extend the mkdocs manual, including descriptions of individual apps/tools
  • add/check links to references and other sources of information
  • revise and extend the doxygen manual

Additional ideas

  • implement `-help' flags in the apps
  • create and add a small logo for MPTRAC to the doxygen and mkdocs manuals

./build.sh nc4

After running ./build.sh nc4 there are some errors, while ./build.sh nc2 completes without problems:

***** 12 FAILURES! *****
Command exited with non-zero status 1
0.34user 0.29system 0:00.62elapsed 103%CPU (0avgtext+0avgdata 4236maxresident)k
0inputs+0outputs (0major+18632minor)pagefaults 0swaps
Makefile:3451: recipe for target 'dt_arith.chkexe_' failed
make[4]: *** [dt_arith.chkexe_] Error 1
make[4]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
Makefile:3436: recipe for target 'build-check-s' failed
make[3]: *** [build-check-s] Error 2
make[3]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
Makefile:3431: recipe for target 'test' failed
make[2]: *** [test] Error 2
make[2]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
Makefile:3207: recipe for target 'check-am' failed
make[1]: *** [check-am] Error 2
make[1]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
Makefile:715: recipe for target 'check-recursive' failed
make: *** [check-recursive] Error 1

Integration of Fortran wrapper

Overview

  • A Fortran wrapper for MPTRAC has been developed in the context of the natESM LAGOON sprint.
  • The code for the wrapper has been developed separately from the MPTRAC git repo, but it should be integrated into the master branch.

Open tasks

  • fix remaining issues with wrapper code:
    • fix memory issue when using the wrapper on a notebook (Ubuntu) rather than JUWELS
    • handling of pointer-to-pointer in get_met() function
    • use of argc/argv command line arguments in the wrapper
  • adapt Makefile for compiling the wrapper:
    • add a flag WRAPPER to switch on/off the compilation of the Fortran wrapper (if needed)
    • allow for selection of the Fortran compiler (gfortran, nvfortran)
  • improve test case:
    • check that the Fortran wrapper code uses the same dimensions as the C code (EX, EY, EP, NP)
    • print selected variables from ctl_t, atm_t, met_t in trac_fortran to a logfile, compare logfile with reference output to make sure C-to-Fortran mapping of variables/arrays is correct
  • documentation:
    • add markdown page to user/developer manual describing the wrapper
    • add doxygen documentation to Fortran wrapper and any additional/modified C code

Python code examples

Is your feature request related to a problem?

  • Provide some simple examples of python scripts helping new users to work with MPTRAC simulation output.

Describe the solution you'd like

  • Examples should show how to read atm- and grid-files in ASCII file format and make map plots of the data.
  • Put Python code examples in a new directory ./projects/python

Additional context

  • Add some context on python usage to the mkdocs documentation?

build and make issue

Hi, I have met some issues when installing MPTRAC on a Linux server.

Describe the bug

  • Sometimes the build script succeeded, but sometimes it failed with the same issue as #4.

  • When it succeeded, I tried to compile and check the installation, but it failed with the following errors (more details can be seen in the attachment). The errors point to the size of varname and longname, but the check still failed after increasing varname[LEN] and longname[LEN] from 5000 to 5100 in libtrac.h.

Environment

I use a Rocky Linux server.

log_check.txt

Unable to read in variables - ERA5

  • I downloaded surface variables corresponding to this selection - https://apps.ecmwf.int/data-catalogues/era5/?class=ea&stream=oper&expver=1&type=an&year=2005&month=jun&levtype=sfc - at a resolution of 0.3x0.3
  • ML level data was also downloaded at the same resolution, and the two files were merged in Python (as they had different GRIB editions, which CDO was apparently not able to handle).
  • Renamed u10 to u10m and modified the GRIB_cfVarName attribute accordingly (as an attempt to address the 'invalid name' issue that followed); similarly for v10m and t2m.
  • Linux output/error as follows:
  Read meteo data: data_in/ea_2005_06_20_00.nc
  Read meteo grid information...
  Time: 172540800.00 (2005-06-20, 00:00 UTC)
  Number of longitudes: 434
  Number of latitudes: 217

Error (libtrac.c, read_met_grid, l3578): NetCDF: Invalid dimension ID or name

Could you please help me debug? Thank you.
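To narrow this down, it may help to list the dimension names actually present in the file and compare them with what read_met_grid() expects. A minimal diagnostic sketch using the standard netCDF C API (for netCDF-4 files with groups, nc_inq_dimids() would be needed instead):

#include <netcdf.h>
#include <stdio.h>

/* Print all dimensions of a netCDF file to check whether the expected
   dimension names (e.g. lon, lat, lev, time) are present. */
int main(void) {
  int ncid, ndims;
  char name[NC_MAX_NAME + 1];
  size_t len;

  if (nc_open("data_in/ea_2005_06_20_00.nc", NC_NOWRITE, &ncid) != NC_NOERR)
    return 1;
  nc_inq_ndims(ncid, &ndims);
  for (int dimid = 0; dimid < ndims; dimid++) {
    nc_inq_dim(ncid, dimid, name, &len);
    printf("dim %d: %s (length %zu)\n", dimid, name, len);
  }
  nc_close(ncid);
  return 0;
}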

Run MPTRAC within the MESSy environment

Use the MPTRAC library to run trajectories as a replacement for the CLaMS trajectory module.

  • Install MESSy/CLaMS and set up a test case with a climatological run.
  • Identify potential "access points" to load functions of the MPTRAC library and to replace code from CLaMS
  • Understand how to compile and use libraries in MESSy
  • Implement converter functions from CLaMS to MPTRAC and vice versa.
  • Implement functions to read the config for MPTRAC, either as namelists or by passing the path of the config.ctl file via the CLaMS namelist.
  • Test using the library in MESSy/CLaMS by using read_ctl
  • ... then by using get_met
  • ... and so on ...
  • When all components can be used, run the entire setup and compare the results with the default run.

assessment of simulated diffusion

Collection of ideas for assessing the diffusion schemes in MPTRAC:

  • check for dependencies of simulated diffusion on the time step of the model data or the meteo data

  • estimates of vertical diffusivities from aircraft measurements (Wagner et al., 2017)

  • look for diffusivity coefficients applied in Eulerian models? literature review?

  • compare diffusion of isentropic and kinematic trajectories

  • look at wind fluctuations from ESA Aeolus measurements: https://earth.esa.int/eogateway/missions/aeolus

Build a C to Fortran Interface for the Library

C to Fortran Interface for the Library...

  • Define structs in the interface
  • Define functions in the interface (see the sketch after this list)
  • Adapt the Makefile to compile everything correctly ...
  • Implement high-level functions in Fortran to initialize, run and finalize MPTRAC
  • Write kind of a trac.f90 as a Fortran time loop to run a test case...
  • Write tests for the interface / establish workflow that makes interface errors transparent...
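A minimal sketch of what the C side of such an interface could look like; the names are hypothetical, and only C-interoperable types are used so that the functions can be bound from Fortran via ISO_C_BINDING:

/* Hypothetical C-side interface sketch. The structs remain opaque to
   Fortran; only plain pointers and basic types cross the boundary. */
typedef struct ctl_t ctl_t;
typedef struct atm_t atm_t;
typedef struct met_t met_t;

/* Read the control parameters from a file. */
void mptrac_read_ctl(const char *filename, ctl_t *ctl);

/* Advance all air parcels from time t to t + dt. */
void mptrac_run_timestep(ctl_t *ctl, met_t *met0, met_t *met1,
                         atm_t *atm, double t, double dt);

On the Fortran side, each of these functions would get a matching interface declaration with bind(c).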

improve library build script

Is your feature request related to a problem?

  • the library build script (libs/build.sh) can be improved by additional functionality

Describe the solution you'd like

  • add a command line argument so that the number of threads for parallel builds can be selected (`make -j')
  • add command line arguments so that the user can select which libraries to build (to avoid time-consuming builds of GSL or netCDF, if the software is already available)
  • add an option to download the tarballs of the libraries from their respective sources from the web rather than including them in the MPTRAC git repository?
  • perhaps implement this as a Makefile rather than a bash script?

Additional context

  • pull request by Yen-Sen: #14

Performance degradation with Stage 2024 on JUWELS Booster?

Describe the bug

MPTRAC nightly builds (https://datapub.fz-juelich.de/slcs/mptrac/nightly_builds/) show that the runtime of the physics timers on JUWELS Booster significantly increased on 3 Nov 2023, when the new stage 2024 was enabled. Switching back to stage 2023 on 10 Nov reproduced the original runtime:

(figure: plot_phys, runtime of the physics timers)

Additionally, the new stage causes an issue in the gpu_test sample.tab output (see the screenshot attached to the issue).

To Reproduce

Rerun the test case at: /p/fastdata/slmet/slmet111/model_data/mptrac/nightly_builds/juwels-booster/run.sh

Expected behavior

Need to further investigate this issue and find the root cause.

Maybe, in addition to the software stage update, other changes to the JUWELS Booster system configuration were also introduced?

Trajectory calculations for the boundary layer

Is your feature request related to a problem?

  • MPTRAC may have limited capabilities for calculating trajectories in the boundary layer because it uses pressure as the vertical coordinate. Meteo data are converted from model levels to pressure levels for use with MPTRAC, which causes interpolation errors. Also, the pressure coordinate is not terrain-following.
  • More recently, Jan added the option to calculate diabatic trajectories, which makes it possible to use meteo data on model levels rather than pressure levels.

Describe the solution you'd like

  • Conduct a comparison of trajectory calculations for the boundary layer (model level versus pressure level meteo data) so that we can get an idea of how large the deviations are.
  • Conduct a comparison with HYSPLIT online trajectories (or other models such as FLEXPART or CLaMS) as a reference.
  • Assess what would be needed to make MPTRAC suitable for boundary layer applications.

definition of pressure level sets

Is your feature request related to a problem?

  • Check the definition of pressure level sets in level_definitions() in libtrac.c.

Describe the solution you'd like

  • Add additional pressure levels between 1013.25 and 1044 hPa to achieve finer grid spacing. This might be critical for the proper calculation of meteo variables that depend on near-surface data (e.g., CAPE or PBL).
  • Compare the selected levels with the previous definition used for the CDO ml2pl operator to convert ECMWF data from model levels to pressure levels.


../build.sh nc4 error always

After changing the Ubuntu version to Ubuntu 20.04.4 LTS (under Windows 10), ./build.sh nc4 still fails with some problems:
***** 12 FAILURES! *****
Command exited with non-zero status 1
0.37user 0.20system 0:00.61elapsed 94%CPU (0avgtext+0avgdata 4200maxresident)k
0inputs+0outputs (0major+18348minor)pagefaults 0swaps
make[4]: *** [Makefile:3451: dt_arith.chkexe_] Error 1
make[4]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
make[3]: *** [Makefile:3437: build-check-s] Error 2
make[3]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
make[2]: *** [Makefile:3431: test] Error 2
make[2]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
make[1]: *** [Makefile:3208: check-am] Error 2
make[1]: Leaving directory '/usr/local/meteo/mptrac/libs/build/src/hdf5-1.12.1/test'
make: *** [Makefile:715: check-recursive] Error 1

implementation of asynchronous file-I/O

Is your feature request related to a problem?

  • a significant share of the simulation runtime is spent on reading meteo data (in particular for ERA5 data)

Describe the solution you'd like

  • implement asynchronous file-I/O so that the meteo data for the next time interval are read ahead of time while the trajectory calculations for the current time interval are conducted in parallel (see the sketch at the end of this issue)

Describe alternatives you've considered

  • compression of meteo data to reduce waiting time for file-I/O

Additional context

  • Example of runtime measurements of a recent GPU run on JUWELS Booster with 100 million particles and using ERA5 data (packed binary):
TIMER_READ_MET_BIN = 35.306 s    (min= 1.09297 s, mean= 1.41225 s, max= 2.2265 s, n= 25)
TIMER_GROUP_INIT = 0.553 s
TIMER_GROUP_MEMORY = 19.416 s
TIMER_GROUP_INPUT = 36.961 s
TIMER_GROUP_PHYSICS = 89.866 s
TIMER_GROUP_OUTPUT = 32.075 s
TIMER_TOTAL = 178.871 s
  • Reading the ERA5 data requires about 20% of the total runtime.
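A minimal sketch of the prefetching pattern described above, using POSIX threads; the helper names (read_met, run_physics, timestep_async) and signatures are hypothetical, and the actual implementation might instead use OpenMP tasks:

#include <pthread.h>

/* Hypothetical types and helpers (not the actual MPTRAC code). */
typedef struct met_t met_t;
void read_met(const char *filename, met_t *met);  /* blocking reader */
void run_physics(met_t *met);     /* physics for one time interval */

typedef struct {
  const char *filename;   /* meteo file for the next time interval */
  met_t *met;             /* buffer to fill */
} read_args_t;

static void *read_met_thread(void *arg) {
  read_args_t *a = (read_args_t *) arg;
  read_met(a->filename, a->met);
  return NULL;
}

/* One iteration of the time loop with asynchronous meteo input:
   the next interval is read while physics runs on the current one. */
void timestep_async(const char *next_file, met_t *met_curr, met_t *met_next) {
  pthread_t reader;
  read_args_t args = { next_file, met_next };
  pthread_create(&reader, NULL, read_met_thread, &args);
  run_physics(met_curr);        /* overlaps with the read */
  pthread_join(reader, NULL);   /* meteo data ready for the next step */
}

With the timer numbers above, overlapping TIMER_GROUP_INPUT with TIMER_GROUP_PHYSICS could hide most of the roughly 20% of runtime currently spent on input.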

Restructure MPTRAC to serve as a library.

MPTRAC routines could be made available to other models (e.g., CLaMS) more easily if MPTRAC were properly organized into a library containing those routines (libtrac.h/libtrac.c) and a time loop (trac.c).

  • Move routine declarations, macros, etc. from trac.c to libtrac.h.
  • Eventually, rename and reorder the routines.
  • Create high-level functions for the initialization, running, and finalization of the library tools (see the sketch after this list).
  • Change the compile process if needed ...
  • Test library in a set-up separate from MPTRAC...
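A minimal sketch of what the restructured trac.c could look like once the routines live in the library. The high-level functions (mptrac_init, mptrac_get_met, mptrac_run_timestep, mptrac_write_output, mptrac_finalize) are hypothetical names for the proposed interface, and the ctl_t member names are illustrative:

#include "libtrac.h"

int main(int argc, char *argv[]) {
  ctl_t ctl;
  atm_t *atm;
  met_t *met0, *met1;

  /* Read control parameters and initial air parcel data. */
  mptrac_init(argc, argv, &ctl, &atm, &met0, &met1);

  /* Time loop: fetch meteo data, advance parcels, write output. */
  for (double t = ctl.t_start; t <= ctl.t_stop; t += ctl.dt_mod) {
    mptrac_get_met(&ctl, t, &met0, &met1);
    mptrac_run_timestep(&ctl, met0, met1, atm, t);
    mptrac_write_output(&ctl, atm, t);
  }

  /* Free memory and close files. */
  mptrac_finalize(&ctl, atm, met0, met1);

  return 0;
}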

Adapt mixing module for usage on GPUs

Is your feature request related to a problem?

  • The new mixing module in trac.c is limited to operating on CPUs. For GPU runs, the particle data must be transferred from GPU to CPU memory, mixing is calculated on the CPUs, and the data are transferred back to the GPUs.

Describe the solution you'd like

  • Mixing should be calculated directly on the GPUs, avoiding the data transfers between CPU and GPU memory (see the sketch below).

Additional context

  • The code might be rewritten for GPUs, following the example of module_sort().
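For illustration, a minimal sketch of a box-mean mixing step expressed as OpenACC loops so that it can run on the GPU. This is hypothetical code following the general pattern rather than the actual module; in a fully GPU-resident setup the copy clauses would be replaced by present clauses:

#include <stdlib.h>

/* Relax a tracer quantity q of np particles toward the mean of their
   grid box. ixs[ip] is the precomputed box index of particle ip,
   nbox the number of boxes, and alpha the mixing parameter. */
void mix_boxes(int np, int nbox, const int *ixs, double *q, double alpha) {
  double *sum = calloc((size_t) nbox, sizeof(double));
  int *cnt = calloc((size_t) nbox, sizeof(int));

  /* Accumulate box sums and counts (atomics avoid write conflicts). */
#pragma acc parallel loop copy(sum[0:nbox], cnt[0:nbox]) copyin(ixs[0:np], q[0:np])
  for (int ip = 0; ip < np; ip++) {
#pragma acc atomic update
    sum[ixs[ip]] += q[ip];
#pragma acc atomic update
    cnt[ixs[ip]]++;
  }

  /* Relax each particle toward its box mean. */
#pragma acc parallel loop copyin(sum[0:nbox], cnt[0:nbox], ixs[0:np]) copy(q[0:np])
  for (int ip = 0; ip < np; ip++) {
    double mean = sum[ixs[ip]] / cnt[ixs[ip]];
    q[ip] += alpha * (mean - q[ip]);
  }

  free(sum);
  free(cnt);
}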

EasyBuild installation of MPTRAC

Is your feature request related to a problem?

  • Users might have difficulties compiling MPTRAC before they can run it on Juelich HPC systems.

Describe the solution you'd like

  • Provide an EasyBuild installation of MPTRAC on JURECA and JUWELS so that the model can be used by loading it as a module.
  • Check back with Sabine about EasyBuild configuration.
  • Installation in $USERSOFTWARE of the meteocloud data project.

Describe alternatives you've considered

  • Prepare binaries with static compilation that can be copied to/from different machines.

Additional context

  • Think about proper solutions for the automatic deployment of code?

bug in ERA5 polar trajectories?

  • The CLaMS group identified a bug in polar trajectories using ERA5 data. Air parcels sometimes stick to the pole (see the compare_traj plot attached to the issue).

  • The issue seems to be related to a new interpolation scheme of the meteo data on MARS implemented in 2019.

  • More information can be found in the meteocloud issue tracker: https://jugit.fz-juelich.de/slcs/meteocloud/-/issues/66

  • Identify and further investigate a test case for MPTRAC

  • Add a fix to the meteo data during preprocessing?
