
mom6-examples's Introduction


MOM6-examples

This repository provides the configurations (input parameters and data) and corresponding regression data (for testing) of models that involve MOM6 and SIS2. The repository also contains tools for analysis and preprocessing.

Where to find information

To find information, start at the MOM6-examples wiki.

For general inquiries about using MOM6 and affiliated models, use the CESM MOM6 forums.

Requests for help and other issues associated with the tools or configurations should be registered at MOM6-examples issues.

Issues specific to the MOM6 source code should be registered at MOM6 issues.

Issues specific to the SIS2 source code should be registered at SIS2 issues.

What files are what

The top-level directory structure groups source code and input files as follows:

File/directory        Purpose
LICENSE.md            a copy of the GNU Lesser General Public License, version 3
README.md             this file, with basic pointers to more information
src/                  source code for MOM6, SIS2 and FMS-shared code
tools/                tools for working with MOM6 (not source code and not necessarily supported)
ocean_only/           experiments that use only MOM6
ice_ocean_SIS/        experiments that use MOM6 and SIS code in coupled mode
ice_ocean_SIS2/       experiments that use MOM6 and SIS2 code in coupled mode
coupled_AM2_LM2_SIS/  experiments that use MOM6, SIS, LM2 and AM2 code, i.e. fully coupled
coupled_AM2_LM3_SIS/  experiments that use MOM6, SIS, LM3 and AM2 code, i.e. fully coupled
coupled_AM2_LM3_SIS2/ experiments that use MOM6, SIS2, LM3 and AM2 code, i.e. fully coupled

Directory  Purpose
src/MOM6/  a git submodule containing the source code for MOM6
src/SIS2/  a git submodule containing the source code for SIS2
src/FMS/   a git submodule containing the source code for FMS

Policies

The repository policies (repository access, branches, procedures, ...) are the same as the MOM6 source code policies.

Disclaimer

The United States Department of Commerce (DOC) GitHub project code is provided on an ‘as is’ basis and the user assumes responsibility for its use. DOC has relinquished control of the information and no longer has responsibility to protect the integrity, confidentiality, or availability of the information. Any claims against the Department of Commerce stemming from the use of its GitHub project will be governed by all applicable Federal law. Any reference to specific commercial products, processes, or services by service mark, trademark, manufacturer, or otherwise, does not constitute or imply their endorsement, recommendation or favoring by the Department of Commerce. The Department of Commerce seal and logo, or the seal and logo of a DOC bureau, shall not be used in any manner to imply endorsement of any commercial product or activity by DOC or the United States Government.

This project code is made available through GitHub but is managed by NOAA-GFDL at https://gitlab.gfdl.noaa.gov.

mom6-examples's People

Contributors

adcroft, aekiss, angus-g, breichl, elizabethyankovsky, gustavo-marques, hallberg-noaa, jkrasting, kshedstrom, marshallward, mjharrison-gfdl, mom6bot, nichannah, nikizadehgfdl, raphaeldussin, stephengriffies, suyashbire1, tahvildarzadeh


mom6-examples's Issues

BOUND_SALINITY = True in OM4_025

We presently have BOUND_SALINITY = True in OM4_025, and thus in CM4. However, SIS2 can handle salinity = 0. We should therefore set BOUND_SALINITY = False in order to accept arbitrarily fresh waters, which may occur particularly when using fine resolution simulations.

salt_flux diagnostic identically zero in CM4

The diagnostic field "salt_flux" is identically zero in the CM4 simulations. This field should be nonzero, as it represents the transfer of salt between the liquid ocean and solid sea ice. There is either a problem with the diagnostic or a problem with the prognostic model. This issue should be high priority to debug, as the bug could cause major problems in sea ice regions if there is no salt exchanged between liquid and solid.

New version of FMS submodule

The version of the submodule src/FMS pointed to in MOM6-examples has been updated (in commit 0c9141c).

  • This does not affect production/XML users.
  • Build instructions (for developers) are unchanged.
  • To update, run cd MOM6-examples; git submodule update src/FMS.
  • You will need to do a complete recompile, i.e. wipe all your object files/libraries.

The changes in FMS include various fixes required for some new model configurations under development, which will be added to MOM6-examples shortly. You are not required to update src/FMS until we start using those fixes, but you will have to update eventually.

Make src/coupler into a submodule

The current HEAD of http://gitlab.gfdl.noaa.gov/fms/coupler.git is not compatible with the version of FMS used by NOAA-GFDL-MOM6-examples. This leads to a compile error:

"""
call diag_manager_init(DIAG_MODEL_SUBSET=diag_model_subset, TIME_INIT=date) ! initialize diag_manager for processor sub
1
Error: Keyword argument 'time_init' at (1) is not in the procedure
"""

Perhaps src/coupler should be made into a submodule?

When this is fixed/closed, update the coupler section of https://github.com/CommerceGov/NOAA-GFDL-MOM6-examples/wiki/Obtaining-other-components-for-coupled-models.
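The conversion itself would be the usual git submodule workflow. The following is a sketch only: it runs entirely in a throwaway sandbox with a local stand-in repository instead of the real http://gitlab.gfdl.noaa.gov/fms/coupler.git, and the demo identities and paths are invented.

```shell
tmp=$(mktemp -d)
# Stand-in for the upstream coupler repository (real URL would be used instead)
git init -q "$tmp/coupler-upstream"
git -C "$tmp/coupler-upstream" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "coupler history"
# Consumer repository playing the role of MOM6-examples
git init -q "$tmp/examples"
cd "$tmp/examples"
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "init"
# In the real repo, first remove the tracked src/coupler contents:
#   git rm -r src/coupler
git -c protocol.file.allow=always submodule add -q "$tmp/coupler-upstream" src/coupler
git -c user.name=demo -c user.email=demo@example.com commit -q -m "make src/coupler a submodule"
git submodule status src/coupler
```

After this, `git submodule update` keeps src/coupler pinned to the recorded commit, which is exactly the property src/MOM6, src/SIS2 and src/FMS already have.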

Install data and directories for OM4_033 and OM4_05 configurations

We've been bootstrapping the 1/3- and 1/2-degree models off the 1/4-degree configuration, which has meant that:

  1. the parameters have been somewhat inseparable;
  2. automated analysis (frepp) can't work.

To decouple these new configurations from OM4_025 we need to:

  1. install appropriate input and analysis data in /archive and /pdata;
  2. add specific directories in ice_ocean_SIS2 for the OM4_033 and OM4_05 configurations;
  3. install appropriate mask_tables;
  4. generate ocean.stats for these new configurations;
  5. adjust the parameters appropriately for these resolutions.

global_ALE tests have no vertical mixing scheme enabled

It appears that the global_ALE test case does not use a vertical mixing scheme. The KPP scheme is run with PASSIVE=true and the nonlocal term is not used, and I do not see other boundary layer schemes in use. I list this as an issue because I believe we originally thought these tests used KPP, when in fact that scheme is not impacting the prognostic fields at all.

ocean analysis scripts do not work for 1/2 degree models

The ocean analysis scripts were made for 1/4-degree ocean models and are not working for 1/2-degree models.
E.g., see

/home/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/gfdl.ncrc3-intel15-prod-openmp/scripts/analysis/ocn_monthly.frepp.0001-0005.printout


/nbhome/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/mom6/tools/analysis/MLD_003.py -g /archive/gold/datasets/OM4_025/mosaic.v20140610.unpacked -o /nbhome/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/ocean_0001-0005/Hosoda_MLD -l 0001-0005 /archive/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/gfdl.ncrc3-intel15-prod-openmp/pp/ocean_monthly/ts/monthly/5yr//ocean_monthly.000101-000512.MLD_003.nc
/nbhome/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/mom6/tools/analysis/m6plot.py:561: VisibleDeprecationWarning: boolean index did not match indexed array along dimension 0; dimension is 1080 but corresponding boolean dimension is 576
  if not numpy.ma.getmask(s).any()==numpy.ma.nomask: weight[s.mask] = 0.
Traceback (most recent call last):
  File "/nbhome/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/mom6/tools/analysis/MLD_003.py", line 56, in <module>
    save=cmdLineArgs.outdir+'/MLD_003_minimum.png')
  File "/nbhome/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/mom6/tools/analysis/m6plot.py", line 67, in xyplot
    sMin, sMax, sMean, sStd, sRMS = myStats(maskedField, area, debug=debug)
  File "/nbhome/Amy.Langenhorst/ulm_201510_awg_v20160212_mom6_2016.03.22/CM4_c96L32_am4g9_fullaero_2000_t1_OMp5_lmix_H5_nmle_ndiff_meke/mom6/tools/analysis/m6plot.py", line 566, in myStats
    mean = numpy.ma.sum(weight*s)/sumArea
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/numpy-1.11.0b3-py2.7-linux-x86_64.egg/numpy/ma/core.py", line 3945, in __rmul__
    return multiply(other, self)
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/numpy-1.11.0b3-py2.7-linux-x86_64.egg/numpy/ma/core.py", line 966, in __call__
    result = self.f(da, db, *args, **kwargs)
ValueError: operands could not be broadcast together with shapes (1080,1440) (576,720)

Reorganizing single-column experiments

This is a heads up that we are about to move some experiments around within MOM6-examples.
Any testing scripts you use will have to be adapted once we've committed the reorganization.

  • single_column and single_column_z are being merged such that
    • ocean_only/single_column/ -> ocean_only/single_column/BML/
    • ocean_only/single_column_z/ -> ocean_only/single_column/KPP/
  • SCM_KPP_tests is being renamed as are the sub-directories
    • ocean_only/SCM_KPP_tests/ -> ocean_only/CVmix_SCM_tests/
    • ocean_only/SCM_KPP_tests/WSwPSBF.A/ -> ocean_only/CVmix_SCM_tests/wind_only/KPP/
    • ocean_only/SCM_KPP_tests/WSwPSBF.B/ -> ocean_only/CVmix_SCM_tests/skin_warming_wind/KPP/
    • ocean_only/SCM_KPP_tests/WSwPSBF.C/ is being deleted
    • We will be adding a host of new tests under ocean_only/CVmix_SCM_tests/

After the reorganization we will, in later commits, be:

  • adding a BML/ test under each of the CVmix_SCM_tests;
  • extending CVmix_SCM_tests to include mech_only, skin_warming_mech, sw_warming_only, cooling_only, evap_only, precip_only, precip_wind, and probably more;
  • adding a third boundary layer scheme (IEBL) under all the above locations.

Heads up: update all other components to "ulm"

This is notice that we need everyone to switch to using "ulm" code for the non-MOM6/SIS2 components of coupled and ice-ocean models. Until this point, everything has worked with patched "tikal" code but shortly we will be making some changes that depend on "ulm". MOM6-examples was switched to "ulm" in commit 0c08185.

@nikizadehgfdl already uses "ulm" in the XMLs, so XML users do not need to do anything. This only affects those who set up their working directories by hand.

The wiki page for Obtaining other components has been updated accordingly.

If you followed the setup described in the wiki then you need to either cvs update -r ulm in the CVS directories or git pull in the git repos. A quick and easy way to update is to completely remove the source directories and re-download them with the "ulm" tag. However, it is best not to delete directories you edit and commit code from, so do not cut and paste the following if you have altered code in these directories.

If you are comfortable deleting everything then, from the directory above MOM6-examples, do

mkdir _delete_
mv MOM6-examples/src/{AM2,atmos_null,coupler,ice_param,land_null,LM3,SIS} _delete_

followed by

(cd MOM6-examples/src; git clone [email protected]:fms/coupler.git)
(cd MOM6-examples/src; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P ice_param)
(cd MOM6-examples/src; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P atmos_null)
(cd MOM6-examples/src/atmos_null; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P atmos_param/diag_integral atmos_param/monin_obukhov)
(cd MOM6-examples/src; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P land_null)
(cd MOM6-examples/src; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P -d SIS ice_sis)
mkdir -p MOM6-examples/src/LM3
(cd MOM6-examples/src/LM3; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P land_lad2 land_param)
find MOM6-examples/src/LM3/land_lad2 -type f -name \*.F90 -exec cpp -Duse_libMPI -Duse_netCDF -DSPMD -Duse_LARGEFILE -C -v -I MOM6-examples/src/FMS/include -o '{}'.cpp {} \;
find MOM6-examples/src/LM3/land_lad2 -type f -name \*.F90.cpp -exec rename .F90.cpp .f90 {} \;
find MOM6-examples/src/LM3/land_lad2 -type f -name \*.F90 -exec rename .F90 .F90_preCPP {} \;
mkdir -p MOM6-examples/src/AM2
(cd MOM6-examples/src/AM2; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -kk -r ulm -P atmos_coupled atmos_fv_dynamics atmos_param_am3 atmos_shared)
rm -rf MOM6-examples/src/AM2/atmos_fv_dynamics/driver/solo
mkdir -p MOM6-examples/build
(cd MOM6-examples/build; cvs -d :ext:cvs.princeton.rdhpcs.noaa.gov:/home/fms/cvs co -r fre-commands-bronx-7 -P -d site fre/fre-commands/site)

Obviously you will need to rebuild everything after that.

pp analysis scripts fail because some diagnostics are removed from diag_table.MOM6

area_t was removed from ice_ocean_SIS2/OM4_025/diag_table.MOM6 in commit 221f0b4
Some analysis scripts depend on this variable and are failing.

Traceback (most recent call last):
  File "Krasting-SeaIce.py", line 22, in <module>
    yh = fs('area_t').getAxis(0)
  File "/nbhome/jpk/uvcdat-1.5/install/lib/python2.7/site-packages/cdms2/cudsinterface.py", line 44, in __call__
    raise CDMSError, "No such variable, " + id
cdms2.error.CDMSError: No such variable, area_t

Also, ssh cannot be found in ocean_daily.nc

 /nbhome/Niki.Zadeh/ulm_201505_intel14land_awg_v20150923_mom6sis2_2015.10.13/CM4_c96L32_am4g6_2000_sis2_lowmix_intel14/mom6/tools/analysis/calc_variance.py ssh 00010101.ocean_daily.nc /vftmp/Niki.Zadeh/pbs9188005/ulm_201505_intel14land_awg_v20150923_mom6sis2_2015.10.13/CM4_c96L32_am4g6_2000_sis2_lowmix_intel14/gfdl.ncrc2-intel14-prod-openmp/CM4_c96L32_am4g6_2000_sis2_lowmix_intel14_00010101/history_refineDiag/00010101.nc/00010101.ocean_month_refined.nc
Traceback (most recent call last):
  File "/nbhome/Niki.Zadeh/ulm_201505_intel14land_awg_v20150923_mom6sis2_2015.10.13/CM4_c96L32_am4g6_2000_sis2_lowmix_intel14/mom6/tools/analysis/calc_variance.py", line 18, in <module>
    if args.variable not in nc_in.variables: raise Exception('Could not find %s in file "%s"'%(args.variable,args.daily_file))

double_gyre: xh and xq not being written correctly to prog__0001_006.nc

I ran the double_gyre example and all the variables in prog__001_006.nc seem appropriate except xh and xq.

ncdump -v xh prog_001_006.nc gives:
xh = 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 21.75 ;

ncdump -v xq prog_001_006.nc gives:
xq = 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 22 ;

If I run the experiment with a Cartesian grid instead of a spherical one, this issue does not arise. I was able to recreate this behavior on two different machines. I would be glad to share more information if required.

Move logging of layout parameters into new file MOM_parameter_doc.layout

Layout parameters (LAYOUT, IO_LAYOUT, NIPROC, NJPROC, MASKTABLE, SYMMETRIC_MEMORY, etc.) are currently logged in MOM_parameter_doc.all. That is a good idea, but it means that when we test with alternative layouts we get false positives when checking for diffs in committed files. This happens a lot when @Hallberg-NOAA needs to check in actual parameter changes, for which he uses a different layout than @adcroft-gfdl. This has led to repeated resets (e.g. 4059e8e, 20ffb3c). I've been happy to do the resetting because it helps keep me at the top of the contributors list.

The proposal is to redirect layout parameters to MOM_parameter_doc.layout which would be committed for informational purposes but should never have to be changed thereafter, and does not matter for checksums/answers.
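Until such a split exists, the effect can be approximated when diffing: filter the layout-only parameters out of MOM_parameter_doc.all before comparing against the committed copy. A sandboxed sketch (the parameter names come from this issue; the file contents below are invented):

```shell
tmp=$(mktemp -d)
# Invented example of a mixed parameter doc
cat > "$tmp/MOM_parameter_doc.all" <<'EOF'
LAYOUT = 8, 8
IO_LAYOUT = 1, 1
NIPROC = 8
NJPROC = 8
DT = 3600.0
BOUND_SALINITY = True
EOF
# Drop the layout-only parameters before diffing
grep -Ev '^(LAYOUT|IO_LAYOUT|NIPROC|NJPROC|MASKTABLE|SYMMETRIC_MEMORY)' \
  "$tmp/MOM_parameter_doc.all" > "$tmp/MOM_parameter_doc.nolayout"
cat "$tmp/MOM_parameter_doc.nolayout"
```

With the proposed MOM_parameter_doc.layout file, this filtering step would become unnecessary because the layout entries would never appear in MOM_parameter_doc.all in the first place.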

Logging to stdout would remain unchanged.

Need missing mask_table files added to OM4_025/INPUT

The CM4_SIS2 XML refers to mask_tables in ice_ocean_SIS/OM4_025/INPUT that do not exist in the repository. The XML logic also refers to only one of them (3785), so it will need updating once these files are added.

ERROR: A problem with the data file: /ncrc/home1/Alistair.Adcroft/ulm_plus_LowSens_Drag_mom6_2015.01.20/cm4_sis2_compile/src/mom6/ice_ocean_SIS/OM4_025/INPUT/mask_table.3785.90x144
ERROR: A problem with the data file: /ncrc/home1/Alistair.Adcroft/ulm_plus_LowSens_Drag_mom6_2015.01.20/cm4_sis2_compile/src/mom6/ice_ocean_SIS/OM4_025/INPUT/mask_table.1402.72x72
ERROR: A problem with the data file: /ncrc/home1/Alistair.Adcroft/ulm_plus_LowSens_Drag_mom6_2015.01.20/cm4_sis2_compile/src/mom6/ice_ocean_SIS/OM4_025/INPUT/mask_table.622.36x72

FMS commit does not exist in the repository

% git submodule update FMS
fatal: reference is not a tree: 65627cea0e2d939e7599be136ea3c9028a2c0190
Unable to checkout '65627cea0e2d939e7599be136ea3c9028a2c0190' in submodule path 'FMS'

This appears to be a result of commit 7abab01 in MOM6-examples, which refers to a commit in a forked FMS; that only works if we also change the URL in .gitmodules.

As a rule, we do not want to be changing those URLs and referencing temporary forks, so we need to expedite getting that patch added to the master FMS repository.

In the meantime, a workaround is to temporarily use a different remote

cd src/FMS
git remote add temp [email protected]:Zhi-Liang/FMS.git
git fetch temp
git pull temp dev/master

Uninitialised variables being used in MOM_tracer_advect.F90

The variables domore_u and domore_v are read before being initialised in subroutine advect_tracer(). Running the benchmark test under Valgrind gives errors like:

==11103== Conditional jump or move depends on uninitialised value(s)
==11103==    at 0x976DF4: __mom_tracer_advect_MOD_advect_tracer (MOM_tracer_advect.F90:293)
==11103==    by 0xF0306B: __mom_MOD_step_mom (MOM.F90:1076)
==11103==    by 0xA1A1FE: MAIN__ (MOM_driver.F90:396)
==11103==    by 0x1B33249: main (in /short/v45/nah599/more_home/mom6/NOAA-GFDL-MOM6-examples/build/gnu/ocean_only/debug/MOM6)

...

Pull request on the way.

mkmf is now a submodule (with a new relative path)

We have just added mkmf as a sub-module under MOM6-examples/src/.

Repercussions in brief:

  1. Paths to list_paths and mkmf scripts will differ (if you choose to use this mkmf).
  2. Answers (recorded in ocean.stats) will now change as the mkmf sub-module is updated.

Rationale and comments:

  1. The compiler options are no less source code than the Fortran source, and thus need to be recorded with the regression answers (ocean.stats) for a more complete version-control history. Until now we've relied on the wiki pages to record these options.
  2. In the compilation instructions in the MOM6-examples wiki we used to install mkmf under a user created directory build/. Because build/ is not currently in the repository and is part of the user's choice in their own workflow we had a choice of adding mkmf to either tools/ or src/. Following @raphaeldussin's lead in ESMG-configs we opted for src/ because the compiler options are source and mkmf is more than just a tool in this regard.
  3. The old approach of cloning mkmf into build/ will still work but, for GFDLers, you might be using the wrong compiler options needed to obtain the GFDL answers.
  4. If you pull the latest MOM6-examples, then to get the new submodule issue:
cd MOM6-examples/
git submodule init src/mkmf
git submodule update src/mkmf

Thereafter a simple git submodule update will update mkmf along with all the other submodules.

cannot compile with PGI and -openmp

ftn -D_USE_GENERIC_TRACER -DINTERNAL_FILE_NML -D_FILE_VERSION="`git-version-string /lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/src/mom6/src/MOM6/src/core/MOM_PressureForce_analytic_FV.F90`" -I/opt/cray/netcdf/4.2.0/netcdf-pgi/include -I/opt/cray/netcdf/4.2.0/netcdf-pgi/include -i4 -r8 -byteswapio -Mcray=pointer -Mcray=pointer -Mflushz -Mdaz -D_F2000 -O2 -nofma -mp -I/lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/ncrc2.pgi-repro-openmp/exec/ocean_shared -I/lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/ncrc2.pgi-repro-openmp/exec/fms  -c -I/lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/src/mom6/src/MOM6/config_src/dynamic -I/lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/src/mom6/src/MOM6/src/framework    /lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/src/mom6/src/MOM6/src/core/MOM_PressureForce_analytic_FV.F90

PGF90-S-0155-i must appear in a SHARED or PRIVATE clause (/lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/src/mom6/src/MOM6/src/core/MOM_PressureForce_analytic_FV.F90: 708)
PGF90-S-0155-j must appear in a SHARED or PRIVATE clause (/lustre/f1/Niki.Zadeh/testing_20141118/FMS_compile_libs/src/mom6/src/MOM6/src/core/MOM_PressureForce_analytic_FV.F90: 708)

poleward_heat_transport.py and TS_depth_integrals.py fail for 1/2 degree models

From Bill's experiment
~Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/gfdl.ncrc3-intel15-prod-openmp/scripts/analysis/ocn_annual.frepp.0001-0005.printout

/nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/mom6/tools/analysis/poleward_heat_transport.py -g /archive/gold/datasets/OM4_05/mosaic.v20151203.unpacked -o /nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/ocean_0001-0005/heat_transport -l 0001-0005 /archive/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/gfdl.ncrc3-intel15-prod-openmp/pp/ocean_annual/av/annual_5yr//ocean_annual.0001-0005.ann.nc
/nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/mom6/tools/analysis/poleward_heat_transport.py:66: UserWarning: Diffusive temperature term not found. This will result in an underestimation of the heat transport.
  warnings.warn('Diffusive temperature term not found. This will result in an underestimation of the heat transport.')
/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/collections.py:548: FutureWarning: elementwise comparison failed; returning scalar instead, but in the future will perform elementwise comparison
  if self._edgecolors == 'face':
/nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/mom6/tools/analysis/poleward_heat_transport.py:74: FutureWarning: comparison to `None` will result in an elementwise object comparison in the future.
  if vmask != None: HT = HT*vmask
mkdir -p /nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/ocean_0001-0005/heat_salt_0_300m

/nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/mom6/tools/analysis/TS_depth_integrals.py -r /archive/gold/datasets/OM4_05/obs/WOA05_ptemp_salt_annual.v2015.12.03.nc -s 0 -e 300 -g /archive/gold/datasets/OM4_05/mosaic.v20151203.unpacked -o /nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/ocean_0001-0005/heat_salt_0_300m -l 0001-0005 /archive/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/gfdl.ncrc3-intel15-prod-openmp/pp/ocean_annual/av/annual_5yr//ocean_annual.0001-0005.ann.nc
 Variable named temp does not exist in file named /archive/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/gfdl.ncrc3-intel15-prod-openmp/pp/ocean_annual/av/annual_5yr//ocean_annual.0001-0005.ann.nc
         Aborting ...
Traceback (most recent call last):
  File "/nbhome/Bill.Hurlin/ulm_201510_awg_v20160406_mom6_2016.03.22/CM4_c96L32_am4g9_2000_fullaero_sedi_t1_OMp5_lmix_H5_nmle_ndiff_meke.2016.03.22/mom6/tools/analysis/TS_depth_integrals.py", line 41, in <module>

remove vprec from CM4 diag table

We are presently saving vprec (virtual precip associated with SSS restoring) in CM4. This term is, correctly, zero for the CM4 simulations. We should remove vprec from the diag table to save archive space, and remove an array that is identically zero for the coupled simulations.

Isolating the Coriolis term in momentum budget

I am trying to carry out the thickness-weighted averaged momentum budget as derived in Young (2012). It requires separate advection and Coriolis terms, so I was wondering if it is a good idea to isolate the Coriolis term from CAu, gKEu and rvxv.

fv = CAu - gKEu - rvxv
fvbrute = f*(v(i,j) + v(i+1,j) + v(i,j+1) + v(i+1,j+1))/4

These two methods give slightly different results. Is this supposed to happen?

(figures omitted: fv, fu)

These figures show the zonal variation of aforementioned terms at a fixed y (meridional midpoint of the domain) and a fixed zl away from the surface. We're using the Sadourny energy conserving scheme.
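One possible reading of the discrepancy, sketched under assumptions rather than taken from the discrete code: the CAu diagnostic combines the planetary and relative-vorticity accelerations using the Sadourny energy scheme's own averaging of v onto the u-point, so the residual CAu - gKEu - rvxv carries that same averaging, while the hand-computed estimate uses a plain four-point mean with f at the u-point. Symbolically (the overbar notation here is invented for illustration):

```latex
% Sketch of the assumed decomposition (not the discrete code):
CAu = f\,\overline{v}^{\,s} + \mathtt{rvxv} + \mathtt{gKEu},
\qquad
\mathtt{rvxv} = \zeta\,\overline{v}^{\,s},\quad
\mathtt{gKEu} = -\,\partial_x K,
% so the residual is the scheme's own averaged Coriolis term:
CAu - \mathtt{gKEu} - \mathtt{rvxv} = f\,\overline{v}^{\,s}
\;\neq\;
\tfrac{f_{i,j}}{4}\bigl(v_{i,j}+v_{i+1,j}+v_{i,j+1}+v_{i+1,j+1}\bigr)
= \mathtt{fvbrute},
```

since the scheme-specific average \(\overline{v}^{\,s}\) is in general not the plain four-point mean. If that is the cause, small differences between the two estimates would be expected rather than a bug.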

Fix the EddyKineticEnergy.py analysis script to work with 1 deg model

The script is broken for the 1-degree model because it is fed the 1/4-degree gridSpec.

I can fix it to use the diagnostics file ocean_scalar.nc, which is available in the workDir, rather than fetching the original gridSpec file (which would require multiple scripts, one for each resolution).

combine the refineDiag scripts under tools/analysis into one

Due to a limitation of moab it is safer to have at most one refineDiag script in the xml.

The limitation is as follows: FRE adds the pathnames of all refineDiag scripts to an environment variable (on top of some other variables). That ENV then gets prepended to all the commands that appear in the gfdl platform section. The whole thing is then dispatched by moab (on gaea) to run on PAN (at GFDL) as the pp.starter job.
BUT moab has a hard limit on the length of that ENV, and anything longer is simply truncated (most likely in the middle of a command string). Hence the commands that we put in the gfdl platform section could (and did) become useless and lead to PP errors.

We have run into this limitation in our CM4 runs.

Since the refineDiag script pathnames are part of this ENV, it is advisable to have at most one refineDiag script, to save something like 130 characters and avoid these situations as much as possible. A band-aid, I know, but should we rather wait for moab/FRE to fix the issue? That would take a few months at least.

rethink use of /nbhome for cloning mom6

/nbhome/$USER is limited to 10GB and can easily fill up; when that happens, the analysis scripts fail.
I already went over quota on /nbhome because of the figures and the following items, even though I do not have many files there. I imagine this is going to be a problem for real model runners.

  • Some analysis tools try to build libraries (MIDAS) in that space.
  • Some analysis scripts (TS_depth_integrals.py) save large .nc files in the analysis figure destination, in addition to the small .png files.

Can we try /ptmp instead?

spurious river runoff in OM4

The river runoff used in this test case is spurious. It is being interpolated at runtime from the GOLD-SIS 1-degree runoff on the tripolar grid, which is wrong.

  • We cannot use online regridding to remap the rivers. Instead, we need to do the regridding offline, before the simulation.
  • We need to regrid the rivers from the raw NCAR Dai/Trenberth data onto the OM4 grid.

We can make a 12-month climatology using the IAF regridded rivers now being used by Amy Langenhorst in her OM4-IAF simulations. She did things properly. We need to provide a well defined sequence of steps to clean up this process of getting the river data onto OM4.

Datasets for examples

The "Getting Started" wiki currently only gives instructions for downloading the input datasets from Gaea. I'm trying to set up a new experiment loosely based on the MESO example, but I have little previous experience with GOLD or MOM6. I was hoping I could use the MESO inputs as a template to help me figure out how to format these files.

Is there any way to download these datasets from outside of GFDL?

Christopher Pitt Wolfe

Do we need layer_coord.nc in OM4 tests

In OM4 tests, we have the following settings in MOM_input:

COORD_CONFIG = "file"
COORD_FILE = "layer_coord.nc"
COORD_VAR = "Layer"

However, the model uses ALE; it is not a layered isopycnal model. What function do these COORD settings have in OM4? Can they be removed if they serve no function? They are confusing.

ideal age tracer

We should consider a new way to force the ideal age tracer in MOM6/src/tracer/ideal_age_example.F90. From my reading of the code, it sets the surface value to 0.0. This approach makes us sensitive to vertical resolution: consider the extreme case of a 1 cm upper grid cell, in which the volume of zero-age water is a tiny fraction of what it would be with a 10 m upper cell. I suggest we consider using a damping boundary condition, damping the tracer back to zero, which is how NEMO handles ideal age.

Revert C_P to default values in most experiments

The default value of C_P is a number with many digits, consistent with TEOS-10, but we are using a different (round) number in almost all experiments.

  • Apart from some legacy experiments there is no obvious reason to continue using the round value, and we should start using the default value.
  • C_P is shown as non-default in almost all experiments. Switching to the default value would shorten the list of non-default parameters.
  • The change should be evaluated in OM4_025 and 1-degree equivalent.

fix executable permission issue of python analysis scripts for post-processing

When an experiment starts on gaea, the whole mom6/ directory (minus .datasets) is copied with gcp to GFDL /nbhome/$USER for PP.
But gcp does not preserve the execute (x) permissions of the files. Hence, when the analysis scripts are called we get a permission error like this:

/nbhome/Bonnie.Samuels/ulm_vanilla_dev_awg_mom6_2015.04.20/CM4_c96L48_am4f4_2000/mom6/tools/analysis/EddyKineticEnergy.py -g 00050101.ocean_static.nc -o /nbhome/Bonnie.Samuels/ulm_vanilla_dev_awg_mom6_2015.04.20/CM4_c96L48_am4f4_2000/refineDiag_ocean_annual/ocean_00050101/EddyKineticEnergy -l 00050101 00050101.ocean_daily.nc
/nbhome/Bonnie.Samuels/ulm_vanilla_dev_awg_mom6_2015.04.20/CM4_c96L48_am4f4_2000/mom6/tools/analysis/EddyKineticEnergy.py: Permission denied.

To avoid this, I suggest prefixing every Python script invocation with "python " in the caller shell scripts.
The existing caller scripts are:
mom6/tools/analysis/MOM6_refineDiag.csh
mom6/ice_ocean_SIS/OM4_025/ocn_annual.frepp
mom6/ice_ocean_SIS/OM4_025/ocn_annual_z.frepp
mom6/ice_ocean_SIS/OM4_025/ocn_monthly.frepp

Until this issue is solved, the workaround is to do the following as part of the pp.starter (this is how the issue is currently avoided):
chmod -R +x mom6/

ocean_static.nc from diag_manager has holes in coordinates when using mask_table

Since we run with a mask_table most of the time, the file produced by diag_manager has holes in the coordinates (geolon, geolat) and is useless to Ferret or Python analysis scripts that generate figures.

Should it be replaced by a corrected file that patches the holes before ending up in the archive/history?
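Whatever the mechanism, the patch itself is simple because the coordinates vary smoothly; a sketch (the variable name geolon and the use of NaN for the holes are assumptions about the file) that fills holes by linear interpolation along each row:

```python
import numpy as np

def fill_coord_holes(coord):
    """Fill NaN holes in a 2-D coordinate array by linear interpolation
    along each row, recovering the values the masked-out PEs never wrote."""
    out = coord.copy()
    for j in range(out.shape[0]):
        row = out[j, :]           # a view into out, so filling it fills out
        bad = np.isnan(row)
        if bad.any() and (~bad).any():
            idx = np.arange(row.size)
            row[bad] = np.interp(idx[bad], idx[~bad], row[~bad])
    return out

# Example: a geolon row with a hole left by a masked-out tile
geolon = np.array([[0.0, 1.0, np.nan, np.nan, 4.0, 5.0]])
print(fill_coord_holes(geolon)[0])  # [0. 1. 2. 3. 4. 5.]
```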

Global_ALE configurations do not reproduce across PE count

Sometime since about October 10th, changes were made to the MOM6 or MOM6-examples code that now cause Global_ALE/z and Global_ALE/layer not to reproduce across processor counts (e.g. 8x8=64 PEs and 10x6=60 PEs give different answers).

Updated build instructions for MOM6 ocean_only

A recent modification to MOM6 requires a slightly different compile procedure.

We now require some source code from the FMS/coupler directory that was previously filtered out at the list_paths step, so we have updated the compile instructions on the wiki. If you have scripted the wiki commands, look for the lines that filtered out files with egrep. Specifically, you should no longer use the line

mv path_names path_names.orig; egrep -v "coupler" path_names.orig > path_names;

After commit NOAA-GFDL/MOM6@3ad1d9e (August 5), if you see compiler errors mentioning "ensemble_manager", you need to remake your Makefiles following the revised instructions.

Need help with grid configuration

I have a setup with a spherical grid being forced by restoring buoyancy at the surface. This setup runs great.

For my future runs, because my focus is on the eastern boundary (EB), I want to set up the grid such that zonal resolution is high near EB and becomes coarse as we go towards the west. Not being familiar with mosaic, I decided to meddle around with MOM_grid_initialization.F90 to achieve this.

I have questions about dxC and dxB. In set_grid_metrics_spherical, dlon is constant throughout. I have a varying dlon, which is used to set up B points. I compute T points as simply the midpoints of successive B points. So, in this case, how do I calculate dxC and dxB? Should they be centered, forward or backward?

Here's the routine I modified from set_grid_metrics_spherical. I don't know if there is a better way to do this; maybe mosaic? It would be great if someone more proficient with MOM6 could help me out here. Please excuse me if I'm asking for too much.

Suyash Bire
Stony Brook University

subroutine set_grid_metrics_spherical_nonuniformx(G, param_file)
  type(ocean_grid_type), intent(inout) :: G
  type(param_file_type), intent(in)    :: param_file
! Arguments:
!  (inout)   G - The ocean's grid structure.
!  (in)      param_file - A structure indicating the open file to parse for
!                         model parameter values.

!    Calculate the values of the metric terms that might be used
!  and save them in arrays.
!    Within this subroutine, the x- and y- grid spacings and their
!  inverses and the cell areas centered on h, q, u, and v points are
!  calculated, as are the geographic locations of each of these 4
!  sets of points.
  real :: PI, PI_180! PI = 3.1415926... as 4*atan(1)
  integer :: i, j, isd, ied, jsd, jed
  integer :: is, ie, js, je, Isq, Ieq, Jsq, Jeq, IsdB, IedB, JsdB, JedB
  integer :: i_offset, j_offset, N
  real :: grid_latT(G%jsd:G%jed), grid_latB(G%JsdB:G%JedB)
  real :: grid_lonT(G%isd:G%ied), grid_lonB(G%IsdB:G%IedB)
  real :: dLon,dLat,latitude,longitude,dL_di, dLonmin, dLonmax
  character(len=48)  :: mod  = "MOM_grid_init set_grid_metrics_spherical_nonuniformx"

  is = G%isc ; ie = G%iec ; js = G%jsc ; je = G%jec
  isd = G%isd ; ied = G%ied ; jsd = G%jsd ; jed = G%jed
  Isq = G%IscB ; Ieq = G%IecB ; Jsq = G%JscB ; Jeq = G%JecB
  IsdB = G%IsdB ; IedB = G%IedB ; JsdB = G%JsdB ; JedB = G%JedB
  i_offset = G%isd_global - isd; j_offset = G%jsd_global - jsd

  call callTree_enter("set_grid_metrics_spherical_nonuniformx(), MOM_grid_initialize.F90")

!    Calculate the values of the metric terms that might be used
!  and save them in arrays.
  PI = 4.0*atan(1.0); PI_180 = atan(1.0)/45.

  call get_param(param_file, mod, "AXIS_UNITS", G%axis_units, default="degrees")
  if (trim(G%axis_units) == "") G%axis_units = "degrees"
  if (trim(G%axis_units) .ne. "degrees") call MOM_error(FATAL, &
    "MOM_grid_init.F90, set_grid_metrics_simple_spherical_nonuniformx: "// &
    "axis_units must be degrees")
  call get_param(param_file, mod, "SOUTHLAT", G%south_lat, &
                 "The southern latitude of the domain.", units="degrees", &
                 fail_if_missing=.true.)
  call get_param(param_file, mod, "LENLAT", G%len_lat, &
                 "The latitudinal length of the domain.", units="degrees", &
                 fail_if_missing=.true.)
  call get_param(param_file, mod, "WESTLON", G%west_lon, &
                 "The western longitude of the domain.", units="degrees", &
                 default=0.0)
  call get_param(param_file, mod, "LENLON", G%len_lon, &
                 "The longitudinal length of the domain.", units="degrees", &
                 fail_if_missing=.true.)
  call get_param(param_file, mod, "RAD_EARTH", G%Rad_Earth, &
                 "The radius of the Earth.", units="m", default=6.378e6)

  call get_param(param_file, mod, "DLON_MIN", dLonmin, &
                 "The desired minimum grid size in x direction.", units="degrees", &
                 fail_if_missing=.true.)

  N = G%Domain%niglobal
! dLon = G%len_lon/G%Domain%niglobal
! dLonmin is the resolution near the eastern boundary and dLonmax is the resolution 
! near the western boundary. dLonmax has to be calculated based upon domain length,
! number of grid points and dLonmin.
  dLonmax = (PI*G%len_lon - 2.0*(N)*dLonmin)/((N)*(PI-2.0))
  dLat = G%len_lat/G%Domain%njglobal

  do j=G%JsgB,G%JegB
    latitude = G%south_lat + dLat*(REAL(J-(G%jsg-1)))
    G%gridLatB(J) = MIN(MAX(latitude,-90.),90.)
  enddo
  do j=G%jsg,G%jeg
    latitude = G%south_lat + dLat*(REAL(j-G%jsg)+0.5)
    G%gridLatT(j) = MIN(MAX(latitude,-90.),90.)
  enddo
  do i=G%IsgB,G%IegB
    dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
    if (i==G%IsgB) then
      G%gridLonB(I) = G%west_lon + dLon*(REAL(I-(G%isg-1)))
    else
      G%gridLonB(I) = G%gridLonB(I-1) + dLon
    endif
  enddo
  do i=G%isg,G%ieg
!    dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
    if (i==G%isg) then
      G%gridLonT(i) = (G%west_lon + G%gridLonB(I))/2.0
    else
      G%gridLonT(i) = (G%gridLonB(I) + G%gridLonB(I-1))/2.0
    endif
  enddo

  do J=JsdB,JedB
    latitude = G%south_lat + dLat* REAL(J+J_offset-(G%jsg-1))
    grid_LatB(J) = MIN(MAX(latitude,-90.),90.)
  enddo
  do j=jsd,jed
    latitude = G%south_lat + dLat*(REAL(j+J_offset-G%jsg)+0.5)
    grid_LatT(j) = MIN(MAX(latitude,-90.),90.)
  enddo
  do I=IsdB,IedB
    dlon = (dLonmin-dLonmax)*sin(I*PI/2.0/REAL(N)) + dLonmax
    if (i==IsdB) then
      grid_LonB(I) = G%west_lon + dLon*REAL(I+I_offset-(G%isg-1))
    else
      grid_LonB(I) = grid_LonB(I-1) + dLon
    endif
  enddo
  do i=isd,ied
!    dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
    if (i==isd) then
      grid_LonT(i) = (G%west_lon + grid_LonB(I))/2.0
    else
      grid_LonT(i) = (grid_LonB(i) + grid_LonB(i-1))/2.0
    endif
  enddo
!  dL_di = (G%len_lon * 4.0*atan(1.0)) / (180.0 * G%Domain%niglobal)
  do J=JsdB,JedB ; do I=IsdB,IedB
    G%geoLonBu(I,J) = grid_lonB(I)
    G%geoLatBu(I,J) = grid_latB(J)

    dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
! The following line is needed to reproduce the solution from
! set_grid_metrics_mercator when used to generate a simple spherical grid.
!   G%dxBu(I,J) = G%Rad_Earth * COS( G%geoLatBu(I,J)*PI_180 ) * dL_di
    G%dxBu(I,J) = G%Rad_Earth * dLon*PI_180 * COS( G%geoLatBu(I,J)*PI_180 )
    G%dyBu(I,J) = G%Rad_Earth * dLat*PI_180
    G%areaBu(I,J) = G%dxBu(I,J) * G%dyBu(I,J)
  enddo; enddo

  do J=JsdB,JedB ; do i=isd,ied
    G%geoLonCv(i,J) = grid_LonT(i)
    G%geoLatCv(i,J) = grid_latB(J)

     dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
! The following line is needed to reproduce the solution from
! set_grid_metrics_mercator when used to generate a simple spherical grid.
!   G%dxCv(i,J) = G%Rad_Earth * COS( G%geoLatCv(i,J)*PI_180 ) * dL_di
    G%dxCv(i,J) = G%Rad_Earth * (dLon*PI_180) * COS( G%geoLatCv(i,J)*PI_180 )
    G%dyCv(i,J) = G%Rad_Earth * dLat*PI_180
  enddo; enddo

  do j=jsd,jed ; do I=IsdB,IedB
    G%geoLonCu(I,j) = grid_lonB(I)
    G%geoLatCu(I,j) = grid_LatT(j)

    dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
! The following line is needed to reproduce the solution from
! set_grid_metrics_mercator when used to generate a simple spherical grid.
!   G%dxCu(I,j) = G%Rad_Earth * COS( G%geoLatCu(I,j)*PI_180 ) * dL_di
    G%dxCu(I,j) = G%Rad_Earth * dLon*PI_180 * COS( G%geoLatCu(i,J)*PI_180 )
    G%dyCu(I,j) = G%Rad_Earth * dLat*PI_180
  enddo; enddo

  do j=jsd,jed ; do i=isd,ied
    G%geoLonT(i,j) = grid_LonT(i)
    G%geoLatT(i,j) = grid_LatT(j)
    dlon = (dLonmin-dLonmax)*sin(i*PI/2.0/REAL(N)) + dLonmax
! The following line is needed to reproduce the solution from
! set_grid_metrics_mercator when used to generate a simple spherical grid.
!   G%dxT(i,j) = G%Rad_Earth * COS( G%geoLatT(i,j)*PI_180 ) * dL_di
    G%dxT(i,j) = G%Rad_Earth * dLon*PI_180 * COS( G%geoLatT(i,j)*PI_180 )
    G%dyT(i,j) = G%Rad_Earth * dLat*PI_180

!   latitude = G%geoLatCv(i,J)*PI_180             ! In radians
!   dL_di    = G%geoLatCv(i,max(jsd,J-1))*PI_180  ! In radians
!   G%areaT(i,j) = Rad_Earth**2*dLon*dLat*ABS(SIN(latitude)-SIN(dL_di))
    G%areaT(i,j) = G%dxT(i,j) * G%dyT(i,j)
  enddo; enddo

  call callTree_leave("set_grid_metrics_spherical_nonuniformx()")
end subroutine set_grid_metrics_spherical_nonuniformx
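One thing to note about the routine above: the dLonmax expression comes from approximating the sum of the dlon(i) by a continuous integral, so the stretched widths only span LENLON approximately. A quick numerical check (the values of N, LENLON and DLON_MIN are illustrative):

```python
import numpy as np

N, len_lon, dlonmin = 100, 40.0, 0.1  # illustrative N, LENLON, DLON_MIN
dlonmax = (np.pi * len_lon - 2.0 * N * dlonmin) / (N * (np.pi - 2.0))

# Per-cell widths as constructed in the loop over B points above
i = np.arange(1, N + 1)
dlon = (dlonmin - dlonmax) * np.sin(i * np.pi / (2.0 * N)) + dlonmax

# Widths shrink monotonically toward the eastern boundary, as intended,
print(np.all(np.diff(dlon) < 0.0))
# but their sum misses LENLON by ~1% for N=100 (sum vs integral error),
# so the eastern grid edge will not land exactly on WESTLON + LENLON.
print(dlon.sum())
```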

noisy age field in OM4

The ideal age field is noisy in the OM4 test case. @adcroft knows about the problem, which is related to improper treatment of passive tracers in MOM6. Registering a ticket so that this issue is sure to be addressed.

Running tests and interpreting results towards establishing a baseline for regression testing

I've successfully run a couple of test problems on two types of Amazon EC2 instances, and I get consistent results between the two instances. I can also obtain consistent results using the wiki instructions and the Makefile. For one problem, I get different results if I choose a different number of processors. The test case I am currently working with is ocean_only/double_gyre. The results initially differed between the Makefile and the wiki instructions: for the wiki I chose 4 processors, while the Makefile default is 8, and that generated different results. Updating the Makefile to 4 and re-running produces identical results. From this point on, I will just refer to using the instructions on the wiki.

My goal is to establish some sort of baseline for regression testing. I'd also be interested in automatically testing the 2x2 matrix of static vs. dynamic and symmetric vs. non-symmetric. No matter which mode, should we theoretically arrive at the same result? Will a different compiler/CPU produce a different baseline/checksum?

First, let's look at instance details and compilers. I've added a local Ubuntu system (home laptop).

A.
c4.large 2 vCPU 3.75Gb RAM
High frequency Intel Xeon E5-2666 v3 (Haswell) processors optimized specifically for EC2
Fedora Core 21
Compiler: gcc version 4.9.2 20150212 (Red Hat 4.9.2-6) (GCC)
OpenMPI: 1.8.3
Netcdf-fortran-openmpi: 4.2

B.
m4.xlarge 4 vCPU 16Gb RAM
2.4 GHz Intel Xeon® E5-2676 v3 (Haswell) processors
Fedora Core 23
Compiler: gcc version 5.3.1 20160406 (Red Hat 5.3.1-6) (GCC)
OpenMPI: 1.8.8
Netcdf-fortran-openmpi: 4.3.3.1

C.
Laptop 4 CPU 8 Gb RAM
Intel(R) Core(TM) i3-2310M CPU @ 2.10GHz
Ubuntu 14.04
Compiler: gcc version 4.8.4 (Ubuntu 4.8.4-2ubuntu1~14.04.1)
OpenMPI: 1.6.5
Netcdf: 4.1.3

I have carefully run tests with 4 and 8 processors. The results across A, B and C are consistent, but the results differ between 4 and 8 processors/threads. Is it reasonable to see a change with the number of processors? Would this be considered our baseline for running this test?

4 processors:
(rm Depth_list.nc; . ./../../build/gnu/env ; mpirun -np 4 ../../build/gnu/ocean_only/repro/MOM6)

MOM Day 0.000 0: En 1.424331E-13, MaxCFL 0.00000, Mass 5.288178268008E+18 Total Energy: 4126FC775E10DF73 7.5321168372247962E+05 Total Mass: 5.2881782680077681E+18, Change: 0.0000000000000000E+00 Error: 0.00000E+00 ( 0.0E+00)
MOM Day 1.000 72: En 4.580432E-06, MaxCFL 0.00023, Mass 5.288178268008E+18 Total Energy: 42B607A80A851DA8 2.4222139843869656E+13 Total Mass: 5.2881782680077681E+18, Change: -1.7202096100332227E+01 Error: -1.72021E+01 (-3.3E-18) ....

8 processors:
(rm Depth_list.nc; . ./../../build/gnu/env ; mpirun -np 8 ../../build/gnu/ocean_only/repro/MOM6)

MOM Day 0.000 0: En 1.424361E-13, MaxCFL 0.00000, Mass 5.288178268008E+18 Total Energy: 4126FC96B8ADCFE7 7.5322736070108123E+05 Total Mass: 5.2881782680077681E+18, Change: 0.0000000000000000E+00 Error: 0.00000E+00 ( 0.0E+00)
MOM Day 1.000 72: En 4.580432E-06, MaxCFL 0.00023, Mass 5.288178268008E+18 Total Energy: 42B607A80A852EA4 2.4222139843886641E+13 Total Mass: 5.2881782680077681E+18, Change: -1.7202096100332227E+01 Error: -1.72021E+01 (-3.3E-18) ....
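For automating this kind of comparison, the 16-digit hex Total Energy checksum in the stats output is a convenient bitwise fingerprint; a small sketch (the regular expression is keyed to the stdout lines shown above) that extracts and compares them:

```python
import re

CHECKSUM = re.compile(r'Total Energy:\s+([0-9A-F]{16})')

def energy_checksums(text):
    """Return the hexadecimal Total Energy checksums from MOM6 stdout;
    bitwise-identical runs yield identical lists."""
    return CHECKSUM.findall(text)

# Day 1.000 lines from the 4- and 8-processor runs above (abbreviated)
run_np4 = "MOM Day 1.000 72: ... Total Energy: 42B607A80A851DA8 2.4222139843869656E+13 ..."
run_np8 = "MOM Day 1.000 72: ... Total Energy: 42B607A80A852EA4 2.4222139843886641E+13 ..."

# The checksums differ in the low bits: the runs are not bitwise identical.
print(energy_checksums(run_np4) == energy_checksums(run_np8))  # False
```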

MIXEDLAYER_RESTRAT = True + BULKMIXEDLAYER = False.

The following is in OM4_025 MOM_parameter_doc.all

BULKMIXEDLAYER = False ! [Boolean] default = True

MIXEDLAYER_RESTRAT = True ! [Boolean] default = False
! If true, a density-gradient dependent re-stratifying
! flow is imposed in the mixed layer.
! This is only used if BULKMIXEDLAYER is true.

So are we using MIXEDLAYER_RESTRAT, even though BULKMIXEDLAYER = False?

If we are in fact using MIXEDLAYER_RESTRAT, then the comment in the code should be corrected. It is very confusing.
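As an aside, this kind of flag inconsistency is easy to check for mechanically; a sketch (keyed to the MOM_parameter_doc format quoted above) that parses the file into a dict so the combination can be asserted on:

```python
def parse_param_doc(text):
    """Parse 'NAME = value' lines from a MOM_parameter_doc file into a
    dict, discarding the '!'-prefixed comment text."""
    params = {}
    for line in text.splitlines():
        line = line.split('!', 1)[0].strip()  # drop the trailing comment
        if '=' in line:
            name, value = (s.strip() for s in line.split('=', 1))
            params[name] = value
    return params

doc = """\
BULKMIXEDLAYER = False ! [Boolean] default = True
MIXEDLAYER_RESTRAT = True ! [Boolean] default = False
! If true, a density-gradient dependent re-stratifying
! flow is imposed in the mixed layer.
"""
p = parse_param_doc(doc)
# The combination this issue is asking about:
print(p['BULKMIXEDLAYER'], p['MIXEDLAYER_RESTRAT'])  # False True
```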

Relocation of MOM6/SIS2 repositories

The locations of the various MOM6 and SIS2 repositories will soon be moved from the GitHub organization "CommerceGov" to the GitHub organization "NOAA-GFDL". They will all still be on GitHub but at a new URL.

What's changing?

The four existing repositories will have new locations and names as follows:

  • CommerceGov/NOAA-GFDL-MOM6-examples -> NOAA-GFDL/MOM6-examples
  • CommerceGov/NOAA-GFDL-MOM6 -> NOAA-GFDL/MOM6
  • CommerceGov/NOAA-GFDL-SIS2 -> NOAA-GFDL/SIS2
  • CommerceGov/NOAA-GFDL-FMS -> NOAA-GFDL/FMS

Note that the repository names will be shorter as a result of the move.

In addition, the new locations will be publicly visible whereas the current repositories are private and only visible by invitation.

Why move?

The GitHub organization "CommerceGov" was set up by the CIO of the Department of Commerce (DOC), which allowed us to experiment with using GitHub for our code projects. Since then, DOC has formulated guidance that i) only requires DOC-wide projects to reside under "CommerceGov" and ii) allows NOAA/GFDL to collect and manage its own repositories under its own GitHub account "NOAA-GFDL". Keeping our repositories under the "CommerceGov" organization no longer makes sense, especially since we envision more repositories moving to GitHub.

Why public repositories?

  1. We always intended to switch to an open-source model. When to do so was mostly a matter of convenience.
  2. Making a repository visible does not constitute a public release - meaning a sanctioned and supported version of the codes. We are still in development mode and not ready to support a public release.
  3. The private aspect of our repositories has occasionally been an annoyance, for example, when trying to quickly share something with someone who is not on the access list.

How will this affect you?

  1. This move should not break anything. GitHub will re-direct all URLs for you so, as far as we understand things, everything should continue to work as if nothing had changed.
  2. Your forks will become public. If you have any commits on your forks that you would rather not have made public then you should make a local clone of your fork and then delete the fork on GitHub.
  3. Some boilerplate (READMEs with legal sections) will be updated.
  4. Once the move has taken place, we will send instructions on how to update your URLs, even though it is not technically necessary to update URLs.

When?

Sometime in the next few weeks.

a few issues with TS_depth_integrals.py in ocean_annual.frepp

There are the following issues with TS_depth_integrals.py in ocean_annual.frepp.

issues with ocean_annual.frepp

  1. MIDAS has to be built before the tool can be used. This can be done with "make local".

  2. module load gcc should be dropped; it will break the MIDAS build!

  3. module load intel_compilers should be added.

  4. There is no need for module load fre and module load fre-analysis.

  5. PYTHONPATH has to be set before using the tool.

issue with MIDAS/Makefile_GFDL

make local fails. This problem can be fixed by upgrading the MIDAS submodule to the latest version, or minimally by:

-       (${PYTHONPATH:="./local"};python setup_complete.py install --home=$(INSTALL_PATH))
+       (python setup_complete.py install --home=$(INSTALL_PATH))

issues with TS_depth_integrals.py

After MIDAS is built, the tool works interactively on an analysis node, but it fails when submitted as a job to the pp nodes with:

Traceback (most recent call last):                                                                                                                                        
  File "/nbhome/Niki.Zadeh/testing_20141017/CM4_c96L48_am4a1r1_2000climo/mom6/tools/analysis/TS_depth_integrals.py", line 122, in <module>                                
    fig=plt.figure(1,figsize=(8.5,11))                                                                                                                                    
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/pyplot.py", line 423, in figure                            
    **kwargs)                                                                                                                                                             
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/backends/backend_qt4agg.py", line 31, in new_figure_manager
    return new_figure_manager_given_figure(num, thisFig)                                                                                                                  
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/backends/backend_qt4agg.py", line 38, in new_figure_manager_given_figure                                                                                                                                                             
    canvas = FigureCanvasQTAgg(figure)                                                                                                                                    
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/backends/backend_qt4agg.py", line 70, in __init__          
    FigureCanvasQT.__init__( self, figure )                                                                                                                               
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/backends/backend_qt4.py", line 207, in __init__            
    _create_qApp()                                                                                                                                                        
  File "/usr/local/python/2.7.3/lib/python2.7/site-packages/matplotlib-1.3.1-py2.7-linux-x86_64.egg/matplotlib/backends/backend_qt4.py", line 62, in _create_qApp         
    raise RuntimeError('Invalid DISPLAY variable')                                                                                                                        
RuntimeError: Invalid DISPLAY variable          

This can be cured by making sure that matplotlib does not use an X-windows backend, which is achieved by placing

import matplotlib
matplotlib.use('Agg')

before

import matplotlib.pyplot as plt
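Putting it together, a batch-safe prologue for the analysis scripts looks like this (a sketch; the figure call mirrors the one in the traceback above):

```python
import io

import matplotlib
matplotlib.use('Agg')            # pick the non-interactive Agg backend;
import matplotlib.pyplot as plt  # must run BEFORE pyplot is first imported

fig = plt.figure(1, figsize=(8.5, 11))
buf = io.BytesIO()
fig.savefig(buf, format='png')   # renders without needing a DISPLAY
print(matplotlib.get_backend())
```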

Update submodule CVmix-src in MOM6

Heads up: the dev/master branch of the MOM6 source code has been updated to use a newer version of CVMix. Use

cd src/MOM6/
git submodule update

to update the submodule when you next do a pull on the source code.

Pacific Equatorial Undercurrent defined wrong in diag_table

The OM4_025 MOM6 diag_table appears to specify a point rather than a section:

# Pacific Equatorial Undercurrent: general settings 
"ocean_model", "thetao", "thetao", "ocean_Pacific_undercurrent",  "all", "mean", "-155. -155. -3. -3. -1 -1",2
"ocean_model", "so",   "so",       "ocean_Pacific_undercurrent",  "all", "mean", "-155. -155. -3. -3. -1 -1",2
"ocean_model", "umo",  "umo",      "ocean_Pacific_undercurrent",  "all", "mean", "-155. -155. -3. -3. -1 -1",2
"ocean_model", "uo",   "uo",       "ocean_Pacific_undercurrent",  "all", "mean", "-155. -155. -3. -3. -1 -1",2
"ocean_model", "e",    "e",        "ocean_Pacific_undercurrent",  "all", "mean", "-155. -155. -3. -3. -1 -1",2

The latitude range should be -3. to 3., not -3. to -3.
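With that fix, the first entry, for example, would read (assuming only the latitude bounds change):

```
"ocean_model", "thetao", "thetao", "ocean_Pacific_undercurrent",  "all", "mean", "-155. -155. -3. 3. -1 -1",2
```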

Similarly, the section transport analysis scripts should only sum positive values down to a depth of 400 m.
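The transport recipe in that last sentence amounts to the following sketch (array names and numbers are made up; in practice umo would come from the section file above):

```python
import numpy as np

def undercurrent_transport(umo, z, max_depth=400.0):
    """Eastward undercurrent transport through a meridional section:
    keep only the positive (eastward) cells, and only those shallower
    than max_depth, then sum. umo is (nz, ny) in kg/s; z is (nz,) in m."""
    shallow = z <= max_depth
    eastward = np.where(umo > 0.0, umo, 0.0)
    return eastward[shallow, :].sum()

z = np.array([100.0, 300.0, 500.0])       # layer-center depths (m)
umo = np.array([[2.0, -1.0],              # mixed eastward/westward flow
                [3.0,  4.0],
                [5.0,  6.0]]) * 1.0e9     # kg/s, illustrative
# The westward (-1) cell and the 500 m layer are excluded from the sum.
print(undercurrent_transport(umo, z))
```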

Two new sub-modules in MOM6-examples/src/

Two new repositories, land_null and atmos_null, were recently added to the NOAA-GFDL GitHub site; they are used in the ice-ocean configurations. Previously we had to supply these components as supplementary code downloaded in tar files. Now everything needed to build a SIS2-based ice-ocean model can be obtained via GitHub.

This only affects the ice-ocean builds; it does not affect the ocean-only build instructions. I've updated the cloning and building instructions. I'm still testing them, so please update the wiki if I've missed anything. Fully coupled models still need to access code from inside the GFDL firewall on GitLab.

ATM namelists and data files changes required for using ulm_201505 patch forward

In the ulm patch (ulm_201505), the ATM radiation input files were renamed in the ATM code (hard-coded).
To use this patch and later releases, please copy the following files to somewhere accessible to MOM6-examples so that they can
be used when we want to switch to ulm_201505 and forward.

/lustre/f1/unswept/Niki.Zadeh/archive/input/ulm/ulm_201505_BW_new_sea_esf_aug2013/cns*

These are the same old cns files, only renamed. They don't cause answer changes (provided we pick up
all the required namelist mods). I just checked this for MOM6_CM3G.

Also, the following namelist changes are required:

     <namelist name="cloud_spec_nml">
       cloud_type_form = 'strat'
     </namelist>
     <namelist name="random_number_streams_nml">
    do_legacy_seed_generation = .true. ! siena_201303
        force_use_of_temp_for_seed=.false.
     </namelist>
        <namelist name="radiation_driver_nml">
       rad_time_step= 10800,
       rad_package = 'sea_esf',
       do_clear_sky_pass=.true.,
       renormalize_sw_fluxes=.true.,
       zenith_spec = 'diurnally_varying'
       use_co2_tracer_field = .true.
       using_restart_file = .false.
     </namelist>
     <namelist name="radiation_driver_diag_nml">
      all_step_diagnostics = .true.,
     </namelist>
         <namelist name="radiative_gases_nml">
       verbose = 3
       gas_printout_freq = 240

       time_varying_co2 = .false., ! keep false for historical co2 to avoid fatal error message: rad co2 will be overridden
       co2_variation_type = 'linear',
       co2_dataset_entry = 1990, 1, 1, 0, 0, 0,
       co2_specification_type = 'time_series',
       co2_floor = 100.0E-06,
       co2_ceiling = 1600.0E-06,
       co2_data_source = 'input'

       time_varying_ch4 = .false.,
       ch4_variation_type = 'linear'
       ch4_dataset_entry = 1990, 1, 1, 0, 0, 0,
       ch4_specification_type = 'time_series'
       ch4_data_source = 'input'

       time_varying_n2o = .false.,
       n2o_variation_type = 'linear'
       n2o_dataset_entry = 1990, 1, 1, 0, 0, 0,
       n2o_specification_type = 'time_series'
       n2o_data_source = 'input'

       time_varying_f11 = .false.,
       f11_variation_type = 'linear'
       f11_dataset_entry = 1990, 1, 1, 0, 0, 0,
       f11_specification_type = 'time_series'
       f11_data_source = 'input'

       time_varying_f12 = .false.,
       f12_variation_type = 'linear'
       f12_dataset_entry = 1990, 1, 1, 0, 0, 0,
       f12_specification_type = 'time_series'
       f12_data_source = 'input'

       time_varying_f113 = .false.,
       f113_variation_type = 'linear'
       f113_dataset_entry = 1990, 1, 1, 0, 0, 0,
       f113_specification_type = 'time_series'
       f113_data_source = 'input'

       time_varying_f22 = .false.,
       f22_variation_type = 'linear'
       f22_dataset_entry = 1990, 1, 1, 0, 0, 0,
       f22_specification_type = 'time_series'
       f22_data_source = 'input'

       calc_co2_tfs_on_first_step = .false.,
       calc_co2_tfs_monthly = .true.,
       co2_tf_time_displacement = 0.0,

       calc_ch4_tfs_on_first_step = .true.,
       calc_ch4_tfs_monthly = .false.,
       ch4_tf_time_displacement = 0.0,

       calc_n2o_tfs_on_first_step = .true.,
       calc_n2o_tfs_monthly = .false.,
       n2o_tf_time_displacement = 0.0,

     </namelist>
     <namelist name="shortwave_driver_nml">
       do_cmip_diagnostics = .true.,
       swform = 'esfsw99'
       time_varying_solar_constant = .false.,
       solar_dataset_entry = 1990,1,1,0,0,0,
      </namelist>

     <namelist name="sealw99_nml">
       do_thick = .false.,
       do_nlte = .false.,
       do_lwcldemiss = .true.,
       continuum_form = 'ckd2.1',
       linecatalog_form = 'hitran_2000',
       verbose = 5
     </namelist>

CVMix single column time-depth figures

@Hallberg-NOAA noted that in the time-depth figures for the single-column tests, the diffusivity and temperature fields are not centered properly. For example, Kd_interface should be centered at the interface depth zi, but it is presently centered at zl; the converse is the case for temperature.

@adcroft-gfdl knows how to correct this problem, though it will take some testing to get it right.

We need to update the iPython notebooks for these tests with the corrected method.
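A minimal sketch of the intended centering (the array and axis names are assumed): diffusivity lives on the nz+1 interfaces zi, temperature on the nz layer centers zl, and each field must be plotted against its own vertical axis:

```python
import numpy as np

nz, nt = 10, 24
zi = np.linspace(0.0, 100.0, nz + 1)   # interface depths: nz+1 points
zl = 0.5 * (zi[:-1] + zi[1:])          # layer-center depths: nz points

Kd_interface = np.zeros((nt, nz + 1))  # diffusivity lives on interfaces
temp = np.zeros((nt, nz))              # temperature lives at layer centers

# Pair each field with its own coordinate, e.g.
#   plt.pcolormesh(time, zi, Kd_interface.T)   # NOT zl
#   plt.pcolormesh(time, zl, temp.T)           # NOT zi
# Swapping zi and zl is exactly the half-cell centering error noted above.
print(Kd_interface.shape[1] == zi.size, temp.shape[1] == zl.size)  # True True
```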

add_diurnal_sw

In MOM6-examples, add_diurnal is set to true by default for the ice_ocean_SIS/GOLD_SIS_025 and ice_ocean_SIS/OM4_025 cases.
All other setups (with SIS2) have this value false by default.
We want ADD_DIURNAL_SW to be True for ice-ocean runs (e.g. with CORE2 forcing).

Thanks,
Jasmin

Volunteers needed to follow wiki instructions

The wiki pages have been significantly re-factored (thanks to @nicjhan). We would appreciate it if some brave souls from both GFDL and outside could follow the latest install/build/running instructions to check that everything is correct.

  • Corrections to the wiki are very welcome (you can all edit the wiki).
  • Feedback/suggestions via comments on this issue are also welcome.

Thanks.
