ROMS 2.1 Released

Post by arango »

Finally, I am releasing the long-awaited ROMS/TOMS 2.1 version. We have been working on this version for several months, and it is a very important upgrade to the model. There are several critical bug fixes that will affect your applications; we have discussed some of these bugs previously. Many thanks to those of you who reported them. I also appreciate the help that a few of you gave in developing or improving various parts of the model.

In particular, many thanks to:

Meinte Blaas (Utrecht U., UCLA)
Paul Bissett (FERI)
Heather Crowley (SUNY)
Katja Fennel (Rutgers U.)
Mark Hatfield (NIWA)
Kate Hedstrom (ASRC)
Chris Moore (PMEL)
Daniel Schaffer (NOAA/FSL)
Alexander Shchepetkin (UCLA)
John Warner (USGS)
John Wilkin (Rutgers U.)

Sasha and I spent some time together consolidating the Rutgers and UCLA codes. However, that version of the code is still in the debugging stage and only works for limited applications. It has a more elaborate and efficient kernel that produces essentially similar solutions, but allows larger time-steps, which become important in long (climatological) model integrations. We decided to have two branches of ROMS/TOMS for now: the 2.x and 3.x development branches. The 3.x branch is the consolidated UCLA/Rutgers code, which will be released at a later time. We need to maintain the 2.x branch because we have invested a lot of time building its tangent linear and adjoint models for variational data assimilation (strong and weak constraint 4DVAR), ensemble forecasting, and sensitivity analysis. We hope to release the tangent linear and adjoint codes in the near future; we are currently parallelizing those models.

New developments included in version 2.1:

(1) The Fasham model was completely rewritten and some of its terms have been reformulated. It now has a carbon cycle. Many thanks to Katja Fennel for helping us improve and test this module.

(2) We added a bio-optical model with a variable number of components (up to 84). This model was originally developed by Paul Bissett and is still under testing.

(3) There is a new bottom boundary layer model (MB_BBL) developed in collaboration with Meinte Blaas and John Warner. Many thanks to Meinte for developing this module and for his fixes and testing of the Styles and Glenn bottom boundary layer.

(4) The sediment model was expanded and now has stratigraphy. That is, there are now "Nbed" sediment bed layers. Many thanks to John Warner for improving and testing this module. We also unified the arrays shared between the sediment and bottom boundary layer models. The unification with the biology models will be done in the future. Three idealized examples were added to test several aspects of the sediment model. See options ESTUARY_TEST, LAKE_SIGNELL, and TEST_CHAN.

(5) Computation of momentum and tracer balances. Many thanks to Heather Crowley for her contribution; she carefully devised several ways to code this. This option is activated with the CPP switches DIAGNOSTICS_UV, DIAGNOSTICS_TS, and DIAGNOSTICS_BIO. There is a new NetCDF output file to store these balance terms. Its structure is generic, so other terms can be added easily in the future.

(6) Added quadratic terms to the time-averaged output NetCDF file: <uu>, <uv>, <vv>, <uT>, and <vT>. These will come in handy when computing averaged Reynolds stresses. This option is activated with the CPP switch AVERAGES_QUADRATIC. Also, the surface momentum and heat fluxes can be added to the time-averaged file using the CPP switch AVERAGES_FLUXES.
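As a numerical sketch of how these quadratic averages get used (plain Python, not ROMS code; all variable names and numbers are invented), the eddy covariance needed for an averaged Reynolds stress follows from <u'v'> = <uv> - <u><v>:

```python
# Illustrative only: recover the eddy covariance <u'v'> from the kind
# of time-averaged fields AVERAGES_QUADRATIC would write out.
# Names and values here are hypothetical, not ROMS output variables.

def time_mean(samples):
    return sum(samples) / len(samples)

# Synthetic velocity time series at one grid point
u = [1.0, 1.2, 0.8, 1.1, 0.9]
v = [0.5, 0.7, 0.3, 0.6, 0.4]

uv_mean = time_mean([ui * vi for ui, vi in zip(u, v)])  # <uv>
u_mean = time_mean(u)                                   # <u>
v_mean = time_mean(v)                                   # <v>

# Reynolds stress term: <u'v'> = <uv> - <u><v>
reynolds_uv = uv_mean - u_mean * v_mean

# Cross-check against the direct computation from fluctuations
direct = time_mean([(ui - u_mean) * (vi - v_mean)
                    for ui, vi in zip(u, v)])
assert abs(reynolds_uv - direct) < 1e-12
print(reynolds_uv)
```

The point of storing <uv> alongside <u> and <v> is exactly this: the covariance can be reconstructed offline without saving every snapshot.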

(7) Added the capability to compute isobaric Lagrangian trajectories. Many thanks to John Warner for coding this option. The floats can now be computed as neutral-density 3D Lagrangian particles (Ftype=0) or constant-depth (isobaric) particles (Ftype=1). See the input script for more details.

(8) The driver program of the model was modified to allow sequential and concurrent coupling to atmospheric models. Many thanks to Chris Moore and Daniel Schaffer for helping us develop this capability. The main program is now called master and can be configured via CPP as a stand-alone ocean program or a coupled atmosphere-ocean program. The coupled option is activated with the AIR_OCEAN switch. Currently, it is configured for two-way coupling between WRF and ROMS, which is only possible in distributed-memory configurations. The atmosphere coupler uses the Model Coupling Toolkit (MCT), developed at Argonne National Laboratory, and the WRF I/O API. Both the atmosphere and ocean models are built as libraries.

This coupling option is still in the early development stages, and there are several technical issues that need to be addressed later. The atmosphere and ocean models are run according to MPI processor rank: the communicator is split between atmosphere and ocean parallel nodes. Notice that in ensemble forecasting, the atmosphere and/or the ocean model is run over an ensemble loop, and in variational data assimilation the model is run over outer and inner loops. This requires a different structure than the one coded in air_ocean.F. In ensemble forecasting, full atmosphere-ocean coupling is possible, but each member of the ensemble needs to run on different parallel nodes. Variational data assimilation (4DVAR) is more complicated and requires more thinking.
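The rank-based communicator split described above can be sketched in plain Python (no MPI; rank arithmetic only). The node counts and the convention that low ranks host the atmosphere are assumptions for illustration, not the actual air_ocean.F logic:

```python
# Sketch of splitting a world communicator between atmosphere and
# ocean components by processor rank, mimicking what an
# MPI_Comm_split call with a rank-based "color" does.
# The rank layout below is a made-up example.

N_ATM = 4    # hypothetical number of atmosphere nodes
N_OCN = 8    # hypothetical number of ocean nodes
WORLD = N_ATM + N_OCN

def component(rank):
    """Return the component name and the rank within that
    component's sub-communicator for a given world rank."""
    if rank < N_ATM:
        return "atmosphere", rank         # color 0: atmosphere
    return "ocean", rank - N_ATM          # color 1: ocean

for r in range(WORLD):
    comp, local = component(r)
    print(f"world rank {r:2d} -> {comp} local rank {local}")
```

Each model then runs on its own sub-communicator, exchanging coupling fields only at agreed synchronization times.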

(9) There are several new routines, include files, and input scripts:

Code: Select all

    air_ocean.F                  wrf_io_flags.h
Critical Bugs Fixed:

(1) Bug in horizontal viscosity. This one was reported before, so I assume that you all took care of it. See:


(2) Fixed a parallel bug in periodic boundary applications. Sasha helped me fix this one. The solution was very simple but tricky (only set_bounds.h was changed). I am glad that we solved this problem; it has been in the code since its release last year :P. Now we can have more than one partition in the periodic direction :!:. This also fixes the problem of tiling in serial configurations. Yes, you can also partition serial configurations. It turns out that on some computers the code runs even faster if you partition the domain and run on a single processor. Give it a try and find out for yourself. The solution is now identical regardless of the partitions.

(3) Fixed a critical bug in the point sources/sinks (rivers). You should get different behavior here. The river mass transport was not specified in the time-averaged arrays DU_avg1 and DV_avg1 (see step2d.F). This is very important because these arrays are used in the barotropic/baroclinic coupling; otherwise, the transport effect is lost :?:. I cannot believe that we missed this one. In any application using river runoff, you must specify both temperature and salinity; otherwise, the solution with rivers is unstable. A good approximation for the surface river temperature can be obtained from satellite data. We still recommend putting the river transport in the upper cell. There is an interesting discussion about rivers as a boundary condition in the forum. See:


(4) Fixed a MPI parallel bug in the restart of the floats.
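To make the averaging issue in fix (3) concrete, here is a toy Python sketch (all numbers invented, and greatly simplified relative to step2d.F) of why omitting the river transport from the time-averaged arrays loses its effect in the barotropic/baroclinic coupling:

```python
# Toy illustration of the river-transport bug: the barotropic step
# computes transports over many fast substeps, and their time average
# (DU_avg1 in ROMS) is what the slow baroclinic step actually uses.
# If the river source is left out of the averaging, the coupling
# never sees the river. All numbers below are made up.

NFAST = 10          # barotropic substeps per baroclinic step
Q_RIVER = 100.0     # hypothetical river transport at the source cell
du_interior = 5.0   # hypothetical interior transport per substep

# Buggy accumulation: river omitted from the averaged array
du_avg1_buggy = sum(du_interior for _ in range(NFAST)) / NFAST

# Fixed accumulation: river included at the source cell
du_avg1_fixed = sum(du_interior + Q_RIVER for _ in range(NFAST)) / NFAST

print("without river:", du_avg1_buggy)
print("with river:   ", du_avg1_fixed)
```

The baroclinic step sees only the averaged transport, so in the buggy version the river contribution vanishes entirely from the coupled solution.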

Known Problems:

The only known problem right now is that in realistic applications the model blows up with the Intel Fortran Compiler (ifc), version 7.x. This is clearly a compiler bug. This bug was fixed in version 8 of the compiler (ifort).

Getting the Code or Patch:

You have the choice of applying the patch file (patch_roms-2.0) to convert ROMS 2.0 to 2.1 or getting the full tar file (roms-2.1.tar.gz) for ROMS 2.1 from the web site.

This patch is designed to be applied to ROMS 2.0 using the GNU patch program. It is recommended that you save your adapted ROMS version in case the patch utility fails. Copy your code version to a temporary directory and apply the patch as follows:

% patch < patch_roms-2.0 >& log
% foreach f (*.F *.h *.in *.dat M* I*)
foreach? echo "ooooooooooooooooooooooooooooooooooooooooooooo" >> log
foreach? echo $f >> log
foreach? diff $f ../MyOtherDir/$f >>& log
foreach? end

All the differences between ./ and ../MyOtherDir are stored in the log file. Notice that I put several o's between each file's output to facilitate reading and browsing.
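For those not running csh, the same comparison loop can be written in POSIX sh. This version is self-contained: it builds two tiny toy directories (names are illustrative) so the loop has something to diff; in practice you would run the loop in your patched ROMS directory against ../MyOtherDir.

```shell
#!/bin/sh
# POSIX-sh version of the csh comparison loop above.
# The "new" and "old" toy trees stand in for your patched code
# and your saved copy; their names and contents are invented.

mkdir -p new old
printf 'old line\n' > old/step2d.F
printf 'new line\n' > new/step2d.F
printf 'same\n' > old/mod_param.F
printf 'same\n' > new/mod_param.F

: > log
for f in new/*.F; do
    base=${f#new/}
    echo "ooooooooooooooooooooooooooooooooooooooooooooo" >> log
    echo "$base" >> log
    # Identical files produce no output; differing files are logged
    diff "$f" "old/$base" >> log 2>&1
done
cat log
```

Only step2d.F differs in this toy setup, so its entry in the log carries the diff hunk while mod_param.F shows just the separator and name.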

Good luck and happy computing :)
