ROMS UNSW2008

Installing and Running ROMS for First Time Users

A tutorial for new ROMS users will be held at the UNSW Computer Labs on Monday 30 March 2009, immediately prior to the ROMS Sydney 2009 User Workshop at the Sydney Institute of Marine Sciences, 31 March to 2 April 2009.

Note: This tutorial is intended for complete newcomers to ROMS. It assumes basic knowledge of working in a UNIX environment, and that the essential components required to compile and execute ROMS are already installed on the host computer network. This wiki page borrows heavily from David Robertson's excellent Installing ROMS under Cygwin tutorial, where you will find more information about setting up the required computing environment (compilers, libraries etc.) for ROMS.

In this tutorial, we cover how to download the code, configure it for an application, and run the model. Error messages that arise during the configuration process will be explained so that these can better be debugged when users return to their home institutions and try to work through this process again.

A follow-on tutorial discussing sediment in ROMS will be presented on Friday 03 April.

Note: An important resource you should use as you get started is the Frequently Asked Questions entry in WikiROMS.


Download ROMS

The disk space available on the UNSW Computer Lab machines is quite limited, so for the purposes of this tutorial we have downloaded the ROMS source code to /srv/ckpt/roms/shared/src on host matht001. Instructions below will explain how to point the build.bash script that compiles ROMS to this directory.

To download the code to your own machine, these are the steps you would follow:

  • You must have already registered on the ROMS portal and obtained your ROMS username/password as indicated on the Register page. If you are also interested in access to research branches of the sediment code, indicate an interest in the Community Sediment Transport Modeling System (CSTMS) while you are registering. Once your registration is accepted, or if you are already registered, ask Dave Robertson (robertson at marine.rutgers.edu) to add you as a CSTMS user.
  • Create a src folder where you will keep the ROMS source code. You can place this wherever you wish in your directory tree (here we assume under your home directory "~") and name it whatever you like.
cd ~
mkdir src
  • Check out the ROMS source code replacing bruce with the ROMS user name you registered with.
svn checkout --username bruce https://www.myroms.org/svn/src/trunk src
Note the target directory src at the end of the command. If your code ends up in the wrong place, you may have omitted this.

You will see many lines stream by indicating the files that are being added to your src directory. When it finishes, you can type ls src to see the contents of the directory.

To see the contents of the directory where the code is downloaded for this tutorial, type this:

cd /srv/ckpt/roms/shared
ls src

Customize the Build Script

The ROMS source code comes with a build script in the ROMS/Bin directory. Examples written with bash (build.bash) and csh (build.sh) are provided. The UNSW Computer Lab machines are configured to use bash as the default login shell, so we will work with build.bash. A full description of the build script can be found here.

  • In your home directory (you can use some other directory to organize your ROMS projects if you wish) create a new folder named Projects and change into it.
cd ~
mkdir Projects
cd Projects
  • Create a folder named upwelling and change into it. ROMS is distributed with several Test Cases and the Upwelling example is the default which we will compile and run here.
mkdir upwelling
cd upwelling
  • Copy the build.bash file distributed with ROMS to your Projects/upwelling directory.
cp /srv/ckpt/roms/shared/src/ROMS/Bin/build.bash .

Next we need to configure a few options inside build.bash so that it finds the directories where the source code and your Project are located.

  • Open the build.bash script you just copied into your upwelling directory using your preferred text editor, e.g. vi.
vi build.bash
  • Scroll down until you find ROMS_APPLICATION. You will notice it is set as follows:
export ROMS_APPLICATION=UPWELLING
We do not need to change this. But this is the first thing you will alter when starting your own project. This tells ROMS the name of an include file that will contain all the directives to the C-PreProcessor to configure your application at compile time. The ROMS rule is to change this string to lowercase and append ".h", so the build will search for a file called upwelling.h. It must be in the directory specified by MY_PROJECT_DIR:
  • Scroll down until you find MY_PROJECT_DIR and set it as follows:
export MY_PROJECT_DIR=${HOME}/Projects/upwelling
This obviously assumes you put Projects/upwelling under your home directory.

If you frequently move your ROMS project between hosts where you have a different directory structure, e.g. a temporary scratch space, you can use the MY_ROOT_DIR variable to minimize the changes you make to build.bash.

  • For example:
export MY_ROOT_DIR=/usr/scratch/bruce
export MY_PROJECT_DIR=${MY_ROOT_DIR}/Projects/upwelling

Next we tell build.bash where to find the ROMS source code downloaded from the svn repository (which you can keep up to date with the svn update command - see more on this at LINK). Note that most of the source code changes you make to customize ROMS will be made in your Projects space, and need not be made to the downloaded code directly. We will discuss exceptions to this during the tutorial, and how source code modifications interact with svn.
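For example, to bring an existing checkout up to date you would do something like this (a sketch, assuming the source lives under ${MY_ROOT_DIR}/src as described above):

cd ${MY_ROOT_DIR}/src
svn update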

  • Set MY_ROMS_SRC to the location of the source code:
export MY_ROMS_SRC=/srv/ckpt/roms/shared/src
In practice, you will probably do something more like this:
export MY_ROMS_SRC=${MY_ROOT_DIR}/src
assuming this is the relative path in which you keep your source code on the various machines you work on.

Make sure that MY_CPP_FLAGS is not set. Sometimes this is set in the distributed build.bash example. Comment out options with the # symbol like so:

#export MY_CPP_FLAGS="-DAVERAGES"

The UNSW Computer Lab machines are single core, so we need to tell build.bash not to assume MPI parallel compilation.

  • Comment out the options for USE_MPI and USE_MPIF90
#export USE_MPI=on
#export USE_MPIF90=on
  • If you were compiling in parallel you would leave the default entries in build.bash:
export USE_MPI=on
export USE_MPIF90=on
  • We leave the compiler option at its default because it selects the ifort (Intel Fortran) compiler, which is what we want on the UNSW machines.
export FORT=ifort
  • In the interests of speed for this tutorial, we turn off compiler optimization by activating the debug option:
export USE_DEBUG=on
On the UNSW Lab machines compiling with optimization on will take over 15 minutes, but with optimization off (USE_DEBUG=on) it will be less than 60 seconds.

Save and close the build.bash file.


Copy the input and CPPDEFS options files

We need three more files in Projects/upwelling to configure and run ROMS. We copy the versions downloaded with svn because these are files you will work with locally when you experiment with changes to the test case example configuration.

  • Copy files ocean_upwelling.in, varinfo.dat and upwelling.h into the Projects/upwelling directory you just created.
cd ~/Projects/upwelling
cp /srv/ckpt/roms/shared/src/ROMS/External/ocean_upwelling.in .
cp /srv/ckpt/roms/shared/src/ROMS/External/varinfo.dat .
cp /srv/ckpt/roms/shared/src/ROMS/Include/upwelling.h .

View the file upwelling.h. It contains all the C-Pre-Processor (CPP) options that the compiler interprets to activate certain source code options within ROMS.

View the file ocean_upwelling.in. It contains the options that ROMS reads from standard input at run time - settings that need not be fixed at compile time.

View varinfo.dat. The file varinfo.dat contains descriptions of the names and attributes of input and output variables that ROMS reads and writes from netcdf files. For most applications you will not need to change the entries in this file. If you need to know the default units assumed for different variables, those are noted in this file. (Before we run ROMS, we will need to tell it where to find this file).
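A quick way to list just the CPP directives in the header file is to grep for the define and undef lines (this is only a convenience check; the authoritative list of active options is the one ROMS echoes to STDOUT at run time):

grep -E "^#[[:space:]]*(define|undef)" upwelling.h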

Now we are ready to compile ROMS by executing the build.bash script.

Compile ROMS

Before you run ROMS, you need to compile it to create an executable oceanS file (S for serial or single processor computer), or oceanM file (if using MPI on a parallel computer).

  • Go to your upwelling project directory:
cd ~/Projects/upwelling
  • Then type:
./build.bash
  • If lots of stuff comes on the screen then compilation is proceeding, and may take some time.
  • If the build process ends quickly with an error, then it is likely that build.bash does not point to the correct location for the upwelling.h file, the FORTRAN compiler, or some libraries. We describe common getting started errors and solutions in the next section.
  • You may give the option -j to the build command to distribute the compilation to multiple processors if your host supports this, e.g.:
./build.bash -j 8
to compile on 8 processors at once.

If your build was successful it will not have reported any errors, and there will be an executable file in your Projects/upwelling directory called oceanG. The "G" in the file name indicates build.bash activated the USE_DEBUG option.

If USE_DEBUG were not selected, the executable would be oceanS, where the "S" indicates "serial" or "single-processor" because we deactivated MPI.

If you had activated MPI with the USE_MPI option the executable would be named oceanM.
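Whichever applies, a quick way to confirm which executable the build produced (and how recently) is simply:

ls -l ~/Projects/upwelling/ocean*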

(See also FAQ: My build finished with no errors, where is the ROMS executable?).


Common getting started compile error messages

Getting past the first few errors with compilation is often tricky. Carefully read any error messages you get for clues on what might be wrong with your configuration. Here are some common difficulties new users encounter getting started when first executing the build.bash command.

  • Compilers/../ROMS/Include/cppdefs.h:709:22:
    error: /student/0/a0000020/Projects/upwelling/upwelling.h: No such file or directory
    This says the file upwelling.h is not where Build expects it to be, which is in MY_PROJECT_DIR. You set this to ~/Projects/upwelling.
  • cp: cannot stat `/opt/intelsoft/netcdf/include/netcdf.mod': No such file or directory
    This says that netcdf is not where build.bash expects to find it. Locate the netcdf include and lib directories with steps something like:
    cd /usr
    find . -name netcdf.mod -print
    ./local/netcdf-3.6.2/include/netcdf.mod
    ./local/netcdf/intel/3.6.3/include/netcdf.mod
    This tells us the most recent (3.6.3) netcdf is in /usr/local/netcdf/intel/3.6.3. Direct ROMS to this location by making two changes to build.bash. First, advise ROMS to read your changes to the default library path by uncommenting the option for USE_MY_LIBS.
export USE_MY_LIBS=on
Then specify the correct location for netcdf:
export NETCDF_INCDIR=/usr/local/netcdf/intel/3.6.3/include
export NETCDF_LIBDIR=/usr/local/netcdf/intel/3.6.3/lib
Warning: Be careful where you make this change. You need to make it for the ifort compiler option, and NOT for the USE_NETCDF4 option (we are using netcdf-3). If you've done this correctly, your compilation with build.bash should now succeed.


Run ROMS

You run ROMS by executing the oceanG (or oceanS) binary, giving it the ocean_upwelling.in file as UNIX standard input.

./oceanS < ocean_upwelling.in
ROMS standard output will be typed to the screen. To save it to a file instead, enter, e.g.:
./oceanS < ocean_upwelling.in > my_upwelling.log

If you have compiled a parallel (MPI) executable, the syntax for running the model is slightly but critically different. The ocean_upwelling.in file is no longer read from UNIX standard input (it has to be handled by all the MPI processes) so the "<" disappears from the command, and you need the correct syntax on your UNIX host for running an MPI process. It is probably something like:

mpirun -np 8 ./oceanM ocean_upwelling.in > my_upwelling.log
where the "-np 8" indicates use 8 processors and this number of tiles must have been set by

(See also FAQ: What do I have to do to run ROMS?).


Common getting started run error messages

bash: oceanG: command not found
The working directory is not in your UNIX path. That's why we type "dot-slash" in front of the commands above.


Successful execution

Standard Output

When ROMS runs it will type a lot of information to UNIX standard output. This is the "logfile" you named following the ">", or your terminal if you did not redirect stdout.

STDOUT shows the following:

  • UNIX process info, run time, run TITLE
Process Information:
Thread # 0 (pid= 4449) is active.
Model Input Parameters: ROMS/TOMS version 3.2
Monday - March 23, 2009 - 10:02:39 AM
-----------------------------------------------------------
Wind-Driven Upwelling/Downwelling over a Periodic Channel
  • OS, compiler information, SVN version, and your MY_ROMS_SRC, MY_HEADER_DIR and ROMS_APPLICATION settings
Operating system : Linux
CPU/hardware  : i686
Compiler system  : ifort
Compiler command : /usr/local/intel/fc/10.1.021/bin/ifort
Compiler flags  : -heap-arrays -ip -O3 -pc80 -xW -free

SVN Root URL  : https://www.myroms.org/svn/src/trunk
SVN Revision  : 333
Local Root  : /srv/ckpt/roms/shared/src
Header Dir  : /student/0/a0000020/srv/Projects/upwelling
Header file  : upwelling.h
Analytical Dir: /student/0/a0000020/srv/Projects/upwelling

Resolution, Grid 01: 0041x0080x016, Parallel Threads: 1, Tiling: 001x001
Check that these are what you intended. In the last line above:
  • "Grid 01" pertains to future ROMS developments with multiple nested/connected grids,
  • 0041x0080x016 shows the grid size is 41 x 80 x 16 grid points in the I, J and K (vertical) directions
  • The Parallel/Tiling message shows you are using a single process and a single domain tile. When using MPI, this message will describe how many tiles you are using and the MPI processes assigned.
  • Input parameters set in ocean_upwelling.in
Physical Parameters, Grid: 01
=============================

288 ntimes Number of timesteps for 3-D equations.
300.000 dt Timestep size (s) for 3-D equations.
30 ndtfast Number of timesteps for 2-D equations between
each 3D timestep.
...
Output Averages File: ocean_avg.nc
Output Diagnostics File: ocean_dia.nc
  • Then some more about the tiling when running in parallel
  • The C-PreProcessor (CPP) flags set in upwelling.h but AS MODIFIED by ROMS when interpreting and checking the selected CPP options.
Activated C-preprocessing Options:

UPWELLING Wind-Driven Upwelling/Downwelling over a Periodic Channel
ANA_BSFLUX Analytical kinematic bottom salinity flux.
ANA_BTFLUX Analytical kinematic bottom temperature flux.
ANA_GRID Analytical grid set-up.
ANA_INITIAL Analytical initial conditions.
...
You should check that the CPP options displayed here agree with what you intended. For example, if you inadvertently specify more than one horizontal advection scheme option, ROMS will have chosen only one and reported that option here.
  • The preamble in STDOUT continues with information about the space and time discretization: grid spacing, grid volume, Courant number (time step stability) and stiffness (related to s-coordinate accuracy).
  • Then the model starts time stepping:
NL ROMS/TOMS: started time-stepping: (Grid: 01 TimeSteps: 00000001 - 00000288)

STEP Day HH:MM:SS KINETIC_ENRG POTEN_ENRG TOTAL_ENRG NET_VOLUME

0 0 00:00:00 0.000000E+00 6.579497E+02 6.579497E+02 3.884376E+11
DEF_HIS - creating history file: ocean_his.nc
WRT_HIS - wrote history fields (Index=1,1) into time record = 000000
DEF_AVG - creating average file: ocean_avg.nc
DEF_DIAGS - creating diagnostics file: ocean_dia.nc
1 0 00:05:00 3.268255E-13 6.579497E+02 6.579497E+02 3.884376E+11
2 0 00:10:00 6.503587E-12 6.579497E+02 6.579497E+02 3.884376E+11
3 0 00:15:00 4.592307E-11 6.579497E+02 6.579497E+02 3.884376E+11

...
This output indicates several things:
  • the run is programmed to run from time step 1 to 288
  • the run starts at time 00:00:00
  • netcdf output HISTORY, AVERAGES and DIAGNOSTICS files are created. Every time ROMS creates a new netcdf file or writes to an existing file, it reports this to STDOUT
  • output is written to the HISTORY file
  • then global quantities related to the model KE, PE and domain volume are reported on each time step

Note: In 99% of situations, getting started problems with model set-up and configuration can be diagnosed by carefully reading the STDOUT above. Things to look for are:

  • misconfigured CPP options (what you got is not what you thought you asked for)
  • parameter errors (e.g. you activated horizontal mixing but left the coefficient as zero)
  • misnamed output files (that's why the files from your last run got overwritten)
  • irrational choices of grid spacing or time step
  • initial/boundary/forcing data being read from the wrong file, or not read at all (because you selected analytical conditions)

At the conclusion of the run, ROMS reports information about run time:

Elapsed CPU time (seconds):

Thread # 0 CPU: 108.079
Total: 108.079

Nonlinear model elapsed time profile:

Initialization ................................... 0.016 ( 0.0148 %)
Processing of input data ......................... 0.028 ( 0.0259 %)
Processing of output time averaged data .......... 4.312 ( 3.9899 %)
...
  • about the number of output records written to each file
ROMS/TOMS - Output NetCDF summary for Grid 01:
number of time records written in HISTORY file = 00000005
...
  • and the analytical files included
Analytical header files used:

ROMS/Functionals/ana_btflux.h
ROMS/Functionals/ana_grid.h
...
If you used a modified analytical file in your MY_HEADER_DIR it will be reported here and is another thing you should check for consistency with your intentions.


Netcdf file output

As reported above, ROMS created 4 output netcdf files when it ran: ocean_his.nc, ocean_avg.nc, ocean_dia.nc, and ocean_rst.nc. These are, respectively:

  • history records or 'snapshots' of the model state at selected time intervals
  • averages of the model state over selected intervals (not necessarily the same intervals as the history)
  • diagnostics of the model state, the precise contents of which are controlled by CPP options
  • a restart file with everything ROMS needs to restart an application. This is useful if your job crashes at some point and you want to recommence from a previous state without starting over. Typically the restart file is set to keep just 2 time records by continually over-writing the oldest as the run proceeds. This behaviour is controlled in ocean_upwelling.in. Also, when ROMS "blows up" it dumps the ocean state to a 3rd record in the restart file.

You can browse the contents of netcdf files at the UNIX command line with the command ncdump, e.g.

ncdump -h ocean_his.nc | more

Note the use of the "-h" option. This restricts the output from ncdump to be just the header information, or metadata, in the netcdf file. Without the "-h" option you will get the entire contents of the file converted to ascii.

Things to notice when you ncdump the ocean_his.nc file are that it contains all the input parameters (time step, mixing coefficients, s-coordinate parameters, etc.) from ocean.in, the model grid coordinates (x, y, lon, lat, depth, Coriolis parameter, etc.) which may have been computed by the ANA_GRID option or read from an input grid netcdf file, in addition to the actual model output (ocean_time, zeta, u, v, temp, salt).

There are netcdf global attributes that echo much of the information typed to STDOUT. This includes compiler, svn version, and project directory information, and all the CPP options. This is a valuable source of information when returning to a project and trying to figure out what you did! The global attributes metadata show precisely which options were activated when creating the output in this netcdf file.
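For example, to pull just the CPP options out of the global attributes (assuming your ROMS version names the attribute CPP_options, as recent versions do), you could type:

ncdump -h ocean_his.nc | grep -i cpp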


Changing the UPWELLING test case configuration

Compile time changes: upwelling.h

Changes to options that must be set at compilation time are made to the upwelling.h file. These settings are interpreted during the C-PreProcessing step.

To see what the present options are, edit the upwelling.h file:

vi upwelling.h

Recall that the actual options active after this file is interpreted will be typed to STDOUT (the "logfile") and also written to the output netcdf file in the global attributes.

To see all the options that can be set using C-PreProcessor directives, you can browse the cppdefs.h file in the ROMS/Include directory underneath the MY_ROMS_SRC location set in your build.bash. In this case:

cd /srv/ckpt/roms/shared/src
more ROMS/Include/cppdefs.h

The contents of this file are almost entirely comments and are provided to document the options available. For more information consult WikiROMS or the User Forum.

At the very bottom of cppdefs.h you will see a short code segment that loads the actual application options from ROMS_HEADER. This variable is set by the ROMS_APPLICATION value in build.bash.


Run time changes: ocean_upwelling.in

Changes to options that are set at run time are made to the ocean_upwelling.in file.

To see what the present options are, edit the ocean_upwelling.in file:

vi ocean_upwelling.in

Comments at the beginning of this file document the KEYWORD == value syntax.
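For example, the time stepping parameters echoed to STDOUT earlier are set near the top of the file with lines of this form (the values shown are those of the distributed upwelling case):

NTIMES == 288
    DT == 300.0d0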

Comments at the end of file provide brief summaries of what each parameter does.

For more information consult WikiROMS or the User Forum.

Recall that the actual parameter values ROMS uses after reading this file will be typed to STDOUT (the "logfile") and also written to the output netcdf files.

A Realistic model example: LaTTE

This section of the tutorial assumes you have successfully compiled and run the UPWELLING example above. Key concepts you should be comfortable with before you proceed are:

  • you need a new directory where you will keep the files specific to the new application
  • customize build.bash for the new application (copy the build.bash from upwelling because it has all the correct compiler and library settings)
  • set MY_PROJECT_DIR in build.bash to point to the directory for the new application
  • set ROMS_APPLICATION in build.bash to the correct name for the new application
  • if you wish to customize any of the ana_*.h files, copy just the ones you need into the new project directory
  • you don't need to make a copy of the source code for a new application

With these concepts in mind, we proceed by configuring ROMS to run a realistic coastal ocean application that includes open boundaries on 3 sides, open boundary tides and climatological open boundary velocity and tracer (temperature and salinity) conditions, surface meteorological forcing, and initial conditions, all provided by input netcdf files.

The example is called LaTTE_C because it simulates ocean conditions during the Lagrangian Transport and Transformation Experiment conducted on the New Jersey inner shelf in the Spring of 2006. The '_C' denotes a coarse resolution configuration suitable for this training exercise.


Create a latte_c project directory

We have placed the CPP options file latte_c.h, standard input ocean_lattec.in, and a modified varinfo.dat in /srv/ckpt/roms/shared/latte_c/Forward. Make a new Project directory for this new application and copy these 3 files into it, as sketched below. DO NOT copy all the netcdf files from /srv/ckpt/roms/shared/latte_c/in .
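A minimal sequence of commands to do this (a sketch, assuming you keep your projects under ~/Projects as before and reuse the upwelling build.bash as suggested above) is:

mkdir ~/Projects/latte_c
cd ~/Projects/latte_c
cp ~/Projects/upwelling/build.bash .
cp /srv/ckpt/roms/shared/latte_c/Forward/latte_c.h .
cp /srv/ckpt/roms/shared/latte_c/Forward/ocean_lattec.in .
cp /srv/ckpt/roms/shared/latte_c/Forward/varinfo.dat .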

Edit build.bash

Set the correct entries for environment variables that define the user application.

ROMS_APPLICATION=LATTE_C causes build.bash to look for the file latte_c.h in order to set the CPP options

MY_PROJECT_DIR=${HOME}/Projects/latte_c will instruct build.bash where to look for the latte_c.h file.
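In build.bash these two settings would then read:

export ROMS_APPLICATION=LATTE_C
export MY_PROJECT_DIR=${HOME}/Projects/latte_c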

Setting MY_HEADER_DIR would instruct ROMS where to look for the user functional files ana_*.h that over-ride default options. In this example, however, we don’t actually need to modify any of those functionals. This is typical of “realistic” applications where input grid, initial and boundary conditions are provided from data in input netcdf files.

Note: The format of ROMS output files (history, averages and restart) is the same as ROMS input initial conditions and climatology. This means the output of previous runs can become the initial conditions, or 3-D climatology (for nudging), for new runs.

Edit ocean_lattec.in

Open the ocean_lattec.in file in an editor. There are KEYWORDS that define the names of the input netcdf files for applications of this type:

  • GRDNAME is the grid file with coordinates, grid metrics (spacing), bathymetry, land/sea mask and Coriolis
  • ININAME is the initial conditions
  • BRYNAME is the open boundary sea level, velocity and tracer conditions
  • FRCNAME are the tides, river source, and surface meteorological forcing files. Notice there are multiple files; the number of files ROMS is to read is set by the NFFILES parameter. On initialization, ROMS scans this list for each forcing variable it needs, using the first file that contains the necessary variable and shadowing any entry in subsequent files. Therefore, if you want to re-run your model with a new set of wind data but happen to have other wind data in a file with all your other meteorology inputs, just put the new file at the beginning of the list, as sketched below.
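As an illustration only (the file names here are hypothetical), a forcing file list with the new wind file placed first might look like:

NFFILES == 3                          ! number of forcing files
FRCNAME == new_winds.nc \
           frc_lattec_tides.nc \
           frc_lattec_rivers.nc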

There are KEYWORDS that determine the output file names. These are:

  • RSTNAME, HISNAME, AVGNAME, DIANAME, STANAME, FLTNAME .... etc

There are keywords that set how many time steps ROMS takes between writing output. These are:

  • NRST,NHIS,NAVG,NDIA,NSTA ...etc

You can have ROMS write multiple records to each output file at these intervals, but periodically create a new file (to keep file sizes manageable) by setting the keywords:

  • NDEFHIS,NDEFAVG,NDEFDIA


Compile with build.bash

If you have set the entries in build.bash correctly, you compile exactly as before by executing the script.

Watch the output of the build process. You should see that instead of "Projects/upwelling/Build" the compilation is now writing temporary files to "Projects/latte_c/Build". This Build subdirectory is kept separate so you can be working on two projects at once and not confuse things.

cd to the Build subdirectory and look at some of the files there, e.g.

cd ~/Projects/latte_c/Build
more u3dbc_im.f90

This is the file that sets open boundary conditions on 3-d velocity - the Fortran 90 file generated after the C-PreProcessor has done its job. If you find your model is doing things you don't expect, it can be instructive to view the ".f90" file and the corresponding ".F" file (in MY_ROMS_SRC) to see whether the CPP options being processed are what you intended. If not, review your header file (in this case latte_c.h) and the list of CPP options typed to STDOUT. Don't edit the ".f90" file directly because it gets over-written when you recompile.

Before we run latte_c

Browse the latte_c.h options

Use vi or more to browse the CPP options used in this "realistic" application. Things to notice that are different from upwelling.h are:

  • There are no ANA_INITIAL or ANA_WINDS options. If analytical initial and forcing options are not set with a #define then ROMS defaults to reading this information from input netcdf files.
  • The WEST_, EAST_ and SOUTH_ options and NORTHERN_WALL. These options define the open boundary schemes. (The nomenclature of the compass points assumes a grid oriented with west along i=1 and south along j=1.) The upwelling case had closed boundaries to the north and south, and periodic conditions east-west. The open boundary conditions here are set with the following options:
  • WEST_FSCHAPMAN,WEST_M2FLATHER
    etc. indicate “west” side free surface (FS) and depth-averaged velocity/momentum (M2)
  • WEST_M3GRADIENT,WEST_TGRADIENT
    “west side” 3-d velocity/momentum (M3) and all tracers.
  • SSH_TIDES, UV_TIDES
    cause ROMS to add tidal variability in sea level and depth-averaged velocity using the harmonics read from the tides forcing file. See the wikiROMS entry on tides for more information.
  • #ifdef SSH_TIDES
    #define ADD_FSOBC

    This construct is a conditional test that causes prescribed mean or slowly varying sea surface height to be added to the tidal variability.
    The further option ANA_FSOBC means the prescribed value is set by one of the analytical functional include files. If ANA_FSOBC is not defined, ROMS will look for the boundary sea level in a boundary conditions file.
  • BULK_FLUXES
    In this application the surface meteorology forcing files give net shortwave and longwave radiation and the temperature, pressure and humidity conditions in the marine atmospheric boundary layer. These values are converted to air-sea fluxes of heat and momentum according to the Fairall et al. bulk formulae.
  • GLS_MIXING
    This activates the Generalized Length Scale vertical turbulence closure parameterization of Umlauf and Burchard. Parameters in ocean_lattec.in determine details such as whether the closure method is actually k-epsilon, k-kl, etc.
  • UV_PSOURCE, TS_PSOURCE
    These options activate point sources; in this case the inflow of the Hudson River.

Browse input netcdf files

Use ncdump -h to browse the contents of the forcing files and boundary conditions file.


ncdump -h frc_lattec_wrf_Lwrf.nc

ROMS associates the variable names in these forcing files with the appropriate internal variables by consulting the entries in varinfo.dat.

ncdump -h lattec_bndy_uv2d_half.nc

Notice that the boundary data are stored as separate variables for each open segment (east, west, north, south).

Run the latte_c example
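Running this case follows the same pattern as the upwelling example. Assuming you compiled serially with USE_DEBUG=on (so the executable is again oceanG) and you choose to capture the log in a file (the log file name here is just an example), something like this should work from your project directory:

cd ~/Projects/latte_c
./oceanG < ocean_lattec.in > my_lattec.log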

Plotting netcdf output with Matlab

Basics of Matlab-Netcdf

Matlab version 2008b, which is installed on the UNSW Computer Lab machines, has built-in support for reading netcdf files. It uses the built-in Java tools to do this, so installing the machine-dependent set of mexnc routines that was required in earlier versions of Matlab is no longer necessary.

To ease the process of reading netcdf files into the Matlab workspace we have, however, installed a set of Matlab m-files called SNCTOOLS, written by John Evans at Mathworks. These tools should be in your Matlab path. You can verify this in Matlab by entering the command which nc_varget.

nc_varget is the workhorse utility that reads subsets of data from a netcdf file. Many of the Matlab tools that people distribute for working with ROMS output use SNCTOOLS functions, like nc_varget, to provide the interface for reading netcdf files. You won't get much further in this part of the tutorial if you can't get nc_varget to work.

Enter help nc_varget to see the syntax.

Note that nc_varget also supports reading from OPeNDAP/THREDDS data servers in addition to reading from local netcdf files.

Using the roms_wilkin Matlab routines

There are a number of collections of code, including several written for Matlab, designed to provide convenient tools for plotting ROMS output. What you choose to use is a matter of personal preference and the functionality offered.

For this tutorial we show just a few tools out of the set of roms_wilkin Matlab tools described in more detail at the tiddlywiki at http://romsmatlab.tiddlyspot.com and also in this thread on the ROMS forum.

On the UNSW Computer Lab machines you need to add the directory of roms_wilkin Matlab routines to your Matlab path.

Start Matlab:

  • >> editpath


The {z,s,i,j}view.m routines in roms_wilkin make simple plots directly from a ROMS file or OPeNDAP URL by slicing along coordinate directions.

Enter help roms_zview to see the syntax.

For example:



>> g = roms_get_grid(file,file);   % the grid structure
>> % temperature slice for time step nearest to 20JUN2002, at 2-m
>> % depth, with every 3rd velocity vector over-plotted
>> roms_zview(file,'temp','20-Jun-2002',-2,g,3,.1,'k')


"File" could equally well have been an OPeNDAP URL to ROMS output, e.g. >> file='http://server/thredds/dodsC/roms/cblast/2002-050/averages' though you may find accessing this is a bit slow if everyone in class uses this at once from Sydney to New Jersey.