Bug in large spherical nested run?

jpringle
Posts: 107
Joined: Sun Jul 27, 2003 6:49 pm
Location: UNH, USA

Bug in large spherical nested run?

#1 Unread post by jpringle »

NEGLECT THIS MESSAGE -- THE PROBLEM WAS NOT WHERE I SUSPECTED IT WAS. THERE IS SOME USEFUL INFORMATION BELOW ABOUT AREA CALCULATIONS WITH VARIOUS GRID PACKAGES, SO I AM NOT DELETING IT.

I have been debugging, on and off for several weeks, what appears to be a bug (or my stupidity) in a large-domain spherical model run with nesting. I have a 1/7 degree idealized North Atlantic domain with a 1/21 degree western-boundary region nested entirely within it. The northern, southern, and western boundaries of the refined grid are masked; only its eastern boundary is open to the ocean. The equation of state is linear, set so rho = S - T. The depth is uniform at 4000 meters.

When I run the model with no forcing of any kind and uniform density, it blows up in less than a day of model time when two-way nesting is enabled (coarse step 123). When either grid is run alone, or with only one-way coupling, the model is stable. These results are not sensitive to the magnitude of horizontal or vertical mixing, and dramatic reductions of the time step do not change the model time at which the run goes unstable (i.e., halving dt roughly doubles the number of timesteps needed to blow up). The instability is the same whether the model is compiled with gfortran or ifort.

When the model is run with two nested Cartesian grids with uniform grid spacing (and a factor-of-three refinement between coarse and fine), it is stable.

I have checked all the grid metrics, and they appear to be correct. The coarse grid was made with John Warner's mat2roms_mw.m, and the fine grid was extracted from it by a stock version of coarse2fine.m. The error persists with model revision 782, downloaded Sept 8, 2015.

The earliest problematic symptoms show up when NESTING_DEBUG is enabled. For the western-boundary (fine) grid, the ratio of fine- to coarse-grid transport on the eastern boundary is 9.99276761E-01; when the model is run with the nested rectangular grids, the ratio is always exactly 1. This transport mismatch is reflected in a visible free-surface wave moving across the coarse grid away from the eastern edge of the fine grid in the first few timesteps (the high spatial-frequency noise in the nesting region appears to be truncation error in fine2coarse() and its feedback to the fine grid). This wave, shown below, is not present in the Cartesian grids or with one-way coupling (the noise is present in all model runs). The picture is from the 3rd coarse time step, the 9th fine time step.

[Attached image: free-surface wave in the coarse grid radiating from the eastern edge of the fine grid]

Over time, a persistent zeta anomaly develops along the grid boundary; it is strongest in the north, where the curvature of the grid is also strongest. These anomalies then feed back into the other fields. (It is puzzling why the curvature of the earth should affect the calculation of DU_avg2, since for this north/south- and east/west-oriented grid the meridional grid spacing dy (and thus pn) should not vary meridionally. However, due to the way that coarse2fine.m interpolates in space, pn is not exactly uniform meridionally.)
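A minimal check of that non-uniformity (a sketch of my own, assuming MATLAB's built-in ncread, which returns ROMS arrays as (xi, eta), and the attached fine-grid file name):

% How non-uniform is pn along the meridional (eta) direction?
pn = ncread('ocean_grid_WB.nc', 'pn');                 % fine-grid 1/dy, in 1/m
relvar = (max(pn, [], 2) - min(pn, [], 2)) ./ mean(pn, 2);
fprintf('largest meridional variation of pn: %g\n', max(relvar));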

I am at a loss as to how to debug this issue further. I have looked at put_refine2d until my head hurts, and at the code that computes DU_avg2. I would appreciate any suggestions.

I have included the grid files, the contact file, and a zip file with the .in, .h, and ana_* files needed to run the simplest case. They will run against a clean version of the ROMS code. I have included a build.bash file for a Mac running gfortran, but there is nothing custom in it besides the usual library and compiler changes and the change of the model name to JPBASIN.

I would very much appreciate any help.

Jamie
Attachments
baseTwoGrids_contacts.nc
contact file
(20.16 MiB) Downloaded 239 times
ocean_grid_WB.nc
fine grid
(33.47 MiB) Downloaded 345 times
ocean_grid_SVD.nc
coarse grid
(33.56 MiB) Downloaded 310 times
ModelFiles.zip
jpbasin.h, jpBasin.in, build.bash and ana_*h for model
(44.59 KiB) Downloaded 254 times
Last edited by jpringle on Fri Sep 11, 2015 10:50 pm, edited 1 time in total.

arango
Site Admin
Posts: 1347
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University

Re: Bug in large spherical nested run?

#2 Unread post by arango »

Thanks, Jamie, for the detailed information. I have to think about this one. Obviously, the spherical grid is doing something to it. It makes me suspicious of area/volume conservation. I wonder if the grid spacing (area) between coarse and fine grids is conserved in the spherical case. I assume that this is a curvilinear spherical setup.

It will be worth checking pm and pn between the finer and coarser grids. Does the summed area (9 grid points) of the finer grid match that of the coarse grid cell that contains it? If not, what is the order of the mismatch?
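A sketch of that check (my own code, not a ROMS utility, assuming MATLAB's ncread, a refinement factor of 3, fine-grid dimensions divisible by 3, and hypothetical index offsets i0, j0 that should really be taken from the contact file):

% Sum each 3x3 block of fine-grid cell areas and compare to the coarse cell.
pm_c = ncread('ocean_grid_SVD.nc', 'pm');  pn_c = ncread('ocean_grid_SVD.nc', 'pn');
pm_f = ncread('ocean_grid_WB.nc',  'pm');  pn_f = ncread('ocean_grid_WB.nc',  'pn');
area_c = 1 ./ (pm_c .* pn_c);                          % coarse cell areas, m^2
area_f = 1 ./ (pm_f .* pn_f);                          % fine cell areas, m^2
[nx, ny] = size(area_f);                               % assumes nx, ny divisible by 3
fine_sum = squeeze(sum(sum(reshape(area_f, 3, nx/3, 3, ny/3), 1), 3));
i0 = 1;  j0 = 1;                                       % hypothetical offsets of the
                                                       % refinement region in the coarse grid
ratio = fine_sum ./ area_c(i0:i0+nx/3-1, j0:j0+ny/3-1);
fprintf('max area mismatch: %g\n', max(abs(ratio(:) - 1)));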

It seems to me that your application is like a western-intensification problem, right? Make sure that horizontal diffusion and viscosity scale correctly with grid size between the coarse and fine grids. We need to have the same effective viscosity, and perhaps diffusion. Harmonic viscosity scales by the square of the grid size. Another thing to look at is the wind stress: if using analytical functions, you need to have the same wind stress curl.
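As a concrete illustration of the scaling (simple arithmetic with made-up example values, not numbers from any ROMS input file):

% Viscosity scaling across a refinement factor r = 3 (hypothetical values):
r = 3;
nu2_fine   = 100.0;               % harmonic viscosity on the fine grid, m^2/s
nu2_coarse = r^2 * nu2_fine;      % = 900 m^2/s: harmonic scales as dx^2
nu4_fine   = 1.0e8;               % biharmonic viscosity on the fine grid, m^4/s
nu4_coarse = r^4 * nu4_fine;      % = 8.1e9 m^4/s: biharmonic scales as dx^4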

I will check your files when I have some time available.

jpringle
Posts: 107
Joined: Sun Jul 27, 2003 6:49 pm
Location: UNH, USA

Re: Bug in large spherical nested run?

#3 Unread post by jpringle »

Hernan -- I will compare the averaged metrics tomorrow, when I have time to code. Two points first, and then a question.

1) They are curvilinear spherical grids -- the delta-latitude and delta-longitude are both constant, and the grid is north/south and east/west aligned.

2) There is no wind, so the wind and curl match perfectly on all grids :D

And a question: I do not think that horizontal viscosity/diffusion matters in this case, but I am interested in how it should scale. I am using biharmonic diffusion and have set it so that the time-scale of dx-noise dissipation is the same on both grids: the diffusivity and viscosity of the coarse grid are 3^4 = 81 times greater than those of the fine grid (the grid refinement is 3). HOWEVER: this means that the diffusive flux of a feature resolved on both grids will have a factor-of-81 mismatch at the border of the two grids, as the sketch below illustrates. Is this really OK?
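The back-of-envelope version of that worry (hypothetical spacings and viscosities; the damping rate of a wavenumber-k mode under biharmonic mixing is nu4*k^4):

% Grid-scale noise decays at the same rate on both grids, by construction:
dx_f = 5e3;   dx_c = 3*dx_f;            % fine and coarse spacing, m (hypothetical)
nu4_f = 1e8;  nu4_c = 81*nu4_f;         % biharmonic viscosities, m^4/s
tau_f = 1/(nu4_f*(pi/dx_f)^4);          % e-folding time of 2*dx noise, fine grid
tau_c = 1/(nu4_c*(pi/dx_c)^4);          % same on the coarse grid: equal by design
% But a feature resolved on BOTH grids sees an 81-fold jump in damping:
kL = 2*pi/200e3;                        % e.g. a 200 km feature
mismatch = (nu4_c*kL^4)/(nu4_f*kL^4);   % = 81, the damping-rate mismatch at the border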

Jamie

jpringle
Posts: 107
Joined: Sun Jul 27, 2003 6:49 pm
Location: UNH, USA

Re: Bug in large spherical nested run?

#4 Unread post by jpringle »

The area of the 9 rho cells of the fine grid is 0.144% greater than the area of the associated coarse-grid rho cell, with a very weak dependence on latitude. This is the same order of magnitude as the discrepancy in mass flux at the eastern boundary. I am investigating the origin of this issue. I suspect that the metrics are recomputed for the fine grid using a great-circle distance measure; unless the rho points of the fine grid lie on the great circle connecting the coarse points they fall between, the sum of the distances between the fine-grid rho points will not equal the distance between the coarse-grid rho points. A check along these lines is sketched below.
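A toy version of the check (my own haversine implementation, not ROMS's gcircle; the latitude, spacing, and earth radius here are hypothetical):

% Three fine great-circle segments along a latitude line vs. the single
% great-circle distance between the end (coarse) points.
R = 6371e3;                                        % earth radius, m (hypothetical)
gc = @(p1, l1, p2, l2) 2*R*asin(sqrt(sin((p2-p1)/2).^2 + ...
       cos(p1).*cos(p2).*sin((l2-l1)/2).^2));      % haversine, inputs in radians
lat = deg2rad(45)*ones(1, 4);                      % fine rho points along 45N
lon = deg2rad(linspace(0, 1/7, 4));                % one coarse cell, three fine segments
direct = gc(lat(1), lon(1), lat(4), lon(4));       % coarse-point great circle
summed = sum(gc(lat(1:3), lon(1:3), lat(2:4), lon(2:4)));
fprintf('relative mismatch: %g\n', summed/direct - 1);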

This is only a hypothesis for now; I am investigating.

Jamie

arango
Site Admin
Posts: 1347
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University

Re: Bug in large spherical nested run?

#5 Unread post by arango »

I have a new version of the Matlab scripts that I haven't released yet; I am still thinking about them. They are experimental, and I will send them to your e-mail later today. They provide alternatives that compute the metrics via interpolation instead of the great-circle formula.

Please send me an e-mail so I can reply to you with the updated scripts.

jpringle
Posts: 107
Joined: Sun Jul 27, 2003 6:49 pm
Location: UNH, USA

Re: Bug in large spherical nested run?

#6 Unread post by jpringle »

Just to update -- the discrepancy in the area of grid cells between the fine and coarse grids was largely caused by an inconsistency between mat2roms_mw and coarse2fine. The former used sw_dist to compute the distance between grid points; the latter used gcircle. sw_dist uses the "plane sailing" method to estimate distance on a sphere, while gcircle uses a great circle. Switching mat2roms_mw to a great-circle calculation reduced the mismatch by a factor of 10^4, and reduced the error in the boundary mass flux by a similar amount. For reference, the two formulas are sketched below.
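Minimal versions of the two distance measures (my own implementations with a common, hypothetical earth radius; the actual toolbox routines may use different constants, which matters for exactly this kind of mismatch):

% Both take radians in and return meters; the test points are hypothetical.
R = 6371e3;
plane_sailing = @(p1, l1, p2, l2) R*sqrt((p2-p1).^2 + ...
       (cos((p1+p2)/2).*(l2-l1)).^2);              % sw_dist-style "plane sailing"
great_circle  = @(p1, l1, p2, l2) 2*R*asin(sqrt(sin((p2-p1)/2).^2 + ...
       cos(p1).*cos(p2).*sin((l2-l1)/2).^2));      % gcircle-style haversine
p1 = deg2rad(45);  l1 = 0;                         % one 1/7 degree coarse cell
p2 = deg2rad(45 + 1/7);  l2 = deg2rad(1/7);
fprintf('plane sailing %g m, great circle %g m\n', ...
        plane_sailing(p1, l1, p2, l2), great_circle(p1, l1, p2, l2));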

Unfortunately, this fix did not improve the model behavior. :?

Jamie
