two-way nested simulation blowing up with MPI

General scientific issues regarding ROMS

Moderators: arango, robertson

xupeng66
Posts: 79
Joined: Sat Mar 06, 2010 3:38 pm
Location: University of Washington

two-way nested simulation blowing up with MPI

#1 Post by xupeng66

Hello,

I have run an idealized simulation to test how well two-way nesting allows surface waves to propagate across the nesting interface. As shown in the attached animation, my domain consists of a small grid embedded in a large grid; the grid spacing in the small domain (800 m) is one fifth of that in the large domain (4000 m). The simulation is 3D, with linear background temperature stratification over a flat bottom. At the start of the simulation, a Gaussian-shaped mound in sea-surface elevation collapses, generating outward-going surface gravity waves that pass across the small domain and eventually through the open boundaries of the large domain, where the Chapman plus Shchepetkin conditions are applied to the free surface and barotropic velocities while radiation conditions are applied to the baroclinic velocities and temperature.
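
For concreteness, here is a minimal Python sketch of the kind of Gaussian free-surface mound used as the initial condition. Only the 4000 m coarse spacing comes from the description above; the amplitude, e-folding radius, and grid sizes are assumptions for illustration.

Code:

    import numpy as np

    # Illustrative parameters; only the 4000 m coarse spacing is from the post,
    # the rest are assumptions.
    dx = 4000.0           # coarse-grid spacing (m)
    Lm, Mm = 128, 130     # interior rho points (illustrative, non-square)
    A = 0.5               # mound amplitude (m), assumed
    R = 20.0e3            # mound e-folding radius (m), assumed

    x = dx * np.arange(Lm)
    y = dx * np.arange(Mm)
    X, Y = np.meshgrid(x, y)        # shape (Mm, Lm), i.e. (eta_rho, xi_rho)
    x0, y0 = x.mean(), y.mean()     # center the mound in the domain

    # Gaussian mound in sea-surface elevation used as the initial zeta field
    zeta0 = A * np.exp(-((X - x0)**2 + (Y - y0)**2) / R**2)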

As can be seen from the animation, the results are very encouraging: the waves pass through the nesting interface very smoothly despite the large difference in grid spacing between the two domains. However, I found that the simulation blows up very quickly, with potential and total energy turning into NaNs, when the two domains are partitioned across more than one processor with MPI. The blow-up also seems to occur sooner when more tiles are used. Since the simulation runs without a hitch on a single processor, I suspect an error related to the domain partitioning. My grid sizes are 128 by 128 for the large domain and 145 by 160 for the small domain (all numbers are interior rho points). I am wondering if the blow-up occurs because the number of grid points in a given direction (e.g., 145) is not divisible by the number of tiles (e.g., 2) in that direction.
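
To illustrate the divisibility question, here is a rough Python sketch of how an Lm x Mm interior might be split into NtileI x NtileJ MPI tiles. The exact partition ROMS computes may differ slightly, and uneven tile sizes are not necessarily fatal by themselves, so this is only a sanity check on the decompositions being requested; the partition choices are examples, not the ones actually used.

Code:

    # Rough sketch: split n_points interior points into n_tiles tiles,
    # handing any remainder to the first few tiles.  E.g. 145 points over
    # 2 tiles gives 73 + 72.
    def tile_sizes(n_points, n_tiles):
        base, rem = divmod(n_points, n_tiles)
        return [base + 1 if i < rem else base for i in range(n_tiles)]

    grids = [("large grid", 128, 128), ("small grid", 145, 160)]
    partitions = [(2, 1), (2, 2), (4, 2)]    # example (NtileI, NtileJ) choices

    for name, Lm, Mm in grids:
        for NtileI, NtileJ in partitions:
            print(f"{name} {NtileI}x{NtileJ}: "
                  f"I tiles {tile_sizes(Lm, NtileI)}, "
                  f"J tiles {tile_sizes(Mm, NtileJ)}")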

I am curious to hear your thoughts on this blow-up issue.

Thanks a bunch!
Attachments
gauss_mound_eta_nested.gif (1.12 MiB)

xupeng66
Posts: 79
Joined: Sat Mar 06, 2010 3:38 pm
Location: University of Washington

Re: two-way nested simulation blowing up with MPI

#2 Post by xupeng66

Any ideas?

arango
Site Admin
Posts: 1350
Joined: Wed Feb 26, 2003 4:41 pm
Location: DMCS, Rutgers University

Re: two-way nested simulation blowing up with MPI

#3 Post by arango

The reason why nobody replied to your post is that you didn't provide any concrete information about the ROMS or Fortran error. We are not magicians, nor do we have potions that let us see the error written to your standard output.

I have mentioned several times over the years in this forum to never, never, never set up a ROMS application with the same number of points in both directions :!: It is too easy to transpose matrices when preparing the input files for ROMS, especially in nesting, where the connectivity is computed in Matlab. In the past, users have transposed pm, pn, h, and f, with very surprising and unphysical results.
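
One quick way to catch this kind of transposition before running is to check the dimension ordering of the 2-D fields in the grid NetCDF file. Below is a minimal Python sketch, assuming the usual ROMS convention that rho-point fields are stored as (eta_rho, xi_rho); the file name "roms_grd.nc" is a placeholder.

Code:

    from netCDF4 import Dataset

    # "roms_grd.nc" is a placeholder; point this at your grid or nested-grid file.
    with Dataset("roms_grd.nc") as nc:
        print("xi_rho =", nc.dimensions["xi_rho"].size,
              " eta_rho =", nc.dimensions["eta_rho"].size)
        for var in ("h", "pm", "pn", "f"):
            if var in nc.variables:
                dims = nc.variables[var].dimensions
                flag = "OK" if dims == ("eta_rho", "xi_rho") else "<-- possible transposition!"
                print(f"{var}: dims = {dims}  {flag}")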

xupeng66
Posts: 79
Joined: Sat Mar 06, 2010 3:38 pm
Location: University of Washington

Re: two-way nested simulation blowing up with MPI

#4 Post by xupeng66

Thanks a lot! I have finally found time to come back to the simulations. Yes, using different numbers of grid points in the X and Y directions did the trick, and the simulation now runs without a hitch in both serial and parallel modes. This is an important lesson for me going forward. I will also remember to post the error message in the future.
