MPI parallelization


mathieu

MPI parallelization

#1 Post by mathieu »

I am posting this because I wonder whether the MPI parallelization of ROMS could be improved.

The exchange code is in mp_exchange.F and uses the typical mpi_irecv/mpi_send pairs. What about using the slightly more advanced MPI_SENDRECV routine? A priori it moves some of the buffering into the MPI library, and some improvement might come from that.
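
To illustrate what I mean, here is a minimal sketch of an east-west halo swap done with MPI_SENDRECV. It is not the actual mp_exchange.F code; the buffer names, neighbour ranks and buffer length are all hypothetical:

! Minimal sketch (not mp_exchange.F) of an east-west halo swap with
! MPI_SENDRECV.  The buffer names (sendW, sendE, recvW, recvE), the
! neighbour ranks (rankW, rankE) and the length Nbuf are hypothetical.
subroutine ew_halo_sendrecv (sendW, sendE, recvW, recvE, Nbuf,        &
                             rankW, rankE, comm)
  use mpi
  implicit none
  integer, intent(in) :: Nbuf, rankW, rankE, comm
  real(kind=8), intent(in)  :: sendW(Nbuf), sendE(Nbuf)
  real(kind=8), intent(out) :: recvW(Nbuf), recvE(Nbuf)
  integer :: status(MPI_STATUS_SIZE), ierr

  ! Send west, receive from the east.  If a neighbour is
  ! MPI_PROC_NULL (a physical boundary) that half of the call is a
  ! no-op, so no special boundary logic is needed.
  call MPI_SENDRECV (sendW, Nbuf, MPI_DOUBLE_PRECISION, rankW, 1,     &
                     recvE, Nbuf, MPI_DOUBLE_PRECISION, rankE, 1,     &
                     comm, status, ierr)

  ! Send east, receive from the west.
  call MPI_SENDRECV (sendE, Nbuf, MPI_DOUBLE_PRECISION, rankE, 2,     &
                     recvW, Nbuf, MPI_DOUBLE_PRECISION, rankW, 2,     &
                     comm, status, ierr)
end subroutine ew_halo_sendrecv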

The atmospheric model COSMO, which is also finite-difference, uses a Cartesian communicator, i.e. the MPI library knows the process grid of the model. This could matter on architectures where interconnection speeds are non-uniform. The key routines are MPI_CART_RANK and friends.
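
Something along these lines, assuming an NtileI by NtileJ layout; the routine and its variable names are only illustrative, nothing here is taken from ROMS:

! Sketch of building a 2-D Cartesian communicator for an NtileI by
! NtileJ decomposition and looking up the four neighbours.
subroutine build_cart_comm (NtileI, NtileJ, comm, cart_comm,          &
                            rankW, rankE, rankS, rankN)
  use mpi
  implicit none
  integer, intent(in)  :: NtileI, NtileJ, comm
  integer, intent(out) :: cart_comm, rankW, rankE, rankS, rankN
  integer :: dims(2), ierr
  logical :: periods(2)

  dims(1) = NtileI            ! tiles in the I direction
  dims(2) = NtileJ            ! tiles in the J direction
  periods = .false.           ! closed boundaries; .true. for periodic grids
  ! reorder=.true. lets the MPI library place neighbouring tiles on
  ! nearby cores, which is the point when link speeds are non-uniform.
  call MPI_CART_CREATE (comm, 2, dims, periods, .true., cart_comm, ierr)

  ! Neighbour ranks in each direction; boundaries return MPI_PROC_NULL.
  ! All subsequent exchanges should then use cart_comm.
  call MPI_CART_SHIFT (cart_comm, 0, 1, rankW, rankE, ierr)
  call MPI_CART_SHIFT (cart_comm, 1, 1, rankS, rankN, ierr)
end subroutine build_cart_comm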

Another approach, used by the finite-difference estuarine model GETM, is to recognize when a tile of the model consists only of land points and simply not compute on it. This is especially useful in estuarine applications or, in the case of GETM, for the Baltic Sea, where 70% of the grid points are land.
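
The test itself is trivial. A sketch, using the usual ROMS meaning of rmask (0 on land, 1 at sea) but an otherwise hypothetical routine:

! Sketch of a land-tile test in the spirit of GETM: if every point of
! a tile is masked as land, the tile could be dropped from computation.
logical function tile_is_all_land (Istr, Iend, Jstr, Jend,            &
                                   LBi, UBi, LBj, UBj, rmask)
  implicit none
  integer, intent(in) :: Istr, Iend, Jstr, Jend, LBi, UBi, LBj, UBj
  real(kind=8), intent(in) :: rmask(LBi:UBi,LBj:UBj)
  integer :: i, j

  tile_is_all_land = .true.
  do j = Jstr, Jend
    do i = Istr, Iend
      if (rmask(i,j) > 0.5d0) then     ! at least one sea point
        tile_is_all_land = .false.
        return
      end if
    end do
  end do
end function tile_is_all_land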

Yet another approach, used in the wave model WAM, is to shift the positions of the tiles so as to minimize the area taken up by land. This is explained in some detail in Dynamics and Modelling of Ocean Waves and could bring further improvements.
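
A brute-force version of that idea might look like the sketch below, which tries every offset of the partition and keeps the one leaving the fewest tiles that contain sea points; the routine and all names in it are only illustrative:

! Sketch of a WAM-like search for a partition offset that maximizes the
! number of all-land (and therefore skippable) tiles.
subroutine best_tile_offset (Lm, Mm, NtileI, NtileJ, rmask, ioff, joff)
  implicit none
  integer, intent(in)  :: Lm, Mm, NtileI, NtileJ
  real(kind=8), intent(in) :: rmask(Lm,Mm)
  integer, intent(out) :: ioff, joff
  integer :: di, dj, i0, j0, it, jt, Istr, Iend, Jstr, Jend
  integer :: nactive, nbest

  di = (Lm + NtileI - 1)/NtileI        ! nominal tile width
  dj = (Mm + NtileJ - 1)/NtileJ        ! nominal tile height
  nbest = NtileI*NtileJ + 1
  do j0 = 0, dj - 1                    ! candidate shifts of the partition
    do i0 = 0, di - 1
      nactive = 0                      ! tiles containing at least one sea point
      do jt = 1, NtileJ
        Jstr = max(1, (jt-1)*dj + 1 - j0)
        Jend = min(Mm, jt*dj - j0)
        if (jt == NtileJ) Jend = Mm    ! last row of tiles absorbs the remainder
        do it = 1, NtileI
          Istr = max(1, (it-1)*di + 1 - i0)
          Iend = min(Lm, it*di - i0)
          if (it == NtileI) Iend = Lm
          if (Istr <= Iend .and. Jstr <= Jend) then
            if (any(rmask(Istr:Iend,Jstr:Jend) > 0.5d0)) nactive = nactive + 1
          end if
        end do
      end do
      if (nactive < nbest) then        ! keep the offset with the fewest active tiles
        nbest = nactive
        ioff = i0
        joff = j0
      end if
    end do
  end do
end subroutine best_tile_offset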

Has implementing such ideas in ROMS been considered? Some are mutually exclusive, and some would require extensive programming. But maybe something can be done, and possibly one should discard the NtileI/NtileJ model and let ROMS build its own partition.
On the other hand, the simplicity of the ROMS parallelization means that it compiles against virtually any MPI library.

kate

Re: MPI parallelization

#2 Post by kate »

Hernan has been working for quite some time towards a ROMS with nesting and patching. This would allow you to build grids that omit big areas of land. I'm not sure when we can expect to see it, though.
