Results (367 - 369 of 964)

Ticket Owner Reporter Resolution Summary
#467 arango arango Fixed IMPORTANT, corrected a problem in mod_netcdf.F
Description

Corrected a problem in mod_netcdf.F when determining the size (Asize) of the array to be read. The value Asize is used when applying land/sea masking values (Aspval) or adding the offset (Aoffset), when appropriate. The assignment of Asize depends on whether the optional arguments start and total are present in the routine family netcdf_get_fvar_xx.

This may also explain complaints that the _FillValue (>=1E+35) values were not removed completely from the boundary conditions during reading. We were not scanning the full compact 1D array due to the incorrect value of Asize.

I finally have time to look at this problem in the debugger and test with different compilers.

Many thanks to Pierre St-Laurent for reporting this problem and for his persistence in getting to the bottom of this issue.

I also fixed a bug in interp_float.F. Around line 198, we need to have instead:

j2=MIN(MAX(Jr+1,1),Mm(ng)+1)

I also corrected a few typos in the documentation. I have been carefully revising the ROMS code for the last month or so.

#468 arango arango Fixed IMPORTANT: Corrected parallel bug in uv3dmix4_geo.h and friends
Description

Revisited the routine uv3dmix4_geo.h and its TLM, RPM, and ADM versions to correct parallel bugs. The NLM version now works in shared-memory and serial runs with partitions. The biharmonic operator for the stress tensor along geopotentials is very tricky! The good news is that I don't have to revisit these routines again: I have already worked out all the logic for nesting.

In src:ticket:466 I fixed the problems in distributed-memory but overlooked shared-memory and serial runs with partitions.

Also corrected an adjoint bug in ad_uv3dmix4_geo.h. I also updated the TLM, RPM, and ADM versions to include code for the options VISC_3DCOEF and UV_U3ADV_SPLIT.

The adjoint version of these routines now passes the operator symmetry sanity checks for WC13.

#469 arango arango Fixed IMPORTANT: Corrected parallel bug in ini_fields.F and friends
Description
  • In src:ticket:178, Kate reported a shared-memory and serial-with-partitions parallel bug in ini_fields.F when computing the vertically-integrated velocities (ubar, vbar). We needed to have the computation of Hz via set_depth in a different parallel region. In that ticket, I introduced a new routine, ini_zeta, which initializes Zt_avg1 to the initial conditions for zeta. The values of Zt_avg1 (time-averaged/filtered zeta) are the ones actually used in set_depth.F. All of this is fine, but I forgot to move the call to set_depth out of ini_fields.F. It is embarrassing... This call needs to be made in a different parallel region in main3d.F:
    !
    !  Initialize free-surface and compute initial level thicknesses and
    !  depths.
    !
    !$OMP PARALLEL DO PRIVATE(thread,subs,tile) SHARED(ng,numthreads)
            DO thread=0,numthreads-1
              subs=NtileX(ng)*NtileE(ng)/numthreads
              DO tile=subs*thread,subs*(thread+1)-1,+1
                CALL ini_zeta (ng, TILE, iNLM)
                CALL set_depth (ng, TILE)
              END DO
            END DO
    !$OMP END PARALLEL DO
    !
    !  Initialize other state variables.
    !
    !$OMP PARALLEL DO PRIVATE(thread,subs,tile) SHARED(ng,numthreads)
            DO thread=0,numthreads-1
              subs=NtileX(ng)*NtileE(ng)/numthreads
              DO tile=subs*(thread+1)-1,subs*thread,-1
                CALL ini_fields (ng, TILE, iNLM)
              END DO
            END DO
    !$OMP END PARALLEL DO
    
  • The stratigraphic contributions to the bathymetry (h) are moved to ini_zeta since they are needed in set_depth when SEDIMENT and SED_MORPH are activated.
  • The routine ini_fields.F was cleaned and reorganized. This included the TLM, RPM, and ADM versions.
  • Corrected a distributed-memory I/O bug in periodic applications for the radiation stress variables Sxx_bar, Sxy_bar, Syy_bar, Sxx, Sxy, Syy, Szx, and Szy. These are all output variables and are not used in the ROMS kernel. Calls to mp_exchange2 and mp_exchange3 are needed for these variables.
  • By the way, the only way to apply periodic boundary conditions in a ROMS distributed-memory (MPI) application with more than one partition in the periodic direction is by a call to one of the mp_exchange routines, since the array elements to be exchanged reside on a different node.
  • Re-organized routine radiation_stress.F to include the new file nearshore_mellor05.h. It turns out that there are various formulations for radiation stresses in the literature. The formulation of Mellor (2005) has some fundamental problems, which were addressed in the Mellor (2008) reformulation. In the future, we plan to make other formulations available in ROMS; this is a very active topic in the current literature. Be aware that the formulation currently distributed in ROMS is the one described by Mellor (2005), which has been superseded by Mellor (2008). The new formulation is still under testing and not yet available. John Warner is working on other formulations as well.
  • To facilitate future developments, the option NEARSHORE_MELLOR is renamed to NEARSHORE_MELLOR05.
  • The driver routine radiation_stress.F will be renamed in the future to a more generic name since it will include other types of wave forcing.
  • Corrected a negative-zero problem in extract_sta.F. Starting with Fortran 95, zero values can be signed (-0 or +0) following the IEEE 754 floating-point standard. This can be advantageous in some computations, but not here, where Ascl is negative and Apos is zero. It produces different output files when comparing serial and distributed-memory applications (1x1 and any other tile partition combination differ). Since comparing serial and parallel output is essential for tracking parallel partition bugs, positive zero is enforced.