Custom Query (964 matches)

Results (658 - 660 of 964)

Ticket Owner Reporter Resolution Summary
#792 m.hadfield Fixed Invalid calls to load_i in read_couplepar.F
Description

When ROMS/SWAN coupling is enabled, compilation of ROMS/Utility/read_couplepar.F fails on several blocks of code like this one:

            CASE ('Nthreads(ocean)')
              IF ((0.lt.Iocean).and.(Iocean.le.Nmodels)) THEN
                Npts=load_i(Nval, Rval, 1, Nthreads(Iocean))
              END IF

The fix is to rewrite them like this:

            CASE ('Nthreads(ocean)')
              IF ((0.lt.Iocean).and.(Iocean.le.Nmodels)) THEN
                Npts=load_i(Nval, Rval, 1, Ivalue)
                Nthreads(Iocean)=Ivalue(1)
              END IF

A corrected version is attached.
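
The ticket does not show the declaration of load_i, so the exact cause is not stated; one plausible reason (an assumption on my part) is that load_i is a module function whose output dummy argument is an integer array, so the scalar element Nthreads(Iocean) does not match its rank. The simplified, self-contained sketch below uses made-up declarations (not the actual ROMS interface), declares the output argument as an assumed-shape array to reproduce the rejection, and shows the one-element buffer workaround.

      MODULE demo_load_mod
!  Hypothetical stand-in for load_i: the output dummy argument is an
!  assumed-shape integer array, so callers must pass an array, not a
!  scalar array element.
        IMPLICIT NONE
        integer, parameter :: dp = KIND(1.0d0)
      CONTAINS
        FUNCTION load_i (Nval, Rval, Nout, Vout) RESULT (Npts)
          integer,  intent(in)  :: Nval, Nout
          real(dp), intent(in)  :: Rval(:)
          integer,  intent(out) :: Vout(:)
          integer :: Npts
          Npts=MIN(Nval, Nout)
          Vout(1:Npts)=INT(Rval(1:Npts))
        END FUNCTION load_i
      END MODULE demo_load_mod

      PROGRAM demo
      USE demo_load_mod
      IMPLICIT NONE
      integer, parameter :: Nmodels=2
      integer  :: Nthreads(Nmodels), Ivalue(1), Npts, Iocean
      real(dp) :: Rval(1)
      Iocean=1
      Rval(1)=4.0_dp
!     Npts=load_i(1, Rval, 1, Nthreads(Iocean))   ! rank mismatch, rejected
      Npts=load_i(1, Rval, 1, Ivalue)             ! load into a 1-element buffer
      Nthreads(Iocean)=Ivalue(1)                  ! then copy the scalar out
      PRINT *, 'Nthreads(1) = ', Nthreads(Iocean)
      END PROGRAM demo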

#793 arango Fixed VERY IMPORTANT: Corrected several minor bugs
Description

Several minor bugs were corrected:

  • nesting.F: Fixed the randomly occurring issues associated with the linear time interpolation weights in put_refine2d and put_refine3d:
    !
    !  Set linear time interpolation weights. Fractional seconds are
    !  rounded to the nearest milliseconds integer towards zero in the
    !  time interpolation weights.
    !
          SecScale=1000.0_dp              ! seconds to milliseconds
    !
          Wold=ANINT((RollingTime(tnew,cr)-time(ng))*SecScale,dp)
          Wnew=ANINT((time(ng)-RollingTime(told,cr))*SecScale,dp)
          fac=1.0_dp/(Wold+Wnew)
          Wold=fac*Wold
          Wnew=fac*Wnew 
    
    Notice that the scaling from seconds to milliseconds does not affect the values of the weighting coefficients Wold and Wnew. They still range between 0 and 1 since we multiply and divide by SecScale (a standalone sketch at the end of this list illustrates the cancellation). Many thanks to Takumu Iwamoto for bringing this to my attention.
  • metrics.F: Similar random issues were found when checking the grid refinement time step with MOD(dt(dg), dt(rg)):
    !
    !  Check refined grid time-step.  The time-step size for refined grids
    !  needs to be an exact mode multiple of its coarser donor grid.  In
    !  principle, it can be a "RefineScale" factor smaller.  However, other
    !  integer smaller or larger factor is allowed such that:
    !
    !    MOD(dt(dg)*SecScale, dt(rg)*SecScale) = 0   dg:  donor  coarse grid
    !                                                rg:  receiver fine grid
    !
    !  Notice that SecScale is used to avoid roundoff when the timestep
    !  between donor and receiver grids are small and less than one.
    !
            SecScale=1000.0_dp              ! seconds to milliseconds
            DO ig=1,Ngrids
              IF (RefinedGrid(ig).and.(RefineScale(ig).gt.0)) THEN
                dg=CoarserDonor(ig)
                IF (MOD(dt(dg)*SecScale,dt(ig)*SecScale).ne.0.0_dp) THEN
                  IF (DOMAIN(ng)%SouthWest_Test(tile)) THEN
                    IF (Master) THEN
                      WRITE (stdout,100) ig, dt(ig), dg, dt(dg),            &
         &                               MOD(dt(dg),dt(ig))
    
    The expression MOD(dt(dg)*SecScale, dt(rg)*SecScale) must evaluate to zero so that no error is triggered (see the sketch at the end of this list). Again, many thanks to Takumu Iwamoto for reporting this problem.
  • dateclock.F: Corrected a typo in routine datenum when computing its value in seconds:
    !
    !  Fractional date number (units=second).
    !
          DateNumber(2)=REAL(MyDay,dp)*86400.0_dp+                          &
         &              REAL(MyHour,dp)*3600.0_dp+                          &
         &              REAL(MyMinutes,dp)*60.0_dp+                         &
         &              MySeconds
    
    
    The hour term needs a factor of 3600 (seconds per hour) instead of 360. Many thanks to Rafael Soutelino for reporting this bug.
  • rpcg_lanczos.F: Corrected a parallel bug before calling cg_write_rpcg, which writes several parameters (Jf, Jdata, Jmod, Jopt, Jb, Jobs, Jact, preducv, preducy) that are only known by the master node. We need to broadcast their values to other nodes:
         CALL mp_bcastf (ng, model, Jf)
         CALL mp_bcastf (ng, model, Jdata)
         CALL mp_bcastf (ng, model, Jmod)
         CALL mp_bcastf (ng, model, Jopt)
         CALL mp_bcastf (ng, model, Jobs)
         CALL mp_bcastf (ng, model, Jact)
         CALL mp_bcastf (ng, model, preducv)
         CALL mp_bcastf (ng, model, preducy)
    
    since all the nodes participate in the writing. Many thanks to Andy Moore for bringing this to my attention.
  • mod_scalars.F: Added value for Jerlov water type 7:
            real(r8), dimension(9) :: lmd_mu1 =                             &
         &            (/ 0.35_r8, 0.6_r8,  1.0_r8,  1.5_r8, 1.4_r8,         &
         &               0.42_r8, 0.37_r8, 0.33_r8, 0.00468592_r8 /)
    
    instead of a zero value. Many thanks to Pierre St-Laurent for bringing this to my attention.
  • fennel.h: Added masking for wetting and drying. Many thanks to John Wilkin for reporting this issue.
  • mod_kinds.F: Included cppdefs.h to have access to the defined C-preprocessing options. Many thanks to Aaron Tsang for reporting this problem.
  • ana_psource.h: Improved conditional:
          IF (iic(ng).eq.ntstart(ng).or.(iic(ng).eq.0)) THEN
    
    so that we can have analytical initial conditions when iic(ng)=0. Many thanks to Jamie Pringle for reporting it.
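
As a complement to the nesting.F and metrics.F items above, the standalone sketch below uses made-up numbers (arbitrary time offsets, a 0.3 s donor step, and a 0.1 s refined step; it is not ROMS code) to show that the millisecond scaling cancels in the normalized weights while making the MOD time-step test exact.

      PROGRAM secscale_demo
!  Illustrates the two SecScale usages discussed above with made-up
!  numbers: the scaling cancels in the normalized interpolation weights,
!  and it keeps the MOD time-step test exact for fractional-second steps.
      IMPLICIT NONE
      integer,  parameter :: dp = KIND(1.0d0)
      real(dp), parameter :: SecScale = 1000.0_dp   ! seconds to milliseconds
      real(dp) :: Wold, Wnew, fac, dt_dg, dt_rg
!
!  Time interpolation weights: scaling both terms by SecScale cancels in
!  Wold/(Wold+Wnew); the millisecond rounding only removes sub-millisecond
!  noise.
!
      Wold=ANINT(37.2_dp*SecScale)                  ! new snapshot minus time
      Wnew=ANINT(82.8_dp*SecScale)                  ! time minus old snapshot
      fac=1.0_dp/(Wold+Wnew)
      Wold=fac*Wold
      Wnew=fac*Wnew
      PRINT *, 'Wold, Wnew, sum = ', Wold, Wnew, Wold+Wnew
!
!  Refinement time-step check: 0.3 s is an exact multiple of 0.1 s, but
!  neither value is exactly representable in binary, so the unscaled MOD
!  returns a spurious nonzero remainder while the scaled test is exact.
!
      dt_dg=0.3_dp                                  ! donor (coarse) grid
      dt_rg=0.1_dp                                  ! receiver (fine) grid
      PRINT *, 'unscaled MOD = ', MOD(dt_dg, dt_rg)
      PRINT *, 'scaled   MOD = ', MOD(dt_dg*SecScale, dt_rg*SecScale)
      END PROGRAM secscale_demo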

Also, I updated set_ngfld.F, set_2dfld.F, set_3dfld.F, set_ngfldr.F, set_2dfldr.F, and set_3dfldr.F for the correct double-precision operations when SINGLE_PRECISION is activated. Recall that double precision is needed for a few variables to guarantee accuracy.
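
On that point, here is a generic, self-contained illustration of why time values and interpolation factors should stay in double precision when the model fields are single precision; the kinds, times, and interval below are made up and this is not the actual set_*fld code.

      PROGRAM precision_demo
!  Generic illustration: interpolation factors between two snapshots go
!  bad when the times are held in single precision, because a time of
!  order 3.2E+8 s (about 10 years) has a single-precision resolution of
!  32 s, comparable to the offsets being resolved.
      IMPLICIT NONE
      integer, parameter :: r4 = KIND(1.0)          ! single (model fields)
      integer, parameter :: dp = KIND(1.0d0)        ! double (time, factors)
      real(dp) :: t1, t2, time, fac1, fac2
      real(r4) :: t1_r4, t2_r4, time_r4, fac1_r4
!
!  Two snapshots 1800 s apart, with the current time 300 s past the first.
!
      t1=3.15576e8_dp
      t2=t1+1800.0_dp
      time=t1+300.0_dp
!
!  Double-precision factors: 0.8333... and 0.1666..., as expected.
!
      fac1=(t2-time)/(t2-t1)
      fac2=(time-t1)/(t2-t1)
      PRINT *, 'dp factors: ', fac1, fac2
!
!  The same computation with single-precision times is visibly off
!  because the times are quantized to multiples of 32 s.
!
      t1_r4=REAL(t1,r4)
      t2_r4=REAL(t2,r4)
      time_r4=REAL(time,r4)
      fac1_r4=(t2_r4-time_r4)/(t2_r4-t1_r4)
      PRINT *, 'r4 factor:  ', fac1_r4
      END PROGRAM precision_demo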

#794 arango Done VERY IMPORTANT: Build scripts and standard input
Description

I have wanted to rework the ROMS build scripts for a while to separate the user's customized library paths into a single file that can be included if so desired. Also, some explicit renaming and cleaning was needed because of the ongoing ESMF/NUOPC coupling work.

RENAMING

To avoid ambiguity with other Earth System Model (ESM) components during coupling, the following files were renamed:

  • The ROMS build scripts are renamed to build_roms.bash and build_roms.sh for the BASH and CSH shell environments, respectively.
  • The ROMS executable is renamed to romsS (serial), romsM (distributed-memory, MPI), romsO (shared-memory, OpenMP), and romsG (debugging). The old ocean executable names were too generic and ambiguous if one wants to couple two different ocean models.
  • All the distributed ROMS standard input files are renamed to roms_APP.in, where APP is the application or test case. For example, we now have roms_upwelling.in.
  • All the default filenames are renamed to roms_TYP.nc, where TYP is the type of NetCDF file. Again, the ocean prefix was too generic and ambiguous. For example, we now have roms_avg.nc, roms_his.nc, roms_rst.nc, and so on. Notice that the user still has the choice to change the filenames of ROMS input and output NetCDF files.

Many of the changes in this update are simple svn mv renames. The test cases repository was also completely updated.

CLEANING

  • The SeaIce sub-directory is moved inside the Nonlinear sub-directory, as it should be. This will facilitate writing the tangent linear and adjoint versions of that model in the Tangent, Representer, and Adjoint sub-directories in the future. Moving the SeaIce sub-directory does not affect compiling and linking since the ROMS makefile system is very flexible and powerful. The makefile was changed accordingly.
  • The Atmosphere sub-directory is removed from the repository. The new strategy for ESMF/NUOPC coupling is for the user to subscribe to such repositories and download and install the ESM component. ESM components are compiled and linked separately from ROMS. We need their libraries, and sometimes a few modules and object files, to compile ROMS successfully in coupled applications. ROMS drives the coupling.
  • All the user-customized library paths are now included, if so desired, in a script:
    • Compilers/my_build_paths.sh (CSH script) or
    • Compilers/my_build_paths.bash (BASH script)

Notice that we can specify an alternate directory in the build script. I keep my customized copies in ${HOME}/Compilers/ROMS. In build_roms.bash we have:

# Set path of the directory containing makefile configuration (*.mk) files.
# The user has the option to specify a customized version of these files
# in a different directory than the one distributed with the source code,
# ${MY_ROMS_SRC}/Compilers. If this is the case, you need to keep
# these configuration files up-to-date.

 export         COMPILERS=${MY_ROMS_SRC}/Compilers
#export         COMPILERS=${HOME}/Compilers/ROMS

...

#--------------------------------------------------------------------------
# If applicable, use my specified library paths.
#--------------------------------------------------------------------------

 export USE_MY_LIBS=no            # use system default library paths
#export USE_MY_LIBS=yes           # use my customized library paths

if [ "${USE_MY_LIBS}" = "yes" ]; then
  source ${COMPILERS}/my_build_paths.bash
fi

WHAT IS NEW

A new sub-directory ESM is added at the root containing scripts and modified ESM component files that substitute for the ones distributed in the source repositories, in order to solve various technical issues during coupling (see ESM/Readme for details). The WRF files below were adapted from Version 4.0.3 (December 18, 2018).

  • coupling_esmf.in: Standard input script for ROMS when coupling with the ESMF/NUOPC library. It is well documented and sets up the coupling system. To submit a job we could use, for example:
     mpirun -np 8 romsM coupling_esmf.in >& log &
    
  • coupling_esmf.dat: Coupling metadata defining import and export fields.
  • build_cice.sh: A friendlier CSH script to compile CICE.
  • build_wrf.bash, build_wrf.sh: BASH and CSH compiling scripts for WRF to facilitate easy compiling and linking. They also correct several technical issues with very old ESMF library interference and incorrect NetCDF4 library dependencies.
  • wrf_configure: Reworking of linking NetCDF4 library dependencies. It replaces ${WRF_ROOT_DIR}/configure.
  • wrf_add_configue: Adds CPP macros to rename ESMF and esmf to MYESMF and myesmf to the dependencies of module_domain.o and output_wrf.o to avoid conflicts with newer versions of the ESMF/NUOPC library. It appends to ${WRF_ROOT_DIR}/configure.wrf.
  • wrf_Makefile: Reworking of linking NetCDF4 library dependencies. It replaces ${WRF_ROOT_DIR}/Makefile.
  • wrf_postamble: Reworking of linking NetCDF4 library dependencies. It replaces ${WRF_ROOT_DIR}/arch/postamble.
  • wrf_configure.defaults: Substitutes the obsolete compiler option -openmp with -qopenmp. It renames occurrences of ESMF and esmf to MYESMF and myesmf to avoid conflicts with newer versions of the ESMF library. WRF uses parts of an old version of the ESMF library for its internal time clock, which causes conflicts when coupling with a newer ESMF/NUOPC library. It also adds mixed compiling with Intel/GNU (ifort/gcc) and OpenMPI. It replaces ${WRF_ROOT_DIR}/arch/configure.defaults.
  • wrf_Config.pl: It renames occurrences of ESMF and esmf to MYESMF and myesmf to avoid conflicts with newer versions of the ESMF library. It replaces ${WRF_ROOT_DIR}/arch/Config.pl.
  • wrf_Makefile.esmf: It renames occurrences of ESMF and esmf to MYESMF and myesmf to avoid conflicts with newer versions of the ESMF library. Everything is done during C-preprocessing, so the original files are not modified. It replaces ${WRF_ROOT_DIR}/external/esmf_time_f90/Makefile.
  • wrf_Test1.F90: It corrects a bug in the optional argument to the ESMF_Initialize call, from defaultCalendar to defaultCalKind. It replaces ${WRF_ROOT_DIR}/external/esmf_time_f90/Test1.F90.

I will contact the WRF developers to take care of these problems in future versions. However, these corrections, or similar ones, are still needed for older and current versions of WRF. I provide the full files so the user can compare them and check the nature of the changes.

WARNING

The coupling via ESMF/NUOPC is still not available for use.
