Additional Exercises
cindyhg Tue, 10/15/2019 - 11:13
Additional exercises provided with the tutorial are:
- Running Idealized case
- Running Alternate Configuration
- Compiling WRFV3, WPS, UPP, and GSI
Idealized Run
HWRF Idealized
Initial conditions for the HWRF idealized tropical cyclone case are specified using an idealized vortex superposed on a quiescent base-state sounding. The default initial vortex has an intensity of 20 m/s and a radius of maximum winds of 90 km. To initialize the idealized vortex, a nonlinear balance equation in pressure-based sigma coordinates is solved within the rotated latitude-longitude E-grid framework. The default initial ambient base state assumes an f-plane at a latitude of 12.5°. The sea surface temperature is time-invariant and horizontally homogeneous, with the default set to 302 K. No land is used in the simulation domain.
The lateral boundary conditions used in the HWRF idealized simulation are the same as used in real data cases. This inevitably leads to some reflection when gravity waves emanating from the vortex reach the outer domain lateral boundaries.
The idealized simulation uses the operational HWRF triple-nested domain configuration with grid spacing at 18/6/2 km. All the operational atmospheric physics, as well as the supported experimental physics options in HWRF, can be used in the idealized HWRF framework. The UPP (see Chapter 10 of HWRF Users Guide) can be used to postprocess the idealized HWRF simulation output.
This exercise describes the process to implement HWRF v3.9a in the idealized setting. Only the WPS and WRFV3 components are required for the idealized tropical cyclone simulations. The UPP can be used for postprocessing. The other HWRF components do not need to be compiled. Instructions for compiling WPS, WRF, and, if desired, UPP are given in the compilation exercise; however, in the interest of time, pre-compiled codes are available on Cheyenne. Note that the executable file wrf.exe needed for the idealized simulation is not the same as the one needed for simulations with real data. Therefore, users should follow the instructions specific to building the idealized wrf.exe.
Input files
Two GFS GRIB files are needed to provide a template for creating the initial and lateral boundary conditions. One of the GFS GRIB files should be the analysis valid at the same time of the desired HWRF initialization. The other GRIB file should be a forecast, with lead time equal to or greater than the desired HWRF simulation. The meteorological data in these files will not be used to initialize the simulation – these files are for template purposes only.
The GFS files required in this exercise are available on NCAR's Cheyenne (/glade/p/ral/jnt/HWRF/datasets_v3.9a/Idealized/Data).
Other input files required for an idealized run are:
- namelist.wps Namelist file for WPS; Note that geog_data_path should be modified to point to the actual path of the geog data files (/glade/p/ral/jnt/HWRF/datasets_v3.9a/Matthew/fix/hwrf_wps_geo).
- namelist.input Namelist file for WRF
- input.d Vortex description file
- sound.d Sounding data; four sounding files (sound.d, sound_gfdl.d, sound_jordan.d, and sound_wet.d) are provided in WRFV3/test/nmm_tropical_cyclone, but only the one named sound.d will be used. To use a different sounding, rename it to sound.d.
- storm.center Vortex center file
- sigma.d Sigma file
- land.nml Namelist file containing land descriptions
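Before running, it is worth confirming that all of these files are present in the run directory. A minimal pre-flight check along these lines (a sketch; check_inputs is a hypothetical helper, not part of HWRF, and the wrfprd path is an example):

```shell
#!/bin/sh
# Sketch: count how many of the required idealized input files are missing
# from a run directory. check_inputs is a hypothetical helper, not part of HWRF.
check_inputs() {
  dir="$1"
  missing=0
  for f in namelist.wps namelist.input input.d sound.d storm.center sigma.d land.nml; do
    if [ ! -e "$dir/$f" ]; then
      missing=$((missing + 1))
    fi
  done
  echo "$missing"
}
# Example usage (path from this tutorial):
# check_inputs /glade/scratch/${USER}/Idealized/wrfprd
```

A result of 0 means all seven files are in place.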
Create workspace and copy source codes
Create and change into the directories for the idealized run.
mkdir -p /glade/scratch/${USER}/Idealized/wpsprd
mkdir -p /glade/scratch/${USER}/Idealized/wrfprd
setenv WORKDIR /glade/scratch/${USER}/Idealized
cd $WORKDIR
Untar the WRFV3 (compiled for idealized run) and WPS pre-compiled code.
tar -xzvf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WRFV3_Idealized.tar.gz
tar -xzvf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WPSV3.tar.gz
Run WPS to create the ICs and LBCs
Run geogrid
- Copy the WPS namelist into the wpsprd directory.
cd $WORKDIR/wpsprd
cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/namelist.wps ./
- Edit namelist.wps to make sure geog_data_path points to the location of the WPS geographical data files.
geog_data_path='/glade/p/ral/jnt/HWRF/datasets_v3.9a/Matthew/fix/hwrf_wps_geo'
- Link the geogrid table.
ln -fs ${WORKDIR}/WPSV3/geogrid/GEOGRID.TBL.NMM ./GEOGRID.TBL
- Run executable geogrid.exe on the command line or submit it to a compute node or batch system.
${WORKDIR}/WPSV3/geogrid.exe |& tee geogrid.log
- Verify that the output files were created.
ls -l geo_nmm_nest.l0[12].nc geo_nmm.d01.nc
Run ungrib
- Link the ungrib table.
ln -fs ${WORKDIR}/WPSV3/ungrib/Variable_Tables/Vtable.GFS ./Vtable
- Link the two input GFS files.
ln -fs /glade/p/ral/jnt/HWRF/datasets_v3.9a/Idealized/Data/gfs.t12z.pgrb2.0p25.f* ./
ls -l gfs.t12z.pgrb2.0p25.f000 gfs.t12z.pgrb2.0p25.f120
- Link the GFS files to the names expected by ungrib.
${WORKDIR}/WPSV3/link_grib.csh gfs.t12z.pgrb2.0p25.f000 gfs.t12z.pgrb2.0p25.f120
- Run executable ungrib.exe on the command line or submit it to a compute node or batch system.
${WORKDIR}/WPSV3/ungrib.exe |& tee ungrib.log
- Verify that the output files were created.
ls -l GFS:2012-10-26_12 GFS:2012-10-31_12
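The intermediate file names encode the valid times: the analysis at the cycle time and the forecast 120 h later. As a quick cross-check, the expected names can be derived from the cycle time (a sketch; GNU date is assumed):

```shell
#!/bin/sh
# Sketch: derive the expected ungrib intermediate file names from the
# 2012-10-26 12 UTC cycle and the 120-h forecast lead. Assumes GNU date.
cycle="2012-10-26 12:00 UTC"
f_anl=$(date -u -d "$cycle" +'GFS:%Y-%m-%d_%H')
f_fcst=$(date -u -d "$cycle + 120 hours" +'GFS:%Y-%m-%d_%H')
echo "$f_anl $f_fcst"
```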
Run mod_levs
- Run executable mod_levs.exe twice on the command line or submit it to a compute node or batch system. This program reduces the number of vertical levels in the GFS file; only the levels listed in the variable press_pa in namelist.wps will be retained.
${WORKDIR}/WPSV3/util/mod_levs.exe GFS:2012-10-26_12 new_GFS:2012-10-26_12
${WORKDIR}/WPSV3/util/mod_levs.exe GFS:2012-10-31_12 new_GFS:2012-10-31_12
- Verify that the output files were created.
ls -l new_GFS:2012-10-26_12 new_GFS:2012-10-31_12
Run metgrid
- Link the metgrid table.
ln -fs ${WORKDIR}/WPSV3/metgrid/METGRID.TBL.NMM ./METGRID.TBL
- Run executable metgrid.exe on the command line or submit it to a compute node or batch system.
${WORKDIR}/WPSV3/metgrid.exe |& tee metgrid.log
- Verify that the output files were created.
ls -l met_nmm.d01.2012-10-26_12_00_00.nc met_nmm.d01.2012-10-31_12_00_00.nc
Running ideal.exe and wrf.exe
The steps below outline the procedure to create initial and boundary conditions for the idealized simulation. It assumes that the run will be conducted in a working directory named $WORKDIR/wrfprd.
- Create and change into the directory for running ideal.exe and wrf.exe.
- cd $WORKDIR/wrfprd
- Link WRF input files.
- ln -fs ${WORKDIR}/WRFV3/run/ETAMPNEW_DATA ./
ln -fs ${WORKDIR}/WRFV3/run/ETAMPNEW_DATA.expanded_rain ./
ln -fs ${WORKDIR}/WRFV3/run/GENPARM.TBL ./
ln -fs ${WORKDIR}/WRFV3/run/LANDUSE.TBL ./
ln -fs ${WORKDIR}/WRFV3/run/SOILPARM.TBL ./
ln -fs ${WORKDIR}/WRFV3/run/VEGPARM.TBL ./
ln -fs ${WORKDIR}/WRFV3/run/tr49t67 ./
ln -fs ${WORKDIR}/WRFV3/run/tr49t85 ./
ln -fs ${WORKDIR}/WRFV3/run/tr67t85 ./
ln -fs ${WORKDIR}/WRFV3/run/ozone.formatted ./
ln -fs ${WORKDIR}/WRFV3/run/ozone_lat.formatted ./
ln -fs ${WORKDIR}/WRFV3/run/ozone_plev.formatted ./
ln -fs ${WORKDIR}/WRFV3/run/RRTM_DATA ./
ln -fs ${WORKDIR}/WRFV3/run/RRTMG_LW_DATA ./
ln -fs ${WORKDIR}/WRFV3/run/RRTMG_SW_DATA ./
- Link the WPS files.
- ln -fs $WORKDIR/wpsprd/met_nmm* ./
ln -fs $WORKDIR/wpsprd/geo_nmm* ./
- Copy the idealized simulation input files.
- cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/input.d ./
cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/sigma.d ./
cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/sound.d ./
cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/storm.center ./
cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/land.nml ./
- Copy namelist input.
- cp ${WORKDIR}/WRFV3/test/nmm_tropical_cyclone/namelist.input ./
- Copy and edit the qsub template to the wrfprd directory.
- cp /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/qsub_Cheyenne_wrapper.csh run.ideal.csh
- Edit run.ideal.csh
#PBS -N ideal
#PBS -A NJNT0006
#PBS -l walltime=01:20:00
#PBS -q premium
#PBS -l select=4:ncpus=24:mpiprocs=24
mpirun -np 96 ${WORKDIR}/WRFV3/main/ideal.exe
- Submit the job and wait for it to complete.
- qsub run.ideal.csh
- Verify that the output files were created.
- ls -l wrfinput_d01 wrfbdy_d01 fort.65
In the interest of time and resources, the idealized HWRF simulation will be run for only 12 h.
Run WRF
- Edit namelist.input to run the forecast for 12 h.
Change
end_day = 27, 27, 27,
end_hour = 00, 00, 00,
accordingly.
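The new end time follows from adding the 12-h run length to the 2012-10-26 12 UTC start, giving day 27 and hour 00. A quick arithmetic check (a sketch; GNU date is assumed):

```shell
#!/bin/sh
# Sketch: confirm that 12 h after 2012-10-26 12 UTC is day 27, hour 00,
# matching the end_day and end_hour values. Assumes GNU date.
end=$(date -u -d "2012-10-26 12:00 UTC + 12 hours" +'%d %H')
echo "end_day end_hour: $end"
```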
- Replace the history-output settings in namelist.input with the following:
- history_interval = 180, 180, 180,
auxhist1_interval = 60, 60, 60,
auxhist2_interval = 60, 60, 60,
auxhist3_interval = 180, 180, 180,
history_end = 720, 720, 720,
auxhist2_end = 720, 720, 720,
auxhist1_outname = "wrfdiag_d",
auxhist2_outname = "wrfout_d_",
auxhist3_outname = "wrfout_d_",
frames_per_outfile = 1, 1, 1,
frames_per_auxhist1 = 999, 999, 999,
frames_per_auxhist2 = 1, 1, 1,
frames_per_auxhist3 = 1, 1, 1,
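With frames_per_outfile = 1, each history time goes to its own file, so the output count can be predicted from the run length and interval. For a 12-h (720-min) run with history_interval = 180, simple arithmetic gives the number of history times per domain (a sketch):

```shell
#!/bin/sh
# Sketch: history output times for a 720-min run at a 180-min interval,
# counting the initial time.
run_min=720
interval=180
n_times=$(( run_min / interval + 1 ))
echo "history times per domain: $n_times"
```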
- Copy and edit the qsub template to the wrfprd directory.
- cp /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/qsub_Cheyenne_wrapper.csh run.wrf_ideal.csh
- Edit run.wrf_ideal.csh
#PBS -N wrf_ideal
#PBS -A NJNT0006
#PBS -l walltime=00:20:00
#PBS -q premium
#PBS -l select=4:ncpus=24:mpiprocs=24
mpirun -np 96 ${WORKDIR}/WRFV3/main/wrf.exe
- Finally, submit the job.
- qsub run.wrf_ideal.csh
Note that executable wrf.exe must have been created using the instructions for idealized simulations described in Chapter 2 of HWRF Users Guide. The executable created for regular HWRF simulations that ingest real data should not be used to conduct idealized simulations.
- Verify that the output files were created.
- ls -l wrfout_d01* wrfout_d02* wrfout_d03*
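With a 180-min interval and one frame per file, five wrfout files per domain are expected, valid every 3 h from the 2012-10-26 12 UTC start. The expected d01 names can be generated as a check (a sketch; GNU date is assumed):

```shell
#!/bin/sh
# Sketch: list the wrfout_d01 file names expected for a 12-h run with
# 180-min output interval starting 2012-10-26 12 UTC. Assumes GNU date.
start="2012-10-26 12:00 UTC"
names=""
for m in 0 180 360 540 720; do
  names="$names $(date -u -d "$start + $m minutes" +'wrfout_d01_%Y-%m-%d_%H_%M_%S')"
done
echo "$names"
```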
Post processing and plotting of outputs
- Create a working directory
- mkdir -p ${WORKDIR}/postprd
cd ${WORKDIR}/postprd
- Copy the run_unipostandgrads script to the working directory. This exercise assumes that the user has already copied the pre-compiled code for the real case.
- cp /glade/p/ral/jnt/HWRF/datasets_v3.9a/scripts/run_unipostandgrads ./
cp ${SCRATCH}/hwrfrun/sorc/UPP/scripts/cbar.gs ./
ln -fs ${SCRATCH}/hwrfrun/parm/post/hwrf_cntrl.nonsat wrf_cntrl.parm
- Edit the run_unipostandgrads script by making the following changes. [#] indicates the line number to modify.
- Set environment variables and directories. Most of these are already set for the tutorial.
[76] TOP_DIR=/glade/scratch/$USER/HWRF_v3.9a/hwrfrun/sorc/ --> Top of HWRF directory
[77] DOMAINPATH=/glade/scratch/$USER/Idealized --> Top of Idealized directory
[78] WRFPATH=/glade/scratch/$USER/HWRF_v3.9a/hwrfrun/sorc/WRFV3/ --> WRFV3 directory
[79] UNIPOST_HOME=${TOP_DIR}/UPP --> UPP directory
- Define simulation timing.
[94] export startdate=2012102612 --> start date of simulation
[95] export fhr=00 --> start hour of simulation
[96] export lastfhr=12 --> end hour of simulation
[97] export incrementhr=03 --> increment of simulation
- Edit the line to select the dynamic core.
[87] export dyncore="NMM" --> WRF dynamic core is NMM
- Edit the line to select the domain to process.
[100] export domain_list="d03" --> process d03 for this tutorial
- Select the lat-lon option for copygb option.
[114] export copygb_opt="lat-lon"
- Link appropriate files.
[242] ln -fs ${SCRATCH}/hwrfrun/parm/post/hwrf_cntrl.nonsat wrf_cntrl.parm --> Link to hwrf parm file
[263] ln -fs ${WRFPATH}/run/ETAMPNEW_DATA.expanded_rain eta_micro_lookup.dat --> Link to mp table
- The wrfout files from HWRF use a different naming convention. Edit the line to correct it.
[341] wrfout_${domain}_${YY}-${MM}-${DD}_${HH}_00_00
- Edit the plotting option to display wind contours better.
[640] 'd ugrdprs;skip(vgrdprs,10)'
- Finally, save and close the run_unipostandgrads script.
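Instead of a manual editor, the variable edits above can be scripted with sed, matching on line content rather than the bracketed line numbers. A sketch (the sample lines below are illustrative stand-ins for the script's defaults, not its actual contents):

```shell
#!/bin/sh
# Sketch: rewrite the dyncore, domain_list, and copygb_opt settings with sed.
# The here-string mimics lines of run_unipostandgrads; in practice you would
# run sed -i on the script itself.
sample='export dyncore="ARW"
export domain_list="d01"
export copygb_opt="awips"'
edited=$(printf '%s\n' "$sample" \
  | sed -e 's/dyncore=".*"/dyncore="NMM"/' \
        -e 's/domain_list=".*"/domain_list="d03"/' \
        -e 's/copygb_opt=".*"/copygb_opt="lat-lon"/')
printf '%s\n' "$edited"
```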
- Run the script run_unipostandgrads on the command line.
- ./run_unipostandgrads |& tee post.log
- This script creates figures in the current directory. To display those images use display filename.gif.
- Because of the moving nests, the accumulated precipitation plots may not be ideal.
Alternate Configuration
Create a user-specific conf file
In this exercise, you will create a configuration of HWRF for a Western Pacific storm, CHABA 21W, initialized on October 4, 2016, at 00 UTC. For this case, you will run a coupled atmosphere-ocean forecast with no data assimilation, using GRIB files as input for initial and boundary conditions.
NOTE: Sections 3.7 and 3.8 of the HWRF Users Guide will be very helpful when running this exercise!
Each of the above pieces of information is configurable through variables available in either the configuration files in the parm/ directory or in the global_vars.ksh file in the wrappers/ directory.
To write your own configuration file, follow the directions below.
- Identify pieces of information that will be configured in global_vars.ksh, and edit that file.
- Identify items related to workflow configuration.
- Open a new file for editing in parm/ named hwrf_tutorial.conf.
- Include sections, variables, and values for each of the items identified above.
Open a file for editing, named hwrf_tutorial.conf and include the following:
- [config]
run_gsi=no
run_ocean=yes
use_spectral=no
gfsinit_type=1
[prelaunch]
basin_overrides=no
Add this config file to your launcher_wrapper script:
- ${HOMEhwrf}/scripts/exhwrf_launch.py "$YMDH" "$STID" "$CASE_ROOT" "$HOMEhwrf/parm" \
"config.EXPT=${EXPT}" "config.startfile=${startfile}" \
"config.HOMEhwrf=$HOMEhwrf" "config.case_root=$CASE_ROOT" \
"$HOMEhwrf/parm/hwrf_v3.9a_release.conf" \
"$HOMEhwrf/parm/hwrf_tutorial.conf" \
"$@"
Compiling WRFV3, WPS, UPP, and GSI
Compilation: Building the WRF, WPS, GSI, and UPP components
Creating a working directory and extracting the sources
This exercise is intended to give you practice building the WRF-NMM, WPS, UPP, and GSI components.
The source code is available in the tar files used previously, and is located on Cheyenne in the /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/ directory. The tar files you will need are:
HWRF_v3.9a_hwrfrun.tar.gz
HWRF_v3.9a_WRFV3.tar.gz
HWRF_v3.9a_WRFV3_Idealized.tar.gz
HWRF_v3.9a_WPSV3.tar.gz
HWRF_v3.9a_GSI.tar.gz
HWRF_v3.9a_UPP.tar.gz
Create and move into a working directory.
mkdir -p ${SCRATCH}/HWRF_building && cd ${SCRATCH}/HWRF_building
Untar the code under your working directory.
tar -zxf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_hwrfrun.tar.gz
cd ${SCRATCH}/HWRF_building/hwrfrun/sorc
tar -zxf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WRFV3.tar.gz
tar -zxf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WRFV3_Idealized.tar.gz
tar -zxf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WPSV3.tar.gz
tar -zxf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_GSI.tar.gz
tar -zxf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_UPP.tar.gz
Compile WRFV3
Compilation: Compile WRFV3
Compile WRFV3 for real cases
The following exercise is intended to give you practice building the WRF-NMM. Compilation instructions for both the real and idealized cases are provided below.
Set Environment Variables and Load Modules
Before building the WRF code, you must have a compiler and NetCDF modules loaded. Your environment has already been configured by editing your ~/.cshrc file. To check the loaded modules, type:
module list
Compilation of WRF is necessary prior to the compilation of WPS, UPP, and GSI.
Before getting started, create a directory and expand the code tarballs in it.
mkdir -p /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/
cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc
tar -xzvf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WRFV3.tar.gz
tar -xzvf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_WPSV3.tar.gz
tar -xzvf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_UPP.tar.gz
tar -xzvf /glade/p/ral/jnt/HWRF/HWRF_v3.9a_tut_codes/HWRF_v3.9a_GSI.tar.gz
To build WRF, start by moving into the WRF source code directory,
cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/WRFV3/
Since this component comes pre-compiled, start by conducting a full clean of the directory,
./clean -a
Before configuring, set the environment variables:
setenv HWRF 1
setenv WRF_NMM_CORE 1
setenv WRF_NMM_NEST 1
setenv JASPERLIB
setenv PNETCDF_QUILT 1
setenv WRFIO_NCD_LARGE_FILE_SUPPORT 1
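The setenv lines above are csh/tcsh syntax. If you work in a bash or sh shell instead, the equivalents use export (a sketch; JASPERLIB's value is site-specific and left unset in the original, so it is omitted here as well):

```shell
#!/bin/sh
# bash/sh equivalents of the csh setenv lines above.
export HWRF=1
export WRF_NMM_CORE=1
export WRF_NMM_NEST=1
export PNETCDF_QUILT=1
export WRFIO_NCD_LARGE_FILE_SUPPORT=1
echo "HWRF=$HWRF NMM_CORE=$WRF_NMM_CORE"
```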
Configure WRF-NMM for HWRF
To configure WRF-NMM, go to the top of the WRF directory
cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/WRFV3
and type:
./configure
You will be presented with a list of build choices for your computer. For Cheyenne, choose option 15 for Intel with distributed memory (dmpar).
Compile WRF-NMM for HWRF
To build the WRF-NMM component, enter the command
./compile nmm_real |& tee compile_nmm.log
It takes about 30 minutes for the WRF-NMM compilation to complete.
Note: If the compilation is successful, the two executables, real_nmm.exe and wrf.exe, will be created in the WRFV3/main/ directory. If the compilation is not successful, check your environment settings and examine the compile_nmm.log file created.
To start over, clean the directories and return to the ./configure step described above. A complete clean is strongly recommended if the compilation failed, if the Registry has been changed, or if the configuration file is changed. For your reference, a full description of the WRF-NMM build process is available in Chapter 2 of the HWRF Users Guide.
To conduct a partial clean that retains the configuration file, and just removes the object files (except those in external/), type:
./clean
To conduct a complete clean which removes all built files in all directories, as well as the configure.wrf, type:
./clean -a
A successful compilation produces two executables in the directory main/:
real_nmm.exe
wrf.exe
Compile Idealized Tropical Cyclone WRF-NMM
The HWRF idealized tropical cyclone WRF-NMM component requires different executables than for the real case. The following section will describe how to build the executables for the idealized case.
Start by copying the WRF code into a separate directory for the WRF-Idealized compilation.
cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc
cp -r WRFV3 WRFV3-Idealized
cd WRFV3-Idealized
Building the idealized component requires a slightly different configuration than the standard WRF build. Therefore, the idealized build is done in a separate directory. Because this component is pre-compiled, a complete clean of the previous build must be performed:
./clean -a
This removes ALL build files, including the executables, libraries, and the configure.wrf.
Set Environment Variables
Correct configuration of WRF-NMM for the HWRF idealized tropical cyclone simulation requires setting the additional environment variable IDEAL_NMM_TC. Be sure you have also loaded the compiler modules described previously for the WRF compile. Several other variables must also be set:
setenv WRF_NMM_CORE 1
setenv WRF_NMM_NEST 1
setenv HWRF 1
setenv IDEAL_NMM_TC 1
setenv WRFIO_NCD_LARGE_FILE_SUPPORT 1
Configure WRF-NMM for Idealized Tropical Cyclone
At the top of the idealized WRF directory (/glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/WRFV3-Idealized), issue the following command:
./configure
You will be presented with a list of build choices for your computer. For Cheyenne, choose option 15.
Compiling WRF-NMM for Idealized Tropical Cyclone
Once the configure step is complete, the code is compiled by passing the target nmm_tropical_cyclone to the compile command:
./compile nmm_tropical_cyclone
A successful compilation produces two executables in the directory main/:
ideal.exe WRF initialization
wrf.exe WRF model integration
Note: The only compilation requirements for the idealized capability are WPS and WRF. If desired, UPP may also be used. The components MPIPOM-TC and coupler, GSI, GFDL vortex tracker, and hwrf-utilities are not used in HWRF idealized tropical cyclone simulations.
Compile WPS
Configure WPS
Change to the WPS directory (cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/WPSV3), clean to remove the pre-compiled build, and issue the configure command:
./clean -a
./configure
For Cheyenne, choose option 19 for dmpar with GRIB 2 support.
Note: To compile WPS with GRIB2 support, the JasPer, zlib and PNG libraries are needed.
Compile WPS
Issue the compile command:
./compile |& tee compile_wps.log
Note: If the compilation does not succeed in producing the executables, it will be necessary to conduct a clean on the WPSV3 directory. A complete clean is strongly recommended if the compilation failed or if the configuration file is changed.
To conduct a partial clean that retains the configuration file, and just removes the object files, type:
./clean
To conduct a complete clean which removes all built files in all directories, as well as the configure.wps, type:
./clean -a
After issuing the compile command, a listing of the current working directory should reveal symbolic links to executables for each of the three WPS programs:
geogrid.exe
ungrib.exe
metgrid.exe
Several symbolic links should also be in the util/ directory:
avg_tsfc.exe
calc_ecmwf_p.exe
g1print.exe
g2print.exe
height_ukmo.exe
int2nc.exe
mod_levs.exe (used for idealized runs)
rd_intermediate.exe
If any of these links do not exist, check the compilation output in the compile_wps.log file to see what went wrong.
Note: This compilation only takes a few minutes.
Compile GSI
Set Environment Variables
To set up the environment for GSI, there are a couple of environment variables not set in ~/.cshrc. These can be set by typing:
setenv HWRF 1
setenv WRF_DIR /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/WRFV3/
setenv LAPACK_PATH ${MKL}
Configure GSI
To build GSI for HWRF, change into the GSI/dtc directory (cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/GSI/dtc), clean to remove the pre-compiled build, and issue the configure command.
./clean -a
./configure
Choose option 6 for Intel compiler with dmpar.
Compile GSI
After selecting the proper option, run the compile script:
./compile |& tee compile_gsi.log
Following the compile command, the GSI executable gsi.exe can be found in the dtc/run/ directory.
If the compilation does not succeed in producing the executable (GSI/dtc/run/gsi.exe), check your environment settings and examine the compile_gsi.log file created. It will be necessary to conduct a clean on the GSI/dtc directory. A complete clean (./clean -a) is strongly recommended if the compilation failed or if the configuration file is changed.
For your reference, a full description of the build process is available in Chapter 2 of the HWRF Users Guide.
Note: This compilation can take 15 minutes or more.
Compile UPP
Set Environment Variables
Before compiling UPP, check that the HWRF environment variable is set:
echo $HWRF
If it is not set to 1, set it by typing the following: setenv HWRF 1
Next, set WRF_DIR to its absolute path:
setenv WRF_DIR /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/WRFV3/
Configure UPP
To build UPP for HWRF, change into the UPP directory (cd /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/UPP), clean to remove the pre-compiled build, and issue the configure command.
./clean -a
To configure the UPP for compilation, type:
./configure
to generate the UPP configure file.
You will then be given a list of configuration choices tailored to your computer. Choose option 4.
Compile UPP
To compile UPP, enter the command:
./compile |& tee compile_upp.log
This command should create 13 UPP libraries in lib/:
- libbacio.a
- libCRTM.a
- libg2.a
- libg2tmpl.a
- libgfsio.a
- libip.a
- libnemsio.a
- libsfcio.a
- libsigio.a
- libsp.a
- libw3emc.a
- libw3nco.a
- libxmlparse.a
and four UPP executables in bin/:
- unipost.exe
- ndate.exe
- copygb.exe
- cnvgrib.exe
To remove all built files, as well as the configure.upp, type:
./clean -a
When the compilation is complete, a listing of the bin directory should reveal executables for each of the four UPP programs. If any of these executables do not exist, check the compilation output in the compile_upp.log file to see what went wrong.
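A quick way to verify is to count the expected programs. A small sketch (count_built is a hypothetical helper, not part of UPP; the bin/ path is the one used in this tutorial):

```shell
#!/bin/sh
# Sketch: report how many of the four expected UPP executables exist in a
# given bin/ directory. count_built is a hypothetical helper.
count_built() {
  dir="$1"
  n=0
  for exe in unipost.exe ndate.exe copygb.exe cnvgrib.exe; do
    if [ -x "$dir/$exe" ]; then
      n=$((n + 1))
    fi
  done
  echo "$n"
}
# Example usage:
# count_built /glade/scratch/${USER}/HWRF_building/hwrfrun/sorc/UPP/bin
```

A result of 4 means all expected executables were built.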