Setting up HWRF experiment (v4.0a) for IRMA test case

Submitted by singh.punit1990 on Wed, 09/29/2021 - 11:36
Forum: Users | General

Hi,
I am trying to get hands-on with HWRF to benchmark a few new systems.
I was able to compile the HWRF components using the instructions in Chapter 2 of https://dtcenter.org/sites/default/files/community-code/hwrf/docs/users_guide/HWRF-UG-2018.pdf. I am currently stuck at the experiment setup.

Here's a brief rundown of my setup:

/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/runs contains:
abdecks  
enkf.2017090506  
fix  
gdas1.2017090506  
gfs.2017090512  
loop  
recon.2017090518  
SYNDAT-PLUS  
TEMPDROP

/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/HWRF_v4.0a_tut_codes/hwrfrun/ contains:
doc  
exec  
modulefiles  
nwport  
parm  
scripts  
sorc  
ush  
wrappers

/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/HWRF_v4.0a_tut_codes/hwrfrun/sorc/ contains:
gfdl-vortextracker  GSI  hwrf-utilities  ncep-coupler  pomtc  UPP  WPS  WRF

WRF was compiled following the instructions in Sections 2.7.1 and 2.7.2 with the dmpar setting. Do I need to recompile WRF for the idealized case (Section 2.7.3) in order to use the executables with the IRMA test case (https://dtcenter.org/community-code/hurricane-wrf-hwrf/datasets)?

and /home/user1/HWRF_AMD_icc2019u5/v4.0A/DEPS/v1/ houses dependencies such as hdf, netcdf, and png.

----------------
I have a few queries about the EXPERIMENT CONFIGURATION page:
Q1: https://dtcenter.org/hwrf-online-tutorial-v3-9a/experiment-configuration mentions four conf files:
hwrf.conf, hwrf_basic.conf, hwrf_input.conf, system.conf

Do I need to modify any of those four files to run the IRMA test case?

Q2: The tutorial sets three environment variables:
HOMEhwrf = /glade/scratch/{ENV[USER]}/HWRF_v3.9a/hwrfrun
WORKhwrf = /glade/scratch/{ENV[USER]}/pytmp/hwrfrun/2016100400/14L
COMIN = /glade/scratch/{ENV[USER]}/pytmp/hwrfrun/com/2016100400/14L

I am able to set the HOMEhwrf variable as:
HOMEhwrf = /home/user1/HWRF_AMD_icc2019u5/v4.0A/src/HWRF_v4.0a_tut_codes/hwrfrun/
But I am not sure which paths the other two variables should point to:
WORKhwrf = 
COMIN = 
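
In the meantime, here is my best guess at those two paths, mirroring the tutorial's layout under the scrub directory I plan to use (CDSCRUB={outputroot}/pytmp). The cycle 2017090512 is taken from my gfs.2017090512 input directory, and the storm ID 11L for Irma is my own assumption — please correct me if either is wrong:

```shell
# Sketch only: WORKhwrf/COMIN following the tutorial's directory pattern.
# Assumptions (mine, not from the docs): cycle 2017090512, storm ID 11L.
outputroot=/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/runs/output1

# Tutorial pattern: WORKhwrf under the scrub area, COMIN under its com/ subdir.
export WORKhwrf=${outputroot}/pytmp/hwrfrun/2017090512/11L
export COMIN=${outputroot}/pytmp/hwrfrun/com/2017090512/11L
```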

Q3: My setup has 20 nodes, each with 128 cores, and all of them are in the "regular" Slurm queue.
I plan to create /home/user1/HWRF_AMD_icc2019u5/v4.0A/src/HWRF_v4.0a_tut_codes/hwrfrun/parm/system.conf based on /home/user1/HWRF_AMD_icc2019u5/v4.0A/src/HWRF_v4.0a_tut_codes/hwrfrun/parm/system.conf.cheyenne, with only the following edits:

[config]
fcst_catalog=comm_hist
archive=none
publicrelease=yes
run_ensemble_da=no
scrub=no
forecast_length=12
[dir]
inputroot=/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/runs
syndat=/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/runs/SYNDAT-PLUS
outputroot=/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/runs/output1
CDNOSCRUB={outputroot}/noscrub
CDSCRUB={outputroot}/pytmp
CDSAVE=/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/HWRF_v4.0a_tut_codes

[comm_hist]
inputroot=/home/user1/HWRF_AMD_icc2019u5/v4.0A/src/runs

Does this configuration look fine, or are there variables that are unnecessary and can be omitted?

Please advise.