How to force HWRF to use RTOFS input data staged on disk

Submitted by biswas on Thu, 08/27/2020 - 14:48

Issue:

I am having a problem retrieving the RTOFS inputs when hwrf_input.py extracts files directly from HPSS.

The error message was 'rtofs_glo.t00z.n00.archv.a does not exist'. The reason is that on HPSS, the filename to extract is supposed to be rtofs_glo.t00z.n00.archv.a.tgz. (I found this by directly listing the files in the tarball on HPSS.)

The hwrf_input.py configuration for the H220 checkout shows this:

[jet_sources_PROD2019]
rtofs_disk%location = file:///lfs3/HFIP/hwrf-data/hwrf-input
rtofs_disk%histprio=60
rtofs_disk%fcstprio=70
rtofs_runhistory_backup%location=htar:/// ;; RTOFS operational run history
rtofs_runhistory_backup%histprio=80   ;; RTOFS operational run history location
rtofs_runhistory%location=htar:/// ;; RTOFS operational run history
rtofs_runhistory%histprio=70   ;; RTOFS Operational run history location

I am not sure why it did not go to the local disk to retrieve the file.

Solution:

The best approach in this situation is to force the scripts to use the existing data already staged on the Jet disk. To do that, set the rtofsroot variable in the [hwrfdata] section of parm/hwrf_input.conf to:

<data_on_disk> [E.g. /mnt/lfs3/HFIP/hwrf-data/hwrf-input/rtofsforpom]
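
For reference, a sketch of what the resulting entry might look like, using the example path above; the exact path is site-specific, and the rtofsforpom subdirectory is only the example location quoted in this post:

[hwrfdata]
rtofsroot=/mnt/lfs3/HFIP/hwrf-data/hwrf-input/rtofsforpom

With rtofsroot pointing at the staged directory, the scripts read the RTOFS input from local disk rather than attempting to extract the archive from HPSS.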