UFS Short-Range Weather (SRW) Practical Session Guide | Session 1 > 5. Generate the Plots

The plotting scripts are located in the directory $SR_WX_APP_TOP_DIR/regional_workflow/ush/Python.

Change to the script directory:

cd $SR_WX_APP_TOP_DIR/regional_workflow/ush/Python
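
To confirm you are in the right place, a quick listing should show the scripts used in this section (the full set of files may vary by release):

ls
# plot_allvars.py and qsub_job.sh (used later in this section) should appear in the listing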

Load the appropriate python environment for Cheyenne:

module purge
module load ncarenv
ncar_pylib /glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/python_graphics

If you see something like:

Warning: library /glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/regional_workflow is missing from NPL clone registry.
Do you wish to add it to the registry (y/N)?

This prompt appears if this is the first time you have used the NCAR Python Library (NPL). Type y to add it to the registry; you should then see the following output:

Searching library for configuration info ...
Now using NPL virtual environment at path:
/glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/regional_workflow
Use deactivate to remove NPL from environment
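
As an optional sanity check before running the script, you can verify that the NPL environment is active; the exact path printed depends on the clone location, so the comment below is only illustrative:

which python
# the interpreter should come from the python_graphics NPL clone, not the system python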

Run the python plotting script (Estimated time: 5 minutes):

python plot_allvars.py 2019061500 6 12 6 /glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 /glade/p/ral/jntp/UFS_SRW_app/tools/NaturalEarth

Here plot_allvars.py is the plotting script, and the six command line arguments are as follows (a variant of the command with a different forecast-hour range is shown after the list):

  1. 2019061500 is the cycle date/time in YYYYMMDDHH format
  2. 6 is the starting forecast hour
  3. 12 is the ending forecast hour
  4. 6 is the forecast hour increment
  5. /glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 is the top level of the experiment directory
  6. /glade/p/ral/jntp/UFS_SRW_app/tools/NaturalEarth is the base directory of the Cartopy shapefiles
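
As an illustration, if your experiment had been run and post-processed out to 48 forecast hours (as in the batch example later in this section), the same script could plot every 3 hours; this is a hypothetical variation of the command above, not a required step:

python plot_allvars.py 2019061500 6 48 3 /glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 /glade/p/ral/jntp/UFS_SRW_app/tools/NaturalEarth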

Display the output figures

The output files (in .png format) can be found in the directory /glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2/2019061500/postprd.

The png files can be listed with the ls command after changing to that directory:

cd /glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2/2019061500/postprd
ls *.png
10mwind_conus_f006.png  2mdew_conus_f012.png  qpf_conus_f006.png     sfcape_conus_f012.png
10mwind_conus_f012.png  2mt_conus_f006.png    qpf_conus_f012.png     slp_conus_f006.png
250wind_conus_f006.png  2mt_conus_f012.png    refc_conus_f006.png    slp_conus_f012.png
250wind_conus_f012.png  500_conus_f006.png    refc_conus_f012.png    uh25_conus_f006.png
2mdew_conus_f006.png    500_conus_f012.png    sfcape_conus_f006.png  uh25_conus_f012.png
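
The file names follow the pattern <field>_<domain>_f<HHH>.png, so a single forecast hour or field can be listed on its own, for example:

ls *_f012.png        # all fields at forecast hour 12
ls 2mt_conus_*.png   # 2 m temperature at each plotted forecast hour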

A png file can be displayed with the display command:

display 10mwind_conus_f006.png

Here is an example plot of 10m surface wind at F006:

[Figure: 10mwind_conus_f006.png]
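
If display is slow or an X connection to Cheyenne is not available, an alternative is to copy the figures to your local machine and view them there; the command below is a sketch to run from your local workstation, with <username> and the login host filled in as assumptions:

scp "<username>@cheyenne.ucar.edu:/glade/scratch/<username>/expt_dirs/test_CONUS_25km_GFSv15p2/2019061500/postprd/*.png" .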

Running with a batch script

(Estimated time: 25 min wall time for 48 forecast hours, every 3 hours)

Batch scripts are provided in the same directory as the python scripts. To run the plotting script with the batch script qsub_job.sh, several environment variables need to be set before submitting the job, in addition to loading the python environment:

For bash:

export HOMErrfs=$SR_WX_APP_TOP_DIR/regional_workflow
export EXPTDIR=/glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2

For csh/tcsh:

setenv HOMErrfs $SR_WX_APP_TOP_DIR/regional_workflow
setenv EXPTDIR /glade/scratch/$USER/expt_dirs/test_CONUS_25km_GFSv15p2

HOMErrfs gives the location of the regional workflow, and EXPTDIR points to the experiment directory (test_CONUS_25km_GFSv15p2). In addition, the account used to run the batch job needs to be set in qsub_job.sh. You may also want to change FCST_INC in the batch script for a different plotting frequency, and increase the walltime to 25 minutes (for 48 forecast hours, plotted every 3 hours).

You will need to change an_account in qsub_job.sh to a chargeable account: NJNT0008.
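
To make these edits concrete, the relevant pieces of qsub_job.sh would end up looking roughly like the sketch below; treat it as illustrative, since the exact PBS directives and variable layout in your copy of the script may differ:

#PBS -A NJNT0008                # account charged for the batch job (replaces an_account)
#PBS -l walltime=00:25:00       # enough for 48 forecast hours plotted every 3 hours

FCST_INC=3                      # forecast hour increment used for plotting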

Then the batch job can be submitted with the following commands:

cd $HOMErrfs/ush/Python
qsub qsub_job.sh

The running log will be in the file $HOMErrfs/ush/Python/plot_allvars.out when the job completes.
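
While the job is in the queue or running, its status can be checked with standard PBS commands, for example:

qstat -u $USER    # shows whether the plotting job is still queued or running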

The output files (in .png format) will be located in the same directory as for the command-line run: $EXPTDIR/2019061500/postprd.
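
A quick check that the batch run produced figures is to count the png files in the output directory (the expected number depends on the forecast hours and fields plotted):

ls $EXPTDIR/2019061500/postprd/*.png | wc -l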