Session 1: METplus Setup/Grid-to-Grid

METplus Practical Session 1

During the first METplus practical session, you will run the tools indicated below:

During this practical session, please work on the Session 1 exercises. Proceed through the tutorial exercises by following the navigation links at the bottom of each page.

Tutorial Format

Throughout this tutorial, code blocks in BOLD white text with a black background should be copied from your browser and pasted on the command line, e.g.:

echo "Let's Get Started"
Text in YELLOW boxes contains important information, expert hints or helpful links. Please read carefully.
Text in BLUE boxes contains instructions for the user to perform some action or edit (add or modify) a specific file on your system.
Text in GRAY boxes contains sample output from a command or the contents of a file.

Tutorial Tips

Please read the instructions carefully! In some cases there are two sets of instructions, and only one set of copyable commands should be executed (e.g. bash vs. csh). Ignoring this information and simply copying/pasting the command line instructions may result in unintended consequences.

Note: Instructions in this tutorial use vi to open and edit files. If you prefer to use a different file editor, feel free to substitute it whenever you see vi.

Note: Instructions in this tutorial use okular to view pdf, ps, and png files. If you prefer to use a different file viewer, feel free to substitute it whenever you see okular.

Note: If you are running the tutorial inside Docker, you will not have access to the visualization tools described in this tutorial (such as okular, ncview, etc.) inside the Docker container. To run these commands, you will have to mount the output directory inside Docker to your local computer file system and run these tools from there.

Click the 'METplus Setup' link below to get started!
If you discover any typos, errors in the run commands, incorrect output listings, or any other issues while completing the tutorial, you are encouraged to submit your findings to the METplus team via GitHub Discussions. Be sure to note the session and specific page on which you encountered the issue.

METplus Setup

METplus Overview

METplus is a set of Python modules that have been developed with the flexibility to run the MET applications for various use cases or scenarios. The goal is to simplify the running of MET for scientists. Currently, the primary means of achieving this is through the use of METplus configuration files, aka "conf files." It is designed to provide a framework in which additional use cases can be added. The conf file implementation utilizes a Python package called produtil that was developed by NOAA/NCEP/EMC for the HWRF system.

Please be sure to follow the instructions in order.

METplus Useful Links

The following links are just for reference, and not required for this practical session. METplus releases are available on GitHub along with sample data and instructions.

METplus User's Guide
METplus Releases on GitHub

The source code for the METplus components is publicly available in GitHub repositories.

New features are developed and bugs are tracked using GitHub issues in each repository.


METplus: Initial setup

Prerequisites: Software

The Requirements section in the Software Installation chapter of the METplus User's Guide lists the software and Python packages that are required to run the METplus wrappers. Note that there is a core set of requirements needed to run the METplus wrappers and additional requirements needed to utilize some of the more advanced features.


Prerequisites: Environment

This tutorial assumes MET 11.0.1 and METplus 5.0.1 have been installed on the machine used to run the exercises and that the sample input data is also available. If this is not the case, please navigate to the Software Installation chapter of the METplus User's Guide.

The following instructions are required so the commands in this tutorial can be copied and run without modification. The steps involve creating a working directory to store files used/generated by the tutorial and configuring a simple script that can be run to set up the shell environment. Once configured correctly, the script can be run upon returning to the tutorial content to easily resume progress.

Click on the link that corresponds to the environment you are setting up.
If you are running the tutorial instructions on your own computer, select the bash or csh instructions depending on which shell you prefer. We recommend using bash if you do not have a preference.

Pre-Configured Environments

Setting up the Tutorial Environment on Hera (NOAA)

Setting up the Tutorial Environment on Jet (NOAA)

Setting up the Tutorial Environment on Cheyenne (NCAR)

Setting up the Tutorial Environment on Seneca (NCAR)

User Configured Environments

Setting up the Tutorial Environment (bash)

Setting up the Tutorial Environment (csh)



Setting up the Tutorial Environment on Cheyenne (NCAR)

The following instructions should be run if configuring the shell environment to run the METplus Tutorial on Cheyenne (NCAR). If you are running on your own computer or a NOAA machine that has been set up to run the tutorial, please go back and click the appropriate link for those instructions.

Create a Working Directory

Create a directory called 'METplus-5.0.0_Tutorial' to hold all of the files you will create during the tutorial. This can be any directory in which you have write permission.

In the following instructions, change "/path/to" to the directory you chose: EDIT AFTER COPYING and BEFORE HITTING RETURN!
Change directory to the location where you want to put your tutorial files.
cd /path/to
Create a directory called METplus-5.0.0_Tutorial.
This directory will contain all of your tutorial work, including configuration files, output data, and any other notes you'd like to keep.
mkdir METplus-5.0.0_Tutorial
Change directory into the tutorial directory.
cd METplus-5.0.0_Tutorial

Create Directories for Configuration Files and Output Data

mkdir user_config
mkdir output

Obtain the Tutorial Setup Script

Copy the tutorial setup shell script to configure your runtime environment for use with METplus. You will also copy over a METplus .conf file that you will use when running the exercises:
cp /glade/p/ral/jntp/MET/METplus/Tutorial_Files/METplus-5.0.0_Tutorial_Files/tutorial-5.0.0.conf ./tutorial.conf
Run the command that corresponds to your default shell.
DO NOT RUN BOTH SETS OF COMMANDS.
If you are unsure which shell you are using, run the command "echo $SHELL" to check.
If you are using bash:
cp /glade/p/ral/jntp/MET/METplus/Tutorial_Files/METplus-5.0.0_Tutorial_Files/METplus-5.0.0_TutorialSetup.cheyenne.bash ./METplus-5.0.0_TutorialSetup.sh
If you are using csh:
cp /glade/p/ral/jntp/MET/METplus/Tutorial_Files/METplus-5.0.0_Tutorial_Files/METplus-5.0.0_TutorialSetup.cheyenne.csh ./METplus-5.0.0_TutorialSetup.sh
Next navigate to the Verify Environment is Set Correctly page.

Setting up the Tutorial Environment on Jet (NOAA)

The following instructions should be run if configuring the shell environment to run the METplus Tutorial on Jet (NOAA). If you are running on your own computer or an NCAR machine that has been set up to run the tutorial, please go back and click the appropriate link for those instructions.

Create a Working Directory

Create a directory called 'METplus-5.0.0_Tutorial' to hold all of the files you will create during the tutorial. This can be any directory in which you have write permission.

In the following instructions, change "/path/to" to the directory you chose: EDIT AFTER COPYING and BEFORE HITTING RETURN!
Change directory to the location where you want to put your tutorial files.
cd /path/to
Create a directory called METplus-5.0.0_Tutorial. This directory will contain all of your tutorial work, including configuration files, output data, and any other notes you'd like to keep.
mkdir METplus-5.0.0_Tutorial
Change directory into the tutorial directory.
cd METplus-5.0.0_Tutorial

Create Directories for Configuration Files and Output Data

mkdir user_config
mkdir output

Obtain the Tutorial Setup Script

Copy the tutorial setup shell script to configure your runtime environment for use with METplus. You will also copy over a METplus .conf file that you will use when running the exercises:
cp /lfs1/HFIP/dtc-hurr/METplus/METplus-5.0.0_Tutorial_Files/METplus-5.0.0_TutorialSetup.jet.`basename $SHELL` ./METplus-5.0.0_TutorialSetup.sh
cp /lfs1/HFIP/dtc-hurr/METplus/METplus-5.0.0_Tutorial_Files/tutorial-5.0.0.conf ./tutorial.conf


Next navigate to the Verify Environment is Set Correctly page.

Setting up the Tutorial Environment on Seneca (NCAR)

The following instructions should be run if configuring the shell environment to run the METplus Tutorial on Seneca (NCAR). If you are running on your own computer or a NOAA machine that has been set up to run the tutorial, please go back and click the appropriate link for those instructions.

Create a Working Directory

Create a directory called 'METplus-5.0.0_Tutorial' to hold all of the files you will create during the tutorial. This can be any directory in which you have write permission.

In the following instructions, change "/path/to" to the directory you chose: EDIT AFTER COPYING and BEFORE HITTING RETURN!
Change directory to the location where you want to put your tutorial files.
cd /path/to
Create a directory called METplus-5.0.0_Tutorial.
This directory will contain all of your tutorial work, including configuration files, output data, and any other notes you'd like to keep.
mkdir METplus-5.0.0_Tutorial
Change directory into the tutorial directory.
cd METplus-5.0.0_Tutorial

Create Directories for Configuration Files and Output Data

mkdir user_config
mkdir output

Obtain the Tutorial Setup Script

Copy the tutorial setup shell script to configure your runtime environment for use with METplus. You will also copy over a METplus .conf file that you will use when running the exercises:
cp /d1/projects/METplus/METplus-5.0.0_Tutorial_Files/tutorial-5.0.0.conf ./tutorial.conf
cp /d1/projects/METplus/METplus-5.0.0_Tutorial_Files/METplus-5.0.0_TutorialSetup.seneca.`basename $SHELL` ./METplus-5.0.0_TutorialSetup.sh

(OPTIONAL) Set up conda Environment for METplus Analysis and s2s Tutorial Sections

Because the METplus Analysis and s2s sections require additional external Python packages, the following instructions are also necessary. If you do not plan on executing the commands in those sections, you can skip these steps.

First, ensure that you are running the bash environment:

bash

Now, verify that you have a .condarc file in your home directory. If the file does not exist, create one. In that file, you will need the following entries, in exactly this format (dashes and colons):

channels:
- defaults
- conda-forge
- anaconda

Verify that your .bashrc file is using the most recent version of anaconda3, i.e. that the conda initialization block in your .bashrc file references /usr/local/anaconda3/bin/conda. You should see a code block similar to the following:

# >>> conda initialize >>>
# !! Contents within this block are managed by 'conda init' !!
__conda_setup="$('/usr/local/anaconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)"
if [ $? -eq 0 ]; then
    eval "$__conda_setup"
else
    if [ -f "/usr/local/anaconda3/etc/profile.d/conda.sh" ]; then
        . "/usr/local/anaconda3/etc/profile.d/conda.sh"
    else
        export PATH="/usr/local/anaconda3/bin:$PATH"
    fi
fi
unset __conda_setup
# <<< conda initialize <<<

If you do have to make any changes to your .bashrc file, it is best at this point to sign out and then sign back in to your Seneca connection from the terminal. Errors have been observed even after sourcing the updated .bashrc file.

From the command line, enter the following:

/usr/local/anaconda3/bin/conda create -n metplus_analysis_tutorial_env python=3.8 mamba -y

If this was successful, you will see the following at the end of the output streamed to the screen:

Downloading and Extracting Packages
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
# $ conda activate metplus_analysis_tutorial_env
#
# To deactivate an active environment, use
#
# $ conda deactivate

This creates a basic Python 3.8 conda environment named ‘metplus_analysis_tutorial_env’ that contains mamba (to accelerate installing the remaining Python packages).

Activate this environment by running the following from the command line:

conda activate metplus_analysis_tutorial_env

Install the following packages to the conda environment by running the following from the command line (using mamba to speed up the installation):

mamba install imageio=2.19.3 imutils=0.5.4 matplotlib=3.5.2 metpy=1.3.1 netcdf4=1.6.2 numpy=1.24.2 pandas=1.5.1 pint=0.19.2 plotly=5.9.0 pytest=7.2.1 python-kaleido=0.2.1 pyyaml=6.0 scikit-image=0.19.3 scipy=1.8.1 lxml=4.9.1 eofs=1.4.0 scikit-learn=1.2.0 -y

You will see many lines of information stream to your screen. If everything was successful, you will see the following output:

Preparing transaction: done
Verifying transaction: done
Executing transaction: /
Installed package of scikit-learn can be accelerated using scikit-learn-intelex.
More details are available here: https://intel.github.io/scikit-learn-intelex

For example:
$ conda install scikit-learn-intelex
$ python -m sklearnex my_application.py
done

Now you are ready to run the METplus Analysis tutorials.

Next navigate to the Verify Environment is Set Correctly page.

Setting up the Tutorial Environment on Hera (NOAA)

The following instructions should be run if configuring the shell environment to run the METplus Tutorial on Hera (NOAA). If you are running on your own computer or an NCAR machine that has been set up to run the tutorial, please go back and click the appropriate link for those instructions.

Create a Working Directory

Create a directory called 'METplus-5.0.0_Tutorial' to hold all of the files you will create during the tutorial. This can be any directory in which you have write permission.

In the following instructions, change "/path/to" to the directory you chose: EDIT AFTER COPYING and BEFORE HITTING RETURN!
Change directory to the location where you want to put your tutorial files.
cd /path/to
Create a directory called METplus-5.0.0_Tutorial. This directory will contain all of your tutorial work, including configuration files, output data, and any other notes you'd like to keep.
mkdir METplus-5.0.0_Tutorial
Change directory into the tutorial directory.
cd METplus-5.0.0_Tutorial

Create Directories for Configuration Files and Output Data

mkdir user_config
mkdir output

Obtain the Tutorial Setup Script

Copy the tutorial setup shell script to configure your runtime environment for use with METplus. You will also copy over a METplus .conf file that you will use when running the exercises:
cp /scratch1/BMC/dtc/METplus/METplus-5.0.0_Tutorial_Files/METplus-5.0.0_TutorialSetup.hera.`basename $SHELL` ./METplus-5.0.0_TutorialSetup.sh
cp /scratch1/BMC/dtc/METplus/METplus-5.0.0_Tutorial_Files/tutorial-5.0.0.conf ./tutorial.conf


Next navigate to the Verify Environment is Set Correctly page.

Setting up the Tutorial Environment (bash)

The following instructions should be run if configuring the bash environment to run the METplus Tutorial on your own computer. If you are running on an NCAR or NOAA machine that has been set up to run the tutorial, please go back and click the appropriate link for those instructions.

Create a Working Directory

Create a directory called 'METplus-5.0.0_Tutorial' to hold all of the files you will create during the tutorial. This can be any directory in which you have write permission.

In the following instructions, change "/path/to" to the directory you chose: EDIT AFTER COPYING and BEFORE HITTING RETURN!
Change directory to the location where you want to put your tutorial files.
cd /path/to
Create a directory called METplus-5.0.0_Tutorial.
This directory will contain all of your tutorial work, including configuration files, output data, and any other notes you'd like to keep.
mkdir METplus-5.0.0_Tutorial
Change directory into the tutorial directory.
cd METplus-5.0.0_Tutorial

Create Directories for Configuration Files and Output Data

mkdir user_config
mkdir output

Obtain the Tutorial Setup Script

Copy the tutorial setup shell script to configure your runtime environment for use with METplus. You will also copy over a METplus .conf file that you will use when running the exercises:
wget https://dtcenter.org/sites/default/files/community-code/metplus/tutorial-data/METplus-5.0.0_TutorialSetup.bash.sh.txt -O ./METplus-5.0.0_TutorialSetup.sh
wget https://dtcenter.org/sites/default/files/community-code/metplus/tutorial-data/tutorial-5.0.0.conf -O ./tutorial.conf

Configure the Tutorial Setup Script

Open the tutorial setup script with the editor of your choice and modify the values for METPLUS_BUILD_BASE, MET_BUILD_BASE, and METPLUS_DATA.
vi METplus-5.0.0_TutorialSetup.sh
METPLUS_BUILD_BASE is the full path to the METplus installation (/path/to/METplus-X.Y)
MET_BUILD_BASE is the full path to the MET installation (/path/to/met-X.Y)
METPLUS_DATA is the location of the sample test data directory
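
For example (the paths shown here are placeholders; substitute the actual installation locations on your system), the edited lines in the bash setup script might look similar to:
export METPLUS_BUILD_BASE=/path/to/METplus-5.0.1
export MET_BUILD_BASE=/path/to/met-11.0.1
export METPLUS_DATA=/path/to/METplus_Data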


Next navigate to the Verify Environment is Set Correctly page.

Setting up the Tutorial Environment (csh)

The following instructions should be run if configuring a csh environment to run the METplus Tutorial on your own computer. If you are running on an NCAR or NOAA machine that has been set up to run the tutorial, please go back and click the appropriate link for those instructions.

Create a Working Directory

Create a directory called 'METplus-5.0.0_Tutorial' to hold all of the files you will create during the tutorial. This can be any directory in which you have write permission.

In the following instructions, change "/path/to" to the directory you chose: EDIT AFTER COPYING and BEFORE HITTING RETURN!
Change directory to the location where you want to put your tutorial files.
cd /path/to
Create a directory called METplus-5.0.0_Tutorial.
This directory will contain all of your tutorial work, including configuration files, output data, and any other notes you'd like to keep.
mkdir METplus-5.0.0_Tutorial
Change directory into the tutorial directory.
cd METplus-5.0.0_Tutorial

Create Directories for Configuration Files and Output Data

mkdir user_config
mkdir output

Obtain the Tutorial Setup Script

Copy the tutorial setup shell script to configure your runtime environment for use with METplus. You will also copy over a METplus .conf file that you will use when running the exercises:
wget https://dtcenter.org/sites/default/files/community-code/metplus/tutorial-data/METplus-5.0.0_TutorialSetup.csh.sh.txt -O ./METplus-5.0.0_TutorialSetup.sh
wget https://dtcenter.org/sites/default/files/community-code/metplus/tutorial-data/tutorial-5.0.0.conf -O ./tutorial.conf

Configure the Tutorial Setup Script

Open the tutorial setup script with the editor of your choice and modify the values for METPLUS_BUILD_BASE, MET_BUILD_BASE, and METPLUS_DATA.
vi METplus-5.0.0_TutorialSetup.sh
METPLUS_BUILD_BASE is the full path to the METplus installation (/path/to/METplus-X.Y)
MET_BUILD_BASE is the full path to the MET installation (/path/to/met-X.Y)
METPLUS_DATA is the location of the sample test data directory
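
For example (the paths shown here are placeholders; substitute the actual installation locations on your system), the edited lines in the csh setup script might look similar to:
setenv METPLUS_BUILD_BASE /path/to/METplus-5.0.1
setenv MET_BUILD_BASE /path/to/met-11.0.1
setenv METPLUS_DATA /path/to/METplus_Data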


Next navigate to the Verify Environment is Set Correctly page.

Verify Environment is Set Correctly

Run the Tutorial Setup Script

Navigate to your METplus tutorial directory and source the environment file to apply the settings to the current shell. Each time you log in, you will have to source this file again.
In the following instructions, change "/path/to" to the path to your tutorial directory. EDIT AFTER COPYING and BEFORE HITTING RETURN!
cd /path/to/METplus-5.0.0_Tutorial
source METplus-5.0.0_TutorialSetup.sh

The tutorial setup script sets the paths for METPLUS_TUTORIAL_DIR, METPLUS_BUILD_BASE, MET_BUILD_BASE, and METPLUS_DATA. It also adds the directory where the METplus scripts are located to the $PATH environment variable. If necessary, it may also load modules needed for the METplus software to run correctly.
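
As a rough sketch (assuming the bash version of the script; the file provided on your system may differ in detail), the setup script contains settings along these lines:
export METPLUS_TUTORIAL_DIR=/path/to/METplus-5.0.0_Tutorial
export METPLUS_BUILD_BASE=/path/to/METplus-5.0.1
export MET_BUILD_BASE=/path/to/met-11.0.1
export METPLUS_DATA=/path/to/METplus_Data
export PATH=${METPLUS_BUILD_BASE}/ush:${PATH}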

Check Path

Make sure that all of the environment variables are set to the appropriate values and that the path is set up to locate the METplus components.

Run the 'which run_metplus.py' command.
If you did everything correctly, the full path displayed should be the script in the shared location, ${METPLUS_BUILD_BASE}:
which run_metplus.py
Run the 'point_stat' command without any arguments.
This should display the usage information for the application.
point_stat

You should see the usage statement for Point-Stat. The version number listed should correspond to the version listed in MET_BUILD_BASE. If it does not, you will need to either reload the met module, or add ${MET_BUILD_BASE}/bin to your PATH.

Check that the environment variables required to run the tutorial instructions are set correctly.

$METPLUS_TUTORIAL_DIR

The directory you created to store all of your tutorial files

echo ${METPLUS_TUTORIAL_DIR}

Example value:

/home/metplus_user/METplus_Tutorial
ls ${METPLUS_TUTORIAL_DIR} -1

Example contents:

METplus-5.0.0_TutorialSetup.sh
output/
tutorial.conf
user_config/

$MET_BUILD_BASE

The directory where MET is installed

echo ${MET_BUILD_BASE}

Example value:

/home/metplus_user/met-11.0.1
ls ${MET_BUILD_BASE} -1

Example contents

bin
share
Note: there may be more files/directories than shown here

Check contents of MET bin directory

ls ${MET_BUILD_BASE}/bin -1

Example contents:

ascii2nc
ensemble_stat
gen_ens_prod
gen_vx_mask
gis_dump_dbf
gis_dump_shp
gis_dump_shx
grid_diag
grid_stat
gsid2mpr
gsidens2orank
ioda2nc
lidar2nc
madis2nc
mode
mode_analysis
modis_regrid
mtd
pb2nc
pcp_combine
plot_data_plane
plot_mode_field
plot_point_obs
point2grid
point_stat
regrid_data_plane
rmw_analysis
series_analysis
shift_data_plane
stat_analysis
tc_dland
tc_gen
tc_pairs
tc_rmw
tc_stat
wavelet_stat
wwmca_plot
wwmca_regrid

$METPLUS_BUILD_BASE

The directory where METplus is installed

echo ${METPLUS_BUILD_BASE}

Example value:

/home/metplus_user/METplus-5.0.1
ls ${METPLUS_BUILD_BASE} -1

Example contents:

build_components
docs
environment.yml
internal
manage_externals
metplus
parm
produtil
pyproject.toml
README.md
requirements.txt
setup.py
ush

$METPLUS_DATA

The directory containing sample input data to use for the tutorial

echo ${METPLUS_DATA}

Example value:

/d1/metplus_user/METplus_Data
ls ${METPLUS_DATA} -1

Example contents:

met_reformat
met_test
model_applications

METplus Overview

The following content will discuss and demonstrate some of the basic concepts of the METplus wrappers. This includes the repository structure and configuration files, as well as how to run a simple example and some of the settings that users change most frequently as they continue to work with the METplus system.

As you progress through later tutorial sessions, additional examples of the METplus wrappers will be provided that focus more on the configurable settings of specific MET tools and on use-case-based examples.

Proceed to the next page to start with an overview of the METplus directory layout.


METplus: Directories and Configuration Files - Overview

METplus directory structure

A brief description and overview of the METplus/ directory structure can be found in the METplus User's Guide section called METplus Wrappers Directory Structure. The files/directories in ${METPLUS_BUILD_BASE} should match this list.

ls ${METPLUS_BUILD_BASE}

METplus default configuration file

Look inside the directory ${METPLUS_BUILD_BASE}/parm

ls ${METPLUS_BUILD_BASE}/parm

Look at the METplus default configuration file:

ls ${METPLUS_BUILD_BASE}/parm/metplus_config

The METplus default configuration file (defaults.conf) is always read first. Any additional configuration files passed in on the command line are then processed in the order in which they are specified. This allows each successive conf file to override variables defined in any previously processed conf files. It also allows conf files to be organized from general (settings used by all use cases, e.g. the MET installation directory) to more specific (e.g. the plot type when running the track and intensity plotter). The idea is to create a hierarchy of conf files that is easier to maintain, read, and manage. It is important to note that running METplus creates a single final configuration file, which can be viewed to understand the result of all the conf file processing.

When METplus is run, the final metplus conf file is generated here:
metplus_runtime.conf:METPLUS_CONF={OUTPUT_BASE}/metplus_final.conf.YYYYMMDDHHmmss

This is a unique final config file for each METplus run. Use this file to see the result of all the conf file processing; this can be very helpful when troubleshooting.

NOTE: The syntax for METplus configuration files MUST include a "[config]" section header with the variable names and values on subsequent lines.
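
For example, a minimal METplus configuration file might look like the following (the values shown here are placeholders, not settings you need to make for this tutorial):
[config]
PROCESS_LIST = Example
OUTPUT_BASE = /path/to/output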

More information about the default configuration variables can be found in the METplus User's Guide section called Default Configuration File.

The met_config directory (in ${METPLUS_BUILD_BASE}/parm) contains "wrapped" MET configuration files that are used by calls to the MET applications via the METplus wrappers. The wrappers set environment variables that control settings in these wrapped MET configuration files. See the METplus User's Guide section called How METplus controls MET configuration variables for more information.

METplus Use Cases

The use_cases directory contains the use cases you will be running. Under the use_cases directory are two directories: met_tool_wrapper and model_applications. The met_tool_wrapper directory contains use cases that run a single METplus wrapper. They are a good starting point to see how the wrapper scripts generate commands that run the MET tools. The model_applications directory contains more complex use cases that often run multiple wrappers and demonstrate real evaluations from users.

MET Tool Wrapper Use Cases

Look at the MET Tool Wrapper Use Cases

ls ${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper

The met_tool_wrapper use case files are organized into subdirectories by wrapper, e.g. Example or GridStat. They contain METplus configuration files (ending with .conf). If a MET tool that is called by a use case uses a MET configuration file, the file used for the met_tool_wrapper use cases is found in parm/met_config.

  • met_tool_wrapper/Example - directory
  • met_tool_wrapper/Example/Example.conf - use case configuration file
  • met_tool_wrapper/GridStat - directory
  • met_tool_wrapper/GridStat/GridStat.conf - use case configuration file
  • parm/met_config/GridStatConfig_wrapped - MET configuration file used in the GridStat.conf use case


Model Application Use Cases

Look at the Model Application use cases

ls ${METPLUS_BUILD_BASE}/parm/use_cases/model_applications

The model_applications use case files are organized in directories by category, e.g. precipitation or data_assimilation. They contain METplus configuration files (ending with .conf). If additional files such as Python Embedding scripts are included with a use case, these files are found in a directory named after the METplus configuration file without the .conf extension.

  • model_applications/data_assimilation - directory
  • model_applications/data_assimilation/StatAnalysis_fcstHAFS_obsPrepBufr_JEDI_IODA_interface.conf - use case configuration file
  • model_applications/data_assimilation/StatAnalysis_fcstHAFS_obsPrepBufr_JEDI_IODA_interface - directory containing supplemental files for the use case
  • model_applications/data_assimilation/StatAnalysis_fcstHAFS_obsPrepBufr_JEDI_IODA_interface/read_ioda_mpr.py - script called by the use case


Example Use Case (aka "Hello World" Example - METplus style)

Let's look at the Example use case, Example.conf, under met_tool_wrapper/Example

less ${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/Example/Example.conf

This file contains variables, denoted in ALL CAPS, that can be modified as needed. In the METplus system, unmodified config files remain in the directories under ${METPLUS_BUILD_BASE}, while user-modified files are stored in ${METPLUS_TUTORIAL_DIR}/user_config.

No changes are needed in Example.conf. Close it and continue to the next page.

Unless otherwise indicated, all directories are relative to your ${METPLUS_TUTORIAL_DIR} directory.


METplus: User Configuration Settings

Modify your Tutorial/User conf files

In this section you will modify the configuration files that will be read for each call to METplus.

The paths in this practical session guide assume:

  • You have created a user_config directory in your ${METPLUS_TUTORIAL_DIR} directory
  • You have added the shared METplus ush directory to your PATH (done in the Tutorial Setup script)
  • You are using the shared installation of MET.

If not, then you need to adjust accordingly.

  1. Change to the ${METPLUS_TUTORIAL_DIR} directory.  Try running run_metplus.py. You should see the usage statement output to the screen.
cd ${METPLUS_TUTORIAL_DIR};
run_metplus.py
The METplus python script run_metplus.py can be run from anywhere, but for consistency, we will change to ${METPLUS_TUTORIAL_DIR} so that all the directories including user_config and output are below the working directory.
  2. Now try passing in the Example.conf configuration file found in the METplus parm directory under use_cases/met_tool_wrapper/Example.
run_metplus.py \
${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/Example/Example.conf

You should see output like this:

Running METplus 5.0.1

Starting METplus v5.0.1

12/15 19:53:06.223 metplus (config_metplus.py:102) INFO: Starting METplus configuration setup.
12/15 19:53:06.225 metplus (config_metplus.py:230) INFO: Parsing config file: /var/autofs/mnt/linux-amd64/debian/buster/local/METplus-5.0.1/parm/metplus_config/defaults.conf
12/15 19:53:06.226 metplus (config_metplus.py:230) INFO: Parsing config file: /var/autofs/mnt/linux-amd64/debian/buster/local/METplus-5.0.1/parm/use_cases/met_tool_wrapper/Example/Example.conf
Traceback (most recent call last):
  File "/usr/local/METplus-5.0.1/ush/run_metplus.py", line 128, in <module>
    main()
  File "/usr/local/METplus-5.0.1/ush/run_metplus.py", line 44, in main
    config = pre_run_setup(config_inputs)
  File "/usr/local/METplus-5.0.1/metplus/util/run_util.py", line 23, in pre_run_setup
    config = setup(config_inputs)
  File "/usr/local/METplus-5.0.1/metplus/util/config_metplus.py", line 108, in setup
    config = launch(override_list)
  File "/usr/local/METplus-5.0.1/metplus/util/config_metplus.py", line 256, in launch
    mkdir_p(config.getdir('OUTPUT_BASE'))
  File "/usr/local/METplus-5.0.1/metplus/util/config_metplus.py", line 693, in getdir
    raise ValueError(f"{dir_name} cannot be set to "
ValueError: OUTPUT_BASE cannot be set to or contain '/path/to'

ERROR: run_metplus  failed: OUTPUT_BASE cannot be set to or contain '/path/to'


Note it ends with an error message stating that OUTPUT_BASE was not set correctly. You will need to configure the METplus wrappers to be able to run a use case.

The values in defaults.conf are read in first when you run run_metplus.py. The settings in this file can be overridden in the use case conf files and/or a user's custom configuration file.

Some variables in the system conf are set to '/path/to' and must be overridden to run METplus, such as OUTPUT_BASE in defaults.conf.

  3. View the defaults.conf file and notice that OUTPUT_BASE = /path/to. This means it is REQUIRED to be overridden with a valid path.
less ${METPLUS_BUILD_BASE}/parm/metplus_config/defaults.conf
Note: The default installation of METplus has /path/to values for MET_INSTALL_DIR and INPUT_BASE. The value for MET_INSTALL_DIR is set in the shared METplus configuration when it was installed. This was done because these settings will likely be set to the same values for all users. If METplus was installed on a machine that has sample input data available, the value for INPUT_BASE is often set to that directory as well.
  4. View the tutorial configuration file in your ${METPLUS_TUTORIAL_DIR} directory.
less ${METPLUS_TUTORIAL_DIR}/tutorial.conf

The INPUT_BASE, OUTPUT_BASE, and MET_INSTALL_DIR variables must all be set to run METplus. Since MET_INSTALL_DIR (and possibly INPUT_BASE) should already be set in the default METplus configuration file (completed on install of METplus), only OUTPUT_BASE is required to run. If INPUT_BASE is not set in the tutorial configuration file, it should be set correctly in the defaults configuration file.

Note: A METplus conf file is not a shell script. You CANNOT refer to environment variables as you would in a shell script or command prompt, i.e. ${HOME}. Instead, you must reference the environment variable $HOME as {ENV[HOME]}
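For example (a hypothetical setting, shown only to illustrate the syntax), a conf file could contain:
[config]
OUTPUT_BASE = {ENV[HOME]}/metplus_tutorial_output
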
You can create additional configuration files to be read by the METplus wrappers to override variables. If a variable is found in multiple configuration files that were passed to METplus, the value used will be the last one read in the sequence of configuration files. See the metplus_final.conf file in the output directory to see what values were actually used for a given METplus run.

We will test out using these configurations on the next page.


METplus: How to Run with Example.conf

Running METplus

Running METplus involves invoking the python script run_metplus.py followed by a list of configuration files.

Reminder: The default configuration file (defaults.conf) is always read in and processed first before the configuration files passed in on the command line.

If you have configured METplus correctly and call run_metplus.py without passing in any configuration files, it will generate a usage statement to indicate that other config files are required to perform a useful task. It will generate an error statement if something is amiss.

  1. Review the Example.conf configuration file - which is METplus' version of a "Hello World" example
less ${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/Example/Example.conf
  2. Call the run_metplus.py script again, this time passing in the Example.conf configuration file and the tutorial.conf configuration file. You should see logs output to the screen.
run_metplus.py \
${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/Example/Example.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf

Note: The environment variable METPLUS_BUILD_BASE determines where to look for paths to use_case files. The METPLUS_TUTORIAL_DIR environment variable determines where to look for user-modified config files.

  3. Check the directory specified by the OUTPUT_BASE configuration variable. You should see that files and sub-directories have been created.
ls ${METPLUS_TUTORIAL_DIR}/output
  4. Review the METplus log file to see what was run. Compare the log output to the Example.conf configuration file to see how they correspond to each other. The log file will have today's date in the filename. Since METplus was configured to include a timestamp in YYYYMMDDHHMMSS format in the log filename, each run of METplus generates a separate log file. This is the same behavior as the METplus final configuration file. List all of the log files and view the latest METplus log file:
cat `ls -1 ${METPLUS_TUTORIAL_DIR}/output/logs/metplus.log.*`

You will notice that METplus ran for 5 valid times, processing 4 forecast hours for each valid time. For each run time, it ran twice using two different input templates to find files.

metplus INFO: ****************************************
metplus INFO: * Running METplus
metplus INFO: * at valid time: 201702010000
metplus INFO: ****************************************
metplus.Example INFO: Running ExampleWrapper at valid time 20170201000000
metplus.Example INFO: Input directory is /dir/containing/example/data
metplus.Example INFO: Input template is {init?fmt=%Y%m%d}/file_{init?fmt=%Y%m%d}_{init?fmt=%2H}_F{lead?fmt=%3H}.{custom?fmt=%s}
metplus.Example INFO: Processing custom string: ext
metplus.Example INFO: Processing forecast lead 3 hours initialized at 2017-01-31 21Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_21_F003.ext
metplus.Example INFO: Processing custom string: nc
metplus.Example INFO: Processing forecast lead 3 hours initialized at 2017-01-31 21Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_21_F003.nc
metplus.Example INFO: Processing custom string: ext
metplus.Example INFO: Processing forecast lead 6 hours initialized at 2017-01-31 18Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_18_F006.ext
metplus.Example INFO: Processing custom string: nc
metplus.Example INFO: Processing forecast lead 6 hours initialized at 2017-01-31 18Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_18_F006.nc
metplus.Example INFO: Processing custom string: ext
metplus.Example INFO: Processing forecast lead 9 hours initialized at 2017-01-31 15Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_15_F009.ext
metplus.Example INFO: Processing custom string: nc
metplus.Example INFO: Processing forecast lead 9 hours initialized at 2017-01-31 15Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_15_F009.nc
metplus.Example INFO: Processing custom string: ext
metplus.Example INFO: Processing forecast lead 12 hours initialized at 2017-01-31 12Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_12_F012.ext
metplus.Example INFO: Processing custom string: nc
metplus.Example INFO: Processing forecast lead 12 hours initialized at 2017-01-31 12Z and valid at 2017-02-01 00Z
metplus.Example INFO: Looking in input directory for file: 20170131/file_20170131_12_F012.nc
metplus INFO: ****************************************
metplus INFO: * Running METplus
metplus INFO: * at valid time: 201702010600
metplus INFO: ****************************************
...
  5. Now run METplus passing in the Example.conf and tutorial.conf files from the previous run AND an explicit override of the OUTPUT_BASE variable.
run_metplus.py \
${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/Example/Example.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.OUTPUT_BASE=${METPLUS_TUTORIAL_DIR}/output/changed
  6. Check the directory that corresponds to the config.OUTPUT_BASE value that you set in the command line override. You should see that files and sub-directories have been created in the new location. Note that you can reference an environment variable when setting values on the command line.
ls ${METPLUS_TUTORIAL_DIR}/output/changed

Remember: Additional conf files and config variable overrides are processed after the default METplus config file (defaults.conf). OUTPUT_BASE was set in tutorial.conf and then overridden with config.OUTPUT_BASE.
Order matters, since each successive conf file and/or explicit variable override will take precedence over any value set for variables defined previously.

Note: The processing order allows for structuring your conf files to contain system/user configurations (settings used for every run) and use case specific configurations (settings only used for a given use case).


Modifying Timing Control in Example.conf

Timing Control in METplus

METplus configuration variables that control timing information are described in the Timing Control section of the System Configuration chapter in the METplus User's Guide. The Example wrapper is a good tool to help understand how these settings control what is run by the METplus wrappers.

  1. Copy the Example.conf configuration file into your user_config directory, renaming it Example_timing.conf.
cp ${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/Example/Example.conf \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf
  2. Open the new Example_timing.conf file with an editor.
vi ${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf
  3. Make the following changes:

3a. Change VALID_BEG and VALID_END:

VALID_BEG = 2017020100
VALID_END = 2017020200

    to:

VALID_BEG = 2022011809
VALID_END = 2022011809

3b. Change LEAD_SEQ:

LEAD_SEQ = 3H , 6H, 9H, 12H

    to:

LEAD_SEQ = 3H

3c. Change EXAMPLE_CUSTOM_LOOP_LIST:

EXAMPLE_CUSTOM_LOOP_LIST = ext, nc

    to:

EXAMPLE_CUSTOM_LOOP_LIST = ext
Note that the valid begin and end times are now set to the same time (January 18, 2022 at 9Z) and there is only 1 forecast lead time (3 hours).

Also note that "ext" stands for extension and "nc" is the typical extension for a netCDF file. The initial example demonstrates how you can have METplus loop over two types of files but the removal of "nc" results in METplus only looking for files that end with "ext".

  4. Call the run_metplus.py script again, this time passing in the Example_timing.conf configuration file and the tutorial.conf configuration file. You should see logs output to the screen.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf
  5. Review the screen output and/or log file output.
Notice that only a single time was run and the initialization time was computed based on the valid time and forecast lead.
INFO: Processing forecast lead 3 hours initialized at 2022-01-18 06Z and valid at 2022-01-18 09Z
  6. Open Example_timing.conf again to increase the frequency of output by making the following changes:

6a. Change VALID_END:

VALID_END = 2022011809

    to:

VALID_END = 2022011815
Notice that two valid times should now be run (January 18, 2022 at 9Z and January 18, 2022 at 15Z).
  7. Call the run_metplus.py script again and see how the output has changed.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf
Notice that now 2 valid times were run and the initialization time was computed based on each valid time and forecast lead.
INFO: Processing forecast lead 3 hours initialized at 2022-01-18 06Z and valid at 2022-01-18 09Z
INFO: Processing forecast lead 3 hours initialized at 2022-01-18 12Z and valid at 2022-01-18 15Z
  8. Change the timing settings to loop by initialization (or retrospective) time. Open Example_timing.conf again and make the following changes:

8a. Change LOOP_BY:

LOOP_BY = VALID

    to:

LOOP_BY = INIT

8b. Change VALID_TIME_FMT to INIT_TIME_FMT:

VALID_TIME_FMT = %Y%m%d%H

    to:

INIT_TIME_FMT = %Y%m%d%H

8c. Change VALID_BEG to INIT_BEG:

VALID_BEG = 2022011809

    to:

INIT_BEG = 2022011809

8d. Change VALID_END to INIT_END:

VALID_END = 2022011815

    to:

INIT_END = 2022011815

8e. Change VALID_INCREMENT to INIT_INCREMENT:

VALID_INCREMENT = 3H

    to:

INIT_INCREMENT = 3H
  9. Call the run_metplus.py script once again and see how the output has changed.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf
Notice that now 2 initialization times were run and the valid time was computed based on each init time and forecast lead.
INFO: Processing forecast lead 3 hours initialized at 2022-01-18 09Z and valid at 2022-01-18 12Z
INFO: Processing forecast lead 3 hours initialized at 2022-01-18 15Z and valid at 2022-01-18 18Z
  10. Change the value for LOOP_BY from INIT to its nickname RETRO. Open Example_timing.conf again and make the following changes:

10a. Change LOOP_BY:

LOOP_BY = INIT

    to:

LOOP_BY = RETRO
  11. Call the run_metplus.py script another time.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf
Notice that the output did not change. Note that LOOP_BY can also be set to RETRO instead of INIT if that term is preferred. However, the other timing variables must start with INIT_ and not RETRO_. The same applies for using REALTIME instead of VALID.
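
For example (a hypothetical configuration, shown only to illustrate the naming rule), realtime-style looping over the same times would pair LOOP_BY = REALTIME with the VALID_ timing variables:
LOOP_BY = REALTIME
VALID_TIME_FMT = %Y%m%d%H
VALID_BEG = 2022011809
VALID_END = 2022011815
VALID_INCREMENT = 3H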

Modifying Filename Template in Example.conf

Filename Template Settings in METplus

METplus configuration variables that control filename templates are described in the Directory and Filename Template Info section of the System Configuration chapter in the METplus User's Guide. The Example wrapper is a good tool to help understand how these settings control what is run by the METplus wrappers.

  1. List the sample data available in the ${METPLUS_DATA} directory:
ls ${METPLUS_DATA}/met_test/data/sample_fcst/2007033000/

The output should list:

nam.t00z.awip1236.tm00.20070330.grb
Think about what is constant and what may change routinely:

nam.t00z.awip1236.tm00.20070330.grb

This translates into the filename template nam.t{init?fmt=%2H}z.awip1236.tm00.{init?fmt=%Y%m%d}.grib. We will use this in the next exercise.

  2. Copy the Example_timing.conf configuration file in your user_config directory, renaming it Example_filename.conf.
cp ${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/user_config/Example_filename.conf
  3. Open the new Example_filename.conf file with an editor.
vi ${METPLUS_TUTORIAL_DIR}/user_config/Example_filename.conf
  4. Make the following changes:

4a. Change EXAMPLE_INPUT_DIR:

EXAMPLE_INPUT_DIR = /dir/containing/example/data

    to:

EXAMPLE_INPUT_DIR = {INPUT_BASE}/met_test/data/sample_fcst

4b. Change EXAMPLE_INPUT_TEMPLATE:

EXAMPLE_INPUT_TEMPLATE = {init?fmt=%Y%m%d}/file_{init?fmt=%Y%m%d}_{init?fmt=%2H}_F{lead?fmt=%3H}.{custom?fmt=%s}

    to:

EXAMPLE_INPUT_TEMPLATE = nam.t{init?fmt=%2H}z.awip1236.tm00.{init?fmt=%Y%m%d}.grib
  5. Call the run_metplus.py script again, this time passing in the Example_filename.conf configuration file and the tutorial.conf configuration file. You should see logs output to the screen.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_filename.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf
  6. Review the screen output and/or log file output.
INFO: Looking in input directory for file: nam.t09z.awip1236.tm00.20220118.grib
...
INFO: Looking in input directory for file: nam.t15z.awip1236.tm00.20220118.grib
Note that none of the files listed is the one found in Step 1 (nam.t00z.awip1236.tm00.20070330.grb). You will need to make additional modifications to Example_filename.conf to direct METplus to look for the correct file.
  7. Open the new Example_filename.conf file with an editor.
vi ${METPLUS_TUTORIAL_DIR}/user_config/Example_filename.conf
  8. Make the following changes:

8a. Change INIT_BEG and INIT_END:

INIT_BEG = 2022011809
INIT_END = 2022011815

    to:

INIT_BEG = 2007033000
INIT_END = 2007033000
  9. Call the run_metplus.py script again, passing in the modified Example_filename.conf configuration file and the tutorial.conf configuration file. You should see logs output to the screen.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_filename.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf
  10. Review the screen output and/or log file output. It appears it is now looking for the right file.
INFO: Looking in input directory for file: nam.t00z.awip1236.tm00.20070330.grib

Modifying Log Timestamp using Example.conf

The METplus configuration variable named LOG_TIMESTAMP_TEMPLATE controls the timestamp that is included in the log file names. The default value of LOG_TIMESTAMP_TEMPLATE is %Y%m%d%H%M%S, which will include the year, month, day, hour, minute, and second when the run_metplus.py command was executed. This will create a new log file each time run_metplus.py is called from the command line.

Starting in METplus v5.0.0, the log timestamp is also included by default in the final configuration file that is generated by the METplus wrappers. This helps with debugging because the final config file and its corresponding log file(s) are more easily identified. The path to the final config file is set by METPLUS_CONF.

The values of these settings can be changed by including them in a METplus configuration file that is passed into run_metplus.py. They can also be set directly in the run_metplus.py command using the syntax config.VARIABLE_NAME=VALUE. The following examples will use the latter.

Changing the Log Timestamp Template

In this example we will change the log timestamp template to only include the year, month, and day of the run. This will create a single log file each day. Each call to run_metplus.py will add its log output to the daily file.

  1. Run the Example_timing use case from the previous exercise and override the log timestamp template to only include year, month, and day.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.LOG_TIMESTAMP_TEMPLATE=%Y%m%d
  2. List the contents of the log directory and notice that there is now a log file that matches the format metplus.log.YYYYMMDD.
ls -1 ${METPLUS_TUTORIAL_DIR}/output/logs/
  3. Run the Example_filename use case from the previous exercise and again override the log timestamp template to only include YYYYMMDD.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_filename.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.LOG_TIMESTAMP_TEMPLATE=%Y%m%d
  4. Review the log file and notice that it contains the log output from both runs.
less ${METPLUS_TUTORIAL_DIR}/output/logs/metplus.log.`date +%Y%m%d`
  5. Search for the word "Running" to see each command.
grep Running ${METPLUS_TUTORIAL_DIR}/output/logs/metplus.log.`date +%Y%m%d`

Adding Run ID in Log Timestamp Template

The METplus configuration variable RUN_ID was added in METplus v5.0.0. This variable contains an 8-character string that is automatically generated and is unique to each call to run_metplus.py. This variable can be referenced in other METplus configuration variables, which can be useful in a variety of ways. For example, it can be used to distinguish log files for METplus runs that may have started within the same second.

  1. Run the Example_timing use case and override the log timestamp template to include the RUN_ID.
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.LOG_TIMESTAMP_TEMPLATE=%Y%m%d%H%M%S.{RUN_ID}
  2. List the contents of the log directory and notice that there is now a log file that matches the format metplus.log.YYYYMMDDHHMMSS.XXXXXXXX where XXXXXXXX is a random set of characters.
ls -1 ${METPLUS_TUTORIAL_DIR}/output/logs/

MET Tool: PCP-Combine

IMPORTANT NOTE: If you are returning to the tutorial, you must source the tutorial setup script before running the following instructions. If you are unsure if you have done this step, please navigate to the Verify Environment is Set Correctly page.

We now shift to a discussion of the MET PCP-Combine tool and will practice running it directly on the command line.

PCP-Combine Functionality

The PCP-Combine tool is used (if needed) to add, subtract, sum or derive accumulated field values, most commonly precipitation, from several gridded data files into a single NetCDF file containing the desired accumulation period. Its NetCDF output may be used as input to the MET statistics tools. PCP-Combine may be configured to combine any gridded data field you'd like. However, all gridded data files being combined must have already been placed on a common grid. The copygb utility is recommended for re-gridding GRIB files. In addition, the PCP-Combine tool will only sum model files with the same initialization time unless it is configured to ignore the initialization time.

PCP-Combine Usage

View the usage statement for PCP-Combine by simply typing the following:

pcp_combine
Usage: pcp_combine  
  [[-sum] sum_args] | [-add input_files] | [-subtract input_files] | [-derive stat_list input_files]
(Note: "|" means "or")
  [-sum] sum_args Data from multiple files containing the same accumulation interval should be summed up using the arguments provided.
  -add input_files Data from one or more files should be added together where the accumulation interval is specified separately for each input file.
  -subtract input_files Data from exactly two files should be subtracted.
  -derive stat_list input_files The comma-separated list of statistics in "stat_list" (sum, min, max, range, mean, stdev, vld_count) should be derived using data from one or more files.
  out_file Output NetCDF file to be written.
  [-field string] Overrides the default use of accumulated precipitation (optional).
  [-name list] Overrides the default NetCDF variable name(s) to be written (optional).
  [-vld_thresh n] Overrides the default required ratio of valid data (1) (optional).
  [-log file] Outputs log messages to the specified file
  [-v level] Level of logging
  [-compress level] NetCDF file compression

Use the -sum, -add, -subtract, or -derive command line option to indicate the operation to be performed. Each operation has its own set of required arguments.


Run Sum Command

Since PCP-Combine performs a simple operation and reformatting step, no configuration file is needed.

  1. Start by making an output directory for PCP-Combine and changing directories:
mkdir -p ${METPLUS_TUTORIAL_DIR}/output/met_output/pcp_combine
cd ${METPLUS_TUTORIAL_DIR}/output/met_output/pcp_combine
  2. Now let's run PCP-Combine twice using some sample data that's included with the MET tarball:
pcp_combine \
-sum 20050807_000000 3 20050807_120000 12 \
sample_fcst_12L_2005080712V_12A.nc \
-pcpdir ${METPLUS_DATA}/met_test/data/sample_fcst/2005080700
pcp_combine \
-sum 00000000_000000 1 20050807_120000 12 \
sample_obs_12L_2005080712V_12A.nc \
-pcpdir ${METPLUS_DATA}/met_test/data/sample_obs/ST2ml


The "\" symbols in the commands above are used for ease of reading. They are line continuation markers enabling us to spread a long command line across multiple lines. They should be followed immediately by "Enter". You may copy and paste the command line OR type in the entire line with or without the "\".

Both commands run the sum command, which searches the contents of the -pcpdir directory for the data required to create the requested accumulation interval.

In the first command, PCP-Combine summed up 4 3-hourly accumulation forecast files into a single 12-hour accumulation forecast. In the second command, PCP-Combine summed up 12 1-hourly accumulation observation files into a single 12-hour accumulation observation. PCP-Combine performs these tasks very quickly.

We'll use these PCP-Combine output files as input for Grid-Stat. So make sure that these commands have run successfully!


Output

When PCP-Combine is finished, you may view the output NetCDF files it wrote using the ncdump and ncview utilities. Run the following commands to view contents of the NetCDF files:

ncview sample_fcst_12L_2005080712V_12A.nc &
ncview sample_obs_12L_2005080712V_12A.nc &
ncdump -h sample_fcst_12L_2005080712V_12A.nc
ncdump -h sample_obs_12L_2005080712V_12A.nc

The ncview windows display plots of the precipitation data in these files. The output of ncdump indicates that the gridded fields are named APCP_12, the GRIB code abbreviation for accumulated precipitation. The accumulation interval is 12 hours for both the forecast (3-hourly * 4 files = 12 hours) and the observation (1-hourly * 12 files = 12 hours).

Note: if ncview is not found on your system, you may need to load it first. For example, on hera, you can use this command:

 module load ncview

Plot-Data-Plane Tool

The Plot-Data-Plane tool can be run to visualize any gridded data that the MET tools can read. It is a very helpful utility for making sure that MET can read data from your file, orient it correctly, and plot it at the correct spot on the earth. When using new gridded data in MET, it's a great idea to run it through Plot-Data-Plane first:

plot_data_plane \
sample_fcst_12L_2005080712V_12A.nc \
sample_fcst_12L_2005080712V_12A.ps \
'name="APCP_12"; level="(*,*)";'
gv sample_fcst_12L_2005080712V_12A.ps &

Ghostview (gv) can take a little while before it displays.  If you don't have gv on your computer, try using display, or any tool that can visualize PostScript files, e.g.:

display sample_fcst_12L_2005080712V_12A.ps &

Another option is to create a PNG file from the PS file, also rotating it to appear the right way:

convert -rotate 90 sample_fcst_12L_2005080712V_12A.ps \
sample_fcst_12L_2005080712V_12A.png
display sample_fcst_12L_2005080712V_12A.png

Next, try re-running the command above, but add the convert(x)=x/25.4; function to the config string (Hint: place it after the level setting and its semi-colon, but before the closing single quote) to change units from millimeters to inches. What happened to the values in the colorbar?

Now, try re-running again, but add the censor_thresh=lt1.0; censor_val=0.0; options to the config string to reset any data values less than 1.0 to a value of 0.0. How has your plot changed?

The convert(x) and censor_thresh/censor_val options can be used in config strings and MET config files to transform your data in simple ways.
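
For reference, here is a sketch of the full command with both of the options above added to the config string (the values are taken directly from the exercises above):

plot_data_plane \
sample_fcst_12L_2005080712V_12A.nc \
sample_fcst_12L_2005080712V_12A.ps \
'name="APCP_12"; level="(*,*)"; convert(x)=x/25.4; censor_thresh=lt1.0; censor_val=0.0;'
gv sample_fcst_12L_2005080712V_12A.ps &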

admin Mon, 06/24/2019 - 16:13

Add and Subtract Commands

Add and Subtract Commands

We have run examples of the PCP-Combine -sum command, but the tool also supports the -add, -subtract, and -derive commands. While the -sum command defines a directory to be searched, for -add, -subtract, and -derive we tell PCP-Combine exactly which files to read and what data to process. The following command adds together 3-hourly precipitation from 4 forecast files, just like we did in the previous step with the -sum command:

pcp_combine -add \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_03.tm00_G212 03 \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_06.tm00_G212 03 \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_09.tm00_G212 03 \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_12.tm00_G212 03 \
add_APCP_12.nc

By default, PCP-Combine looks for accumulated precipitation, and the 03 tells it to look for 3-hourly accumulations. However, that 03 string can be replaced with a configuration string describing the data to be processed, which doesn't have to be accumulated precipitation. The configuration string should be enclosed in single quotes. Below, we add together the U and V components of 10-meter wind from the same input file. You would not typically want to do this, but it demonstrates the functionality. We also use the -name command line option to define a descriptive output NetCDF variable name:

pcp_combine -add \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_03.tm00_G212 'name="UGRD"; level="Z10";' \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_03.tm00_G212 'name="VGRD"; level="Z10";' \
add_WINDS.nc \
-name UGRD_PLUS_VGRD

While the -add command can be run on one or more input files, the -subtract command requires exactly two. Let's rerun the wind example from above but do a subtraction instead:

pcp_combine -subtract \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_03.tm00_G212 'name="UGRD"; level="Z10";' \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_03.tm00_G212 'name="VGRD"; level="Z10";' \
subtract_WINDS.nc \
-name UGRD_MINUS_VGRD

Now run Plot-Data-Plane to visualize this output. Use the -plot_range option to specify the desired plotting range, the -title option to add a title, and the -color_table option to switch from the default color table to one that's well suited for positive and negative values:

plot_data_plane \
subtract_WINDS.nc \
subtract_WINDS.ps \
'name="UGRD_MINUS_VGRD"; level="(*,*)";' \
-plot_range -15 15 \
-title "10-meter UGRD minus VGRD" \
-color_table ${MET_BUILD_BASE}/share/met/colortables/NCL_colortables/posneg_2.ctable

Now view the results:

gv subtract_WINDS.ps &

 

admin Mon, 06/24/2019 - 16:14

Derive Command

Derive Command

While the PCP-Combine -add and -subtract commands compute exactly one output field of data, the -derive command can compute multiple output fields in a single run. This command reads data from one or more input files and derives the output fields requested on the command line (sum, min, max, range, mean, stdev, vld_count).

Run the following command to derive several summary metrics for both the 10-meter U and V wind components:

pcp_combine -derive min,max,mean,stdev \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_*.tm00_G212 \
-field 'name="UGRD"; level="Z10";' \
-field 'name="VGRD"; level="Z10";' \
derive_min_max_mean_stdev_WINDS.nc

In the above example, we used a wildcard to list multiple input file names.  And we used the -field command line option twice to specify two input fields.  For each input field, PCP-Combine loops over the input files, derives the requested metrics, and writes them to the output NetCDF file.  Run ncview to visualize this output:

ncview derive_min_max_mean_stdev_WINDS.nc &

This output file contains 8 variables: 2 input fields * 4 metrics. Note the output variable names the tool chose.  You can still override those names using the -name command line argument, but you would have to specify a comma-separated list of 8 names, one for each output variable.
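
For example, here is a sketch of the same command with the -name option added. The eight names below are hypothetical; list your own names in the same order as the variables shown by ncview or ncdump:

pcp_combine -derive min,max,mean,stdev \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_*.tm00_G212 \
-field 'name="UGRD"; level="Z10";' \
-field 'name="VGRD"; level="Z10";' \
derive_named_WINDS.nc \
-name "UGRD_MIN,UGRD_MAX,UGRD_MEAN,UGRD_STDEV,VGRD_MIN,VGRD_MAX,VGRD_MEAN,VGRD_STDEV"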

johnhg Tue, 07/23/2019 - 17:13

MET Tool: Plot-Data-Plane

MET Tool: Plot-Data-Plane
IMPORTANT NOTE: If you are returning to the tutorial, you must source the tutorial setup script before running the following instructions. If you are unsure if you have done this step, please navigate to the Verify Environment is Set Correctly page.

Whenever getting started with new gridded datasets in MET, users are strongly encouraged to first run the Plot-Data-Plane tool to visualize the data. Doing so confirms that MET can read the input file, extract the desired data, and geolocate and orient it correctly. It is particularly useful in testing the configuration string needed to extract the data from the input file.

In MET, the term Data-Plane refers to a 2-dimensional field of gridded data.

Plot-Data-Plane Functionality

The Plot-Data-Plane tool reads a single 2-dimensional field of gridded data from the specified input file and writes a PostScript output file containing a spatial plot of the data. It plots the data using a configurable color table that, by default, is automatically rescaled to the range of values found in the data. The ImageMagick convert utility is recommended for converting the PostScript output file to other image file formats, if needed.

Plot-Data-Plane Usage

View the usage statement for Plot-Data-Plane by simply typing the following:
plot_data_plane
Usage: plot_data_plane  
  input_filename Input file containing gridded data to be plotted
  output_filename Output PostScript file to be written
  field_string String defining the data to be plotted
  [-color_table name] Overrides the default color table (optional)
  [-plot_range min max] Specifies the range of data to be plotted (optional)
  [-title string] Specifies the plot title string (optional)
  [-log file] Outputs log messages to the specified file
  [-v level] Level of logging
johnhg Wed, 11/17/2021 - 09:38

The Field String

The Field String

Defining the field string

As you'll see throughout these exercises, the behavior of the MET and METplus tools is controlled using ASCII configuration files, and you will learn more about those options in the coming sessions. The field_string command line argument is actually processed as a miniature configuration file. In fact, that string is written to a temporary file which is then read by MET's configuration file library code.

In general, the name and level entries are required to extract a gridded field of data from a supported input file format. The conventions for specifying them vary based on the input file type:

  1. For GRIB1 or GRIB2 inputs, set name as the abbreviation for the desired variable or data type that appears in the GRIB tables and set level to a single letter (A, Z, P, L, or R) to define the level type followed by a number to define the level value. For example, 'name = "TMP"; level = "P500";' extracts 500 millibar temperature from a GRIB file.
  2. For NetCDF inputs, set name as the NetCDF variable name and level to define how to extract a 2-dimensional slice of gridded data from that variable. For example, 'name = "temperature"; level = "(0,1,*,*)";' extracts a 2-dimensional field of data from the last two dimensions of a NetCDF temperature variable using the first two indices as constant values.
  3. For Python embedding, set name as the python script to be run along with any arguments for that script and do not set level. For example, 'name = "read_my_data.py input.txt";' runs a python script to read data from the specified input file.

Note that all field strings should be enclosed in single quotes, as shown above, so that they are processed on the command line as a single string which may contain embedded whitespace.

Examples for each of these input types are provided in the coming exercises. More details about setting the field string can be found in the Configuration File Overview section of the MET User's Guide. For example, if a field string matches multiple records in an input GRIB file, additional filtering criteria may be specified to further refine them. Additional options exist to explicitly specify the input file_type, define a function to convert the data, or define an operation to censor the data. Each configuration entry should be terminated with a semi-colon (;).

Since the field_string is processed using a temporary file, any syntax errors will produce a parsing error log message similar to the following:

ERROR  : 
ERROR  : yyerror() -> syntax error in file "/tmp/met_config_61354_1"
ERROR  : 

Error messages like this typically mean there is a problem in a configuration string or configuration file being read by MET.

johnhg Fri, 11/19/2021 - 10:33

Plot GRIB Data

Plot GRIB Data

Start by creating a directory for our Plot-Data-Plane output:

mkdir -p ${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane

Next, run Plot-Data-Plane to plot 2-meter temperature from a GRIB1 input file:

plot_data_plane \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_12.tm00_G212 \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/wrfprs_TMP_2m.ps \
'name = "TMP"; level = "Z2";'

The default verbosity level of 2 only prints log messages about input and output files.

DEBUG 1: Opening data file: {...}/wrfprs_ruc13_12.tm00_G212
DEBUG 1: Creating postscript file: {...}/wrfprs_TMP_2m.ps

Next, re-run at verbosity level 4 to see more detailed log messages about the grid being read, timing information, and the range of the data values:

plot_data_plane \
${METPLUS_DATA}/met_test/data/sample_fcst/2005080700/wrfprs_ruc13_12.tm00_G212 \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/wrfprs_TMP_2m.ps \
'name = "TMP"; level = "Z2";' -v 4

It is a great idea to inspect the log messages to sanity check the metadata. Without needing to understand all the details, does the grid definition look reasonable? Is the range of values reasonable for this variable type? Are the timestamps of the data consistent with the input file name?

Next, open the PostScript output file. On some machines, the ghostview utility (or the common gv alias) can display PostScript files. On others, the display command works well. On Macs, simply run open. The example below uses gv, which is available in the METplus AMI:

gv ${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/wrfprs_TMP_2m.ps

Optionally, if the convert utility is in your path, it can be run to change the image file format. Indicate the desired output file format by specifying the suffix (.png shown below).

which convert
convert -rotate 90 \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/wrfprs_TMP_2m.ps \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/wrfprs_TMP_2m.png

The plot is created using the default color table (met_default.ctable) and is scaled to the range of valid data (275 to 305). By default, no title is provided and the input file is listed as a sub-title. Does the pattern of the data look reasonable? Does it correspond well to the background map data?

If the plot of the data and the metadata listed in the log messages look reasonable, you can be confident that MET is reading your data well. In addition, the field_string you used to retrieve this data can be used in the configuration strings and configuration files for other MET tools.

While this Plot-Data-Plane validation step is not necessary for every input file, it is very useful when getting started with new input data sources.

johnhg Wed, 11/17/2021 - 11:51

Plot NetCDF Data

Plot NetCDF Data

The NetCDF file format is very flexible and enables the creation of self-describing data files. However, that flexibility makes it impossible to write general purpose software to interpret all NetCDF files. For that reason, MET supports a few types of NetCDF file formats, but does not support all NetCDF files in general. It can ingest NetCDF files that follow the Climate and Forecast (CF) convention, are created by the WRF-Interp utility, or are created by other MET tools. Additional details can be found in the MET Data I/O chapter of the MET User's Guide.

As described in The Field String, set name to the name of the desired NetCDF variable and level to define how to index into the dimensions of that variable. In the NetCDF level strings, use *,* to indicate the two gridded dimensions. For other, non-gridded dimensions, pick a 0-based integer to specify the value to be used for that dimension. For the time dimension, if present, selecting a 0-based integer does work; however, you can also specify a time string in YYYYMMDD[_HH[MMSS]] format. The square brackets indicate optional elements of the format, so 19770807, 19770807_12, and 19770807_120000 are all valid time strings. It is often easier to specify a time string directly rather than finding the integer index corresponding to that time string.

Run Plot-Data-Plane to plot quantitative precipitation estimate (QPE) data from a CF-compliant NetCDF file. First run ncdump -h to take a look at the header:

ncdump -h \
${METPLUS_DATA}/model_applications/precipitation/QPE_Data/20170510/qpe_2017051005.nc

Note the following from the header:

dimensions:
    time = 1 ;
    y = 689 ;
    x = 1073 ;
    ...
   float P06M_NONE(time, y, x) ;
    ...
   // global attributes:
    :Conventions = "CF-1.0" ;

The variable P06M_NONE has 3 dimensions, but the time dimension only has length 1. Therefore, setting level = "(0,*,*)"; should do the trick. Also note that the global Conventions attribute identifies this file as being CF-compliant. Let's plot it, this time adding a title:

plot_data_plane \
${METPLUS_DATA}/model_applications/precipitation/QPE_Data/20170510/qpe_2017051005.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/qpe_2017051005.ps \
'name = "P06M_NONE"; level = "(0,*,*)";' -title "Precipitation Forecast" -v 4

Running at verbosity level 4, we see the range of values:

DEBUG 4: Data plane information:
DEBUG 4:       plane min: 0
DEBUG 4:       plane max: 101.086

If needed, run convert to reformat:

convert -rotate 90 \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/qpe_2017051005.ps \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/qpe_2017051005.png

The next example extracts a specific time string from a precipitation analysis dataset:

ncdump -h \
${METPLUS_DATA}/model_applications/precipitation/StageIV/2017050912_st4.nc

Note the following from the header:

dimensions:
    time = 4 ;
    y = 428 ;
    x = 614 ;
    time1 = 1 ;
variables:
    float P06M_NONE(time, y, x) ;

Let's request a specific time string, specify a title, and use the same plotting range as above, rather than defaulting to the range of input data values.

plot_data_plane \
${METPLUS_DATA}/model_applications/precipitation/StageIV/2017050912_st4.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/2017050912_st4.ps \
'name = "P06M_NONE"; level = "(20170510_06,*,*)";' \
-plot_range 0 101.086 -title "Precipitation Analysis"

If running Plot-Data-Plane for multiple output times or data sources, consider using the -plot_range option to make their color scales comparable.

Lastly, let's plot a NetCDF mask file that was created by an earlier version of MET's Gen-Vx-Mask tool.

ncdump -h ${METPLUS_DATA}/met_test/data/poly/NCEP_masks/NCEP_Regions.nc

We will plot the NCEP_Region_ID variable with only 2 dimensions:

float NCEP_Region_ID(lat, lon) ;

Let's specify a different color table rather than using the default one.

plot_data_plane \
${METPLUS_DATA}/met_test/data/poly/NCEP_masks/NCEP_Regions.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/NCEP_Regions.ps \
'name = "NCEP_Region_ID"; level = "(*,*)";' \
-color_table $MET_BUILD_BASE/share/met/colortables/NCL_colortables/rainbow.ctable
When working with NetCDF files in MET, running ncdump -h is a great way to check their contents.
johnhg Wed, 11/17/2021 - 11:53

Python Embedding

Python Embedding

While the MET tools can read data from a few input gridded data file types, its ability to read data in memory from python greatly enhances its utility. Support for python embedding is optional, and must be enabled at compilation time as described in Appendix F of the MET User's Guide. MET supports three types of python embedding:

  1. Reading a field of gridded data values.
  2. Passing a list of point observations.
  3. Passing a list of matched forecast/observation pair values.

On this page, we'll demonstrate only the first type, reading a field of gridded data values. When getting started with a new dataset via python, validating the logic by running Plot-Data-Plane is crucial. When MET reads data from flat files, important information, like the location and orientation of the data, is defined in the metadata. That is not the case for python embedding, and the responsibility for confirming those details falls to the user. So while python embedding provides extra flexibility, it also requires additional diligence.

Let's run the simplest of examples using sample data included with the MET release. The first step is to confirm that the python script runs by itself, outside of MET.

python3 ${METPLUS_DATA}/met_test/scripts/python/read_ascii_numpy.py \
${METPLUS_DATA}/met_test/data/python/fcst.txt Forecast

This sample read_ascii_numpy.py script reads data from the input fcst.txt ASCII file and gives it a name, Forecast. Always run new python scripts on the command line first to confirm there aren't any syntax errors in the script itself. The required conventions for the python script are detailed in the Python Embedding for 2D Data section of the MET User's Guide.

Next, let's run Plot-Data-Plane using this python script to define the input data. As described in The Field String, this is done with the name configuration entry; the level entry does not apply.

plot_data_plane \
PYTHON_NUMPY \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/python_fcst.ps \
'name = "${METPLUS_DATA}/met_test/scripts/python/read_ascii_numpy.py ${METPLUS_DATA}/met_test/data/python/fcst.txt Forecast";'

Since there is no input_filename to be specified as the first required argument for Plot-Data-Plane, we provide the constant string PYTHON_NUMPY in that spot. This triggers Plot-Data-Plane to interpret the field string as a python embedding script to be run. Specifying PYTHON_XARRAY also works but requires slightly different conventions in the python embedding script.
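
For example, here is a sketch of a PYTHON_XARRAY run. It assumes your sample data includes the read_ascii_xarray.py script, which follows the xarray conventions but takes the same arguments as read_ascii_numpy.py; adjust the script name if your installation differs:

plot_data_plane \
PYTHON_XARRAY \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/python_fcst_xarray.ps \
'name = "${METPLUS_DATA}/met_test/scripts/python/read_ascii_xarray.py ${METPLUS_DATA}/met_test/data/python/fcst.txt Forecast";'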

When MET is compiled, it links to python libraries that it uses to instantiate a python interpreter at runtime. That compile-time instance does include a few required packages, but it will likely not include every package that users may want to load. You may find that your python script runs fine on the command line, but Plot-Data-Plane's call to python can't load a requested module. In that case, set the ${MET_PYTHON_EXE} environment variable to tell MET which instance of python you'd like to run.

${MET_PYTHON_EXE} defines a specific instance of python to be run.

The following command simply uses the version of python that is already present in your path:

export MET_PYTHON_EXE=`which python3`
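
If you are running a csh-based shell instead of bash, the equivalent command is:

setenv MET_PYTHON_EXE `which python3`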

Rerunning the command from above should produce the same result, but if you look closely at the log messages, you'll see that your custom python version writes a temporary file, and MET's compile-time python version reads data from it.

plot_data_plane \
PYTHON_NUMPY \
${METPLUS_TUTORIAL_DIR}/output/met_output/plot_data_plane/python_fcst.ps \
'name = "${METPLUS_DATA}/met_test/scripts/python/read_ascii_numpy.py ${METPLUS_DATA}/met_test/data/python/fcst.txt Forecast";'

You can find several python embedding examples on the Sample Analysis Scripts page of the MET website. Each example includes both a python script and sample input data file. Please also see METplus Python Embedding use case examples. 

johnhg Wed, 11/17/2021 - 11:55

MET Tool: Gen-Vx-Mask

MET Tool: Gen-Vx-Mask
IMPORTANT NOTE: If you are returning to the tutorial, you must source the tutorial setup script before running the following instructions. If you are unsure if you have done this step, please navigate to the Verify Environment is Set Correctly page.

Gen-Vx-Mask Functionality

The Gen-Vx-Mask tool may be run to speed up the execution time of the other MET tools. Gen-Vx-Mask defines a bitmap masking region for your domain. It takes as input a gridded data file defining your domain and a second argument to define the area of interest (varies by masking type). It writes out a NetCDF file containing a bitmap for that masking region. You can run Gen-Vx-Mask iteratively, passing its output back in as input, to define more complex masking regions.

You can then use the output of Gen-Vx-Mask to define masking regions in the MET statistics tools. While those tools can read ASCII lat/lon polyline files directly, they are able to process the output of Gen-Vx-Mask much more quickly than the original polyline. The idea is to define your masking region once for your domain with Gen-Vx-Mask and apply the output many times in the MET statistics tools.

Gen-Vx-Mask Usage

View the usage statement for Gen-Vx-Mask by simply typing the following:

gen_vx_mask
Usage: gen_vx_mask  
  input_file Gridded data file defining the domain
  mask_file Defines the masking region and varies by -type
  out_file Output NetCDF mask file to be written
  -type string Masking type: poly, poly_xy, box, circle, track, grid, data, solar_alt, solar_azi, lat, lon, shape
  [-input_field string] Define field from input_file for grid point initialization values, rather than 0.
  [-mask_field string] Define field from mask_file for data masking.
  [-complement, -union, -intersection, -symdiff] Set logic for combining input_field initialization values with the current mask values.
  [-thresh string] Define threshold for circle, track, data, solar_alt, solar_azi, lat, and lon masking types.
  [-height n, -width n] Define dimensions for box masking.
  [-shapeno n] Define the index of the shape for shapefile masking.
  [-value n] Output mask value to be written, rather than 1.
  [-name str] Specifies the name to be used for the mask.
  [-log file] Outputs log messages to the specified file
  [-v level] Level of logging
  [-compress level] NetCDF compression level

At a minimum, the input_file defining the domain, the mask_file defining the area of interest, the output out_file, and the masking -type must be passed in on the command line.

admin Mon, 06/24/2019 - 16:06

Run Poly Type

Run Poly Type

Start by making an output directory for Gen-Vx-Mask and changing directories:

mkdir -p ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask
cd ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask

Since Gen-Vx-Mask performs a simple masking step, no configuration file is needed.

We'll run the Gen-Vx-Mask tool to apply a polyline for the CONUS (Contiguous United States) to our model domain. Run Gen-Vx-Mask on the command line using the following command:

gen_vx_mask \
${METPLUS_DATA}/met_test/data/sample_obs/ST2ml/ST2ml2005080712.Grb_G212 \
${MET_BUILD_BASE}/share/met/poly/CONUS.poly \
CONUS_mask.nc \
-type poly -v 2

Re-run using verbosity level 3 and look closely at the log messages. How many grid points were included in this mask?
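
For reference, the re-run is simply the same command with the verbosity level increased:

gen_vx_mask \
${METPLUS_DATA}/met_test/data/sample_obs/ST2ml/ST2ml2005080712.Grb_G212 \
${MET_BUILD_BASE}/share/met/poly/CONUS.poly \
CONUS_mask.nc \
-type poly -v 3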

Gen-Vx-Mask should run very quickly since the grid is coarse (185x129 points) and there are 243 lat/lon points in the CONUS polyline. The more you increase the grid resolution and number of polyline points, the longer it will take to run. View the NetCDF bitmap file generated by executing the following command:

ncview CONUS_mask.nc &

Notice that the bitmap has a value of 1 inside the CONUS polyline and 0 everywhere else. We'll use the CONUS mask we just defined in the next step.

You could try running plot_data_plane to create a PostScript image of this masking region. Can you remember how?
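
In case you need a hint, here is a sketch. It assumes the mask variable is named CONUS; run ncdump -h CONUS_mask.nc first to confirm the variable name in your output:

plot_data_plane \
CONUS_mask.nc \
CONUS_mask.ps \
'name="CONUS"; level="(*,*)";'
gv CONUS_mask.ps &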

Notice that there are several ways that gen_vx_mask can be run to define regions of interest, some of which will be demonstrated over the next few pages.

admin Mon, 06/24/2019 - 16:16

Run Lat/Lon and Grid Types

Run Lat/Lon and Grid Types
After running each of the Gen-Vx-Mask commands on this and the following pages, users are encouraged to inspect the header of the NetCDF output file by running ncdump -h and to view the masking regions by running ncview or plot_data_plane.

Using a pre-defined NCEP grid from the NCEP ON388 Grid Identification Table, we'll create a latitude band, using the "lat" masking type, for the tropics region. Run Gen-Vx-Mask on the command line using the following command:

gen_vx_mask \
G004 \
G004 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_Tropics.nc \
-type lat -thresh 'ge-30 && le30'

Most of the grids defined in ON388 Table B can be referenced in MET as "GNNN" where NNN is the 3-digit grid number. Run "ncdump -h" on the output file and notice that the mask variable is named "lat_mask":

ncdump -h ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_Tropics.nc
float lat_mask(lat, lon) ;
        lat_mask:long_name = "lat_mask masking region" ;
        lat_mask:_FillValue = -9999.f ;
        lat_mask:mask_type = "lat>=-30&&<=30" ;

Use the "-name" command line option, as shown below, to override this default.

To compute the intersection or union of two masks, use the output from the first run as input to the second. Run Gen-Vx-Mask on the command line using the following command, which uses the "lon" masking type:

gen_vx_mask \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_Tropics.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_Tropics.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_EastPac.nc \
-type lon -thresh 'le-70 && ge-130' -intersection -name EastPac

Compare this intersection output to the union of the two masks, computed below:

gen_vx_mask \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_Tropics.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_Tropics.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_EastPac.nc \
-type lon -thresh 'le-70 && ge-130' -union

Now we'll use the "grid" masking type to select a subgrid from a larger grid. Run Gen-Vx-Mask on the command line using the following command:

gen_vx_mask \
G004 \
${METPLUS_DATA}/met_test/data/sample_obs/ST2ml/ST2ml2005080712.Grb_G212 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/G004_SUBGRID.nc \
-type grid

On the next page, we'll demonstrate using the "data" and "solar_alt" masking types.

johnhg Fri, 12/10/2021 - 15:04

Run Data and Solar Types

Run Data and Solar Types

On this page, we provide examples for creating a land/sea mask and for using solar altitude to show where it is daytime on a global grid.

Run Gen-Vx-Mask on the command line using the following command:

gen_vx_mask \
${METPLUS_DATA}/model_applications/medium_range/grid_to_obs/gfs/pgbf00.gfs.2017060100 \
${METPLUS_DATA}/model_applications/medium_range/grid_to_obs/gfs/pgbf00.gfs.2017060100 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/GFS_LAND.nc \
-type data -mask_field 'name="LAND"; level="L0";' -thresh eq1 -name LAND

A corresponding water mask could be created using two different methods. One way is to simply rerun the land mask command above using the -complement option:

gen_vx_mask \
${METPLUS_DATA}/model_applications/medium_range/grid_to_obs/gfs/pgbf00.gfs.2017060100 \
${METPLUS_DATA}/model_applications/medium_range/grid_to_obs/gfs/pgbf00.gfs.2017060100 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/GFS_LAND_COMP.nc \
-type data -mask_field 'name="LAND"; level="L0";' -thresh eq1 -name LAND_COMP -complement

Another way is to specify a different threshold (eq0 instead of eq1) rather than taking the complement:

gen_vx_mask \
${METPLUS_DATA}/model_applications/medium_range/grid_to_obs/gfs/pgbf00.gfs.2017060100 \
${METPLUS_DATA}/model_applications/medium_range/grid_to_obs/gfs/pgbf00.gfs.2017060100 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/GFS_WATER.nc \
-type data -mask_field 'name="LAND"; level="L0";' -thresh eq0 -name WATER

Now we'll run Gen-Vx-Mask to show where it's daytime on a global grid:

gen_vx_mask \
G004 \
20170601_183000 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/SOLAR_DAY.nc \
-type solar_alt -thresh ge0 -name DAY

Next, combine the LAND output from the previous run with the solar altitude mask for daytime:

gen_vx_mask \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/GFS_LAND.nc \
20170601_183000 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/DAYLIGHT_LAND.nc \
-type solar_alt -thresh ge0 -name DAYLIGHT_LAND -intersection

This creates a mask for grid points on land experiencing daylight at 18:30 UTC on 20170601.

On the next page, we'll demonstrate using the "track" and "circle" masking types.

 

jpresto Fri, 12/10/2021 - 15:13

Run Track and Circle Types

Run Track and Circle Types

On this page, we provide examples for using the "track" masking type using BEST track hurricane data and "circle" masking type.

Start by extracting the lat/lon locations for Hurricane Dorian:

echo "DORIAN" > ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian.poly
cat ${METPLUS_DATA}/model_applications/medium_range/dorian_data/track_data/bal052019.dat | awk '{printf "%d %d\n", $7, $8}' | awk '{printf "%.1f %.1f\n", $1/(10), $2/(-10)}' | uniq >> ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian.poly

Inspect the dorian.poly file that begins with "DORIAN" and has the lat/lon storm locations on subsequent lines.

cat ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian.poly

Compute a mask for all grid points within 200 km of the Hurricane Dorian track. Run Gen-Vx-Mask using the following command:

gen_vx_mask \
G003 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian.poly \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/Dorian_Track.nc \
-type track -name Dorian_Track -thresh le200

Extract the first track point for Dorian and use the circle masking option to select all grid points within 500 km:

head -2 ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian.poly > ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian_first.poly

Run Gen-Vx-Mask on the command line using the following command:

gen_vx_mask \
G003 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian_first.poly \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/Dorian_Origin.nc \
-type circle -name Dorian_Track -thresh le500

The "circle" mask type draws a circle around each of the lat/lon points in the input poly file. The "circle" option may be useful when verifying within a certain radius of known radar locations. If you rerun without the "-thresh" command line option, Gen-Vx-Mask still runs but prints a warning message:

WARNING: apply_circle_mask() -> since "-thresh" was not used to specify a threshold in kilometers for circle masking, the minimum distance to the points will be written.

Can you figure out what this output file now contains? Run ncview or plot_data_plane to visualize it.
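
For reference, here is a sketch of that re-run without the -thresh option, followed by ncview. The output file name Dorian_Origin_Dist.nc is hypothetical and is used here only so the original mask is not overwritten:

gen_vx_mask \
G003 \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/dorian_first.poly \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/Dorian_Origin_Dist.nc \
-type circle -name Dorian_Origin_Dist
ncview ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/Dorian_Origin_Dist.nc &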

The "-thresh" option can also be omitted from the "track", "data", "solar_alt", and "solar_azi" masking types. If no threshold is specified, a warning message is printed, and the raw, un-thresholded values are written to the output NetCDF file.

Gen-Vx-Mask also supports the "box" and "solar_azi" masking types, which are not covered in these exercises. Interested users can download additional Natural Earth shapefiles to further explore the "-type shape" option.

Next, we'll take a look at using the "shape" masking type with Gen-Vx-Mask.

jpresto Fri, 12/10/2021 - 15:14

Run Shape Type

Run Shape Type

We will demonstrate the Gen-Vx-Mask "shape" masking type using freely available shapefiles from Natural Earth.  While multiple resolutions are provided, we'll use the coarsest version for this example since it's the smallest in size.

Download the Natural Earth administrative shapefiles for country boundaries.

cd ${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask
mkdir ne_shapefiles; cd ne_shapefiles
wget https://www.naturalearthdata.com/http//www.naturalearthdata.com/download/110m/cultural/ne_110m_admin_0_countries.zip
unzip ne_110m_admin_0_countries.zip

Run gen_vx_mask to define the mask for the USA (index 4).

gen_vx_mask G004 ne_110m_admin_0_countries.shp G004_USA_mask.nc -name USA -type shape -shapeno 4 -v 3

Re-run to compute the union with Canada (index 3).

gen_vx_mask G004_USA_mask.nc ne_110m_admin_0_countries.shp G004_USA_Canada_mask.nc -name USA_Canada -type shape -shapeno 3 -union -v 3

Re-run to add Mexico (index 27).

gen_vx_mask G004_USA_Canada_mask.nc ne_110m_admin_0_countries.shp G004_North_America_mask.nc -name North_America -type shape -shapeno 27 -union -v 3

The result is good but not perfect. There are a few missing grid points along the boundary. But this demonstrates how the tool works. Consider re-running all three commands again, but this time use the "-value" command line option to define the mask value to be written. Just make "-value" match the "-shapeno" option (e.g., -value 4 for USA, -value 3 for Canada, and -value 27 for Mexico). What impact does that have on the result?
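
For example, here is a sketch of the first command with the -value option added; modify the other two commands in the same way:

gen_vx_mask G004 ne_110m_admin_0_countries.shp G004_USA_mask.nc -name USA -type shape -shapeno 4 -value 4 -v 3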

Next, we'll take a look at the functionality that Grid-Stat offers.

jpresto Fri, 12/17/2021 - 12:59

MET Tool: Grid-Stat

MET Tool: Grid-Stat
IMPORTANT NOTE: If you are returning to the tutorial, you must source the tutorial setup script before running the following instructions. If you are unsure if you have done this step, please navigate to the Verify Environment is Set Correctly page.

Grid-Stat Functionality

The Grid-Stat tool provides verification statistics for a matched forecast and observation grid. If the forecast and observation grids do not match, the regrid section of the configuration file controls how the data can be interpolated to a common grid. All of the forecast gridpoints in each spatial verification region of interest are matched to observation gridpoints. The matched gridpoints within each verification region are used to compute the verification statistics.

The output statistics generated by Grid-Stat include continuous partial sums and statistics, vector partial sums and statistics, categorical tables and statistics, probabilistic tables and statistics, neighborhood statistics, and gradient statistics. The computation and output of these various statistics types is controlled by the output_flag in the configuration file.

Grid-Stat Usage

View the usage statement for Grid-Stat by simply typing the following:

grid_stat
Usage: grid_stat  
  fcst_file Input gridded forecast file containing the field(s) to be verified.
  obs_file Input gridded observation file containing the verifying field(s).
  config_file GridStatConfig file containing the desired configuration settings.
  [-outdir path] Overrides the default output directory (optional).
  [-log file] Outputs log messages to the specified file (optional).
  [-v level] Level of logging (optional).
  [-compress level] NetCDF compression level (optional).

The forecast and observation fields must be on the same grid for verification. You can use copygb to regrid GRIB1 files, wgrib2 to regrid GRIB2 files, or the automated regridding within the regrid section of the MET config files.

At a minimum, the input gridded fcst_file, the input gridded obs_file, and the configuration config_file must be passed in on the command line.

admin Mon, 06/24/2019 - 16:06

Configure

Configure

Start by making an output directory for Grid-Stat and changing directories:

mkdir -p ${METPLUS_TUTORIAL_DIR}/output/met_output/grid_stat
cd ${METPLUS_TUTORIAL_DIR}/output/met_output/grid_stat

The behavior of Grid-Stat is controlled by the contents of the configuration file passed to it on the command line. The default Grid-Stat configuration file may be found in ${MET_BUILD_BASE}/share/met/config/GridStatConfig_default. Prior to modifying the configuration file, users are advised to make a copy of the default:

cp ${MET_BUILD_BASE}/share/met/config/GridStatConfig_default GridStatConfig_tutorial

Open up the GridStatConfig_tutorial file for editing with your preferred text editor.

vi GridStatConfig_tutorial

The configurable items for Grid-Stat are used to specify how the verification is to be performed. The configurable items include specifications for the following:

  • The forecast fields to be verified at the specified vertical level or accumulation interval
  • The threshold values to be applied
  • The areas over which to aggregate statistics - as predefined grids, configurable lat/lon polylines, or gridded data fields
  • The confidence interval methods to be used
  • The smoothing methods to be applied (as opposed to interpolation methods)
  • The types of verification methods to be used

You may find a complete description of the configurable items in the grid_stat configuration file section of the MET User's Guide. Please take some time to review them.

For this tutorial, we'll configure Grid-Stat to verify the 12-hour accumulated precipitation output of PCP-Combine. We'll be using Grid-Stat to verify a single field using NetCDF input for both the forecast and observation files. However, Grid-Stat may in general be used to verify an arbitrary number of fields. Edit the GridStatConfig_tutorial file as follows:

  • Set:
    fcst = {
       field = [
          {
            name       = "APCP_12";
            level      = [ "(*,*)" ];
            cat_thresh = [ >0.0, >=5.0, >=10.0 ];
          }
       ];
    }
    obs = fcst;

    To verify the field of precipitation accumulated over 12 hours using the 3 thresholds specified.

  • Set:
    mask = {
       grid = [];
       poly = [ "../gen_vx_mask/CONUS_mask.nc",
                "MET_BASE/poly/NWC.poly",
                "MET_BASE/poly/SWC.poly",
                "MET_BASE/poly/GRB.poly",
                "MET_BASE/poly/SWD.poly",
                "MET_BASE/poly/NMT.poly",
                "MET_BASE/poly/SMT.poly",
                "MET_BASE/poly/NPL.poly",
                "MET_BASE/poly/SPL.poly",
                "MET_BASE/poly/MDW.poly",
                "MET_BASE/poly/LMV.poly",
                "MET_BASE/poly/GMC.poly",
                "MET_BASE/poly/APL.poly",
                "MET_BASE/poly/NEC.poly",
                "MET_BASE/poly/SEC.poly" ];
    }

    To accumulate statistics over the Continental United States (CONUS) and the 14 NCEP verification regions in the United States defined by the polylines specified. To see a plot of these regions, execute the following command:

    gv ${MET_BUILD_BASE}/share/met/poly/ncep_vx_regions.pdf &

  • In the boot dictionary, set:
    n_rep = 500;

    To turn on the computation of bootstrap confidence intervals using 500 replicates.

  • In the nbrhd dictionary, set:
    width          = [ 3, 5 ];
    cov_thresh = [ >=0.5, >=0.75 ];

    To define two neighborhood sizes and two fractional coverage field thresholds.

  • Set:
    output_flag = {
       fho    = NONE;
       ctc    = BOTH;
       cts    = BOTH;
       mctc   = NONE;
       mcts   = NONE;
       cnt    = BOTH;
       sl1l2  = BOTH;
       sal1l2 = NONE;
       vl1l2  = NONE;
       val1l2 = NONE;
       vcnt   = NONE;
       pct    = NONE;
       pstd   = NONE;
       pjc    = NONE;
       prc    = NONE;
       eclv   = NONE;
       nbrctc = BOTH;
       nbrcts = BOTH;
       nbrcnt = BOTH;
       grad   = BOTH;
       dmap   = NONE;
       seeps  = NONE;
    }

    To compute contingency table counts (CTC), contingency table statistics (CTS), continuous statistics (CNT), scalar partial sums (SL1L2), neighborhood contingency table counts (NBRCTC), neighborhood contingency table statistics (NBRCTS), and neighborhood continuous statistics (NBRCNT).

admin Mon, 06/24/2019 - 16:17

Run

Run

Next, run Grid-Stat on the command line using the following command:

grid_stat \
${METPLUS_TUTORIAL_DIR}/output/met_output/pcp_combine/sample_fcst_12L_2005080712V_12A.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/pcp_combine/sample_obs_12L_2005080712V_12A.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/grid_stat/GridStatConfig_tutorial \
-outdir ${METPLUS_TUTORIAL_DIR}/output/met_output/grid_stat \
-v 2

Grid-Stat is now performing the verification tasks we requested in the configuration file. It should take a minute or two to run. The status messages written to the screen indicate progress.

In this example, Grid-Stat performs several verification tasks in evaluating the 12-hour accumulated precipitation field:

  • For continuous statistics and partial sums (CNT and SL1L2), 15 output lines each:
    (1 field * 15 masking regions)
  • For contingency table counts and statistics (CTC and CTS), 45 output lines each:
    (1 field * 3 raw thresholds * 15 masking regions)
  • For neighborhood methods (NBRCNT, NBRCTC, and NBRCTS), 90 output lines each:
    (1 field * 3 raw thresholds * 2 neighborhood sizes * 15 masking regions)

To greatly increase the runtime performance of Grid-Stat, you could disable the computation of bootstrap confidence intervals in the configuration file. Edit the GridStatConfig_tutorial file as follows:

vi GridStatConfig_tutorial
  • In the boot dictionary, set:
    n_rep = 0;

    To disable the computation of bootstrap confidence intervals.

Now, try rerunning the Grid-Stat command listed above and notice how much faster it runs. While bootstrap confidence intervals are nice to have, they take a long time to compute, especially for gridded data.
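
To quantify the difference, you could prefix the command with the shell's time utility (a sketch; the rest of the command is unchanged from above):

time grid_stat \
${METPLUS_TUTORIAL_DIR}/output/met_output/pcp_combine/sample_fcst_12L_2005080712V_12A.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/pcp_combine/sample_obs_12L_2005080712V_12A.nc \
${METPLUS_TUTORIAL_DIR}/output/met_output/grid_stat/GridStatConfig_tutorial \
-outdir ${METPLUS_TUTORIAL_DIR}/output/met_output/grid_stat \
-v 2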

admin Mon, 06/24/2019 - 16:17

Output

Output

The output of Grid-Stat is one or more ASCII files containing statistics summarizing the verification performed and a NetCDF file containing difference fields. In this example, the output is written to the current directory, as we requested on the command line. It should now contain 10 Grid-Stat output files beginning with the grid_stat_ prefix, one each for the CTC, CTS, CNT, SL1L2, GRAD, NBRCTC, NBRCTS, and NBRCNT ASCII files, a STAT file, and a NetCDF matched pairs file.

The format of the CTC, CTS, CNT, and SL1L2 ASCII files will be covered for the Point-Stat tool. The neighborhood method and gradient output are unique to the Grid-Stat tool.

  • Rather than comparing forecast/observation values at individual grid points, the neighborhood method compares areas of forecast values to areas of observation values. At each grid box, a fractional coverage value is computed for each field as the number of grid points within the neighborhood (centered on the current grid point) that exceed the specified raw threshold value. The forecast/observation fractional coverage values are then compared rather than the raw values themselves.
  • Gradient statistics are computed on the forecast and observation gradients in the X and Y directions.

Since the lines of data in these ASCII files are so long, we strongly recommend configuring your text editor to NOT use dynamic word wrapping. The files will be much easier to read that way.

Execute the following command to view the NetCDF output of Grid-Stat:

ncview grid_stat_120000L_20050807_120000V_pairs.nc &

Click through the 2d vars variable names in the ncview window to see plots of the forecast, observation, and difference fields for each masking region. If you see a warning message about the min/max values being zero, just click OK.

Now dump the NetCDF header:

ncdump -h grid_stat_120000L_20050807_120000V_pairs.nc

View the NetCDF header to see how the variable names are defined.

Notice how *MANY* variables there are: separate output for each of the masking regions defined. Try editing the config file again by setting apply_mask = FALSE; and gradient = TRUE; in the nc_pairs_flag dictionary. Re-run Grid-Stat and inspect the output NetCDF file. What effect did these changes have?
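
For reference, here is a sketch of how those two entries would look inside the nc_pairs_flag dictionary; leave the other entries in that dictionary unchanged:

nc_pairs_flag = {
   // ... keep the other entries as they are ...
   apply_mask = FALSE;
   gradient   = TRUE;
}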

admin Mon, 06/24/2019 - 16:18

METplus Motivation

METplus Motivation

We have now successfully run the PCP-Combine and Grid-Stat tools to verify 12-hourly accumulated precipitation for a single output time. We did the following steps:

  • Identified our forecast and observation datasets.
  • Constructed PCP-Combine commands to put them into a common accumulation interval.
  • Configured and ran Grid-Stat to compute our desired verification statistics.

Now that we've defined the logic for a single run, the next step would be writing a script to automate these steps for many model initializations and forecast lead times. Rather than every MET user rewriting the same type of scripts, use METplus to automate these steps in a use case!

admin Mon, 06/24/2019 - 16:19

METplus Use Case: GridStat

METplus Use Case: GridStat
IMPORTANT NOTE: If you are returning to the tutorial, you must source the tutorial setup script before running the following instructions. If you are unsure if you have done this step, please navigate to the Verify Environment is Set Correctly page.

The GridStat use case utilizes the MET Grid-Stat tool.

Optional: Refer to the MET Users Guide for a description of the MET tools used in this use case.
Optional: Refer to the METplus Config Glossary section of the METplus Users Guide for a reference to METplus variables used in this use case.

Change to the ${METPLUS_TUTORIAL_DIR}

cd ${METPLUS_TUTORIAL_DIR}
  1. Review the use case configuration file: GridStat.conf

Open the file and look at all of the configuration variables that are defined. This use case shows a simple example of running Grid-Stat to compare 3-hour accumulated precipitation forecasts from WRF against Stage II quantitative precipitation estimates.

less ${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/GridStat/GridStat.conf

In METplus configuration files, the forecast and observation variables are referred to individually, including reference to both the NAMES and LEVELS.

 

FCST_VAR1_NAME = APCP
FCST_VAR1_LEVELS = A03

OBS_VAR1_NAME = APCP_03
OBS_VAR1_LEVELS = "(*,*)"

These relate to the following fields in the MET configuration file

fcst = {
  field = [
    {
     name = "APCP";
     level = [ "A03" ];
    }
  ];
}
obs = {
  field = [
    {
     name = "APCP_03";
     level = [ "(*,*)" ];
    }
  ];
}

Paths in GridStat.conf may reference other config options defined in a different configuration file. For example:

FCST_GRID_STAT_INPUT_DIR = {INPUT_BASE}/met_test/data/sample_fcst

where INPUT_BASE is set in the tutorial.conf configuration file. METplus config variables can reference other config variables, even if they are defined in a config file that is read afterwards.

 

  2. Run the use case:
run_metplus.py \
${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/GridStat/GridStat.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.OUTPUT_BASE=${METPLUS_TUTORIAL_DIR}/output/GridStat

METplus is finished running when control returns to your terminal console and you see the following text:

INFO: METplus has successfully finished running for user your_name.

 

  3. Review the output files:

You should have output files in the following directories:

ls ${METPLUS_TUTORIAL_DIR}/output/GridStat/met_tool_wrapper/GridStat/GridStat/2005080700
  • grid_stat_WRF_APCP_vs_MC_PCP_APCP_03_120000L_20050807_120000V_eclv.txt

  • grid_stat_WRF_APCP_vs_MC_PCP_APCP_03_120000L_20050807_120000V.stat

  • grid_stat_WRF_APCP_vs_MC_PCP_APCP_03_120000L_20050807_120000V_grad.txt

Take a look at some of the files to see what was generated.  Beyond the .stat file, the Economic Cost/Loss Value (eclv) and Gradient (grad) line types were also written to separate .txt files.  If you inspect ${METPLUS_BUILD_BASE}/parm/met_config/GridStatConfig_wrapped, you will notice that the ctc and cts line type settings are "STAT" while eclv and grad line types are set to "BOTH".

less ${METPLUS_TUTORIAL_DIR}/output/GridStat/met_tool_wrapper/GridStat/GridStat/2005080700/grid_stat_WRF_APCP_vs_MC_PCP_APCP_03_120000L_20050807_120000V.stat
  4. Review the log output:

Log files for this run are found in ${METPLUS_TUTORIAL_DIR}/output/GridStat/logs. The filename contains a timestamp of the current day.

ls -1 ${METPLUS_TUTORIAL_DIR}/output/GridStat/logs/metplus.log.*
  5. Review the Final Configuration File

The final configuration files are found in ${METPLUS_TUTORIAL_DIR}/output/GridStat. Like the log output, the final configuration file will contain a timestamp of the time that the METplus command was executed. These configuration files will contain all of the configuration variables used in the run.

ls -1 ${METPLUS_TUTORIAL_DIR}/output/GridStat/metplus_final.conf.*
admin Mon, 06/24/2019 - 16:07

End of Session 1 and Additional Exercises

End of Session 1 and Additional Exercises

End of Session 1

Congratulations! You have completed Session 1!

If you have extra time, you may want to try these additional MET exercises:

  • Run Gen-Vx-Mask to create a mask for the Eastern or Western United States using the polyline files in the data/poly directory, then re-run Grid-Stat using the output of Gen-Vx-Mask (see the sketch after this list).
  • Run Gen-Vx-Mask to exercise all the other masking types available.
  • Reconfigure and re-run Grid-Stat with the distance map (dmap) dictionary defined, the dmap output line type enabled, and the distance_map flag set to TRUE in the nc_pairs_flag dictionary.
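
For the first exercise, here is a sketch of a possible Gen-Vx-Mask command. It assumes EAST.poly is available in your installation's share/met/poly directory, as it is in standard MET releases; substitute WEST.poly for the Western United States:

gen_vx_mask \
${METPLUS_DATA}/met_test/data/sample_obs/ST2ml/ST2ml2005080712.Grb_G212 \
${MET_BUILD_BASE}/share/met/poly/EAST.poly \
${METPLUS_TUTORIAL_DIR}/output/met_output/gen_vx_mask/EAST_mask.nc \
-type poly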

If you have extra time, you may want to try these additional METplus exercises. The answers are found on the next page.

Instructions:

  1. Explore the types of model_applications available
ls ${METPLUS_BUILD_BASE}/parm/use_cases/model_applications/*

You will see many subdirectories with multiple files ending in .conf. The naming convention for these files is intended to provide enough metadata to allow the user to identify an example to start from. The convention is [MET-Statistical-Tools]_fcstType_obsType_climatologyType_GeneralDescriptors_FileFormats.

e.g. ${METPLUS_BUILD_BASE}/parm/use_cases/model_applications/medium_range/GridStat_fcstGFS_obsGFS_climoNCEP_MultiField.conf
is running Grid-Stat on GFS forecasts and GFS analysis files and using NCEP climatology to compute statistics for multiple fields.
  2. Review the configuration file.

Note how the BOTH prefix is used to configure the forecast and observation/analysis fields the same way. Also note that there are 4 variables specified at varying levels, which will result in the evaluation of 10 unique field/level combinations.

less ${METPLUS_BUILD_BASE}/parm/use_cases/model_applications/medium_range/GridStat_fcstGFS_obsGFS_climoNCEP_MultiField.conf
BOTH_VAR1_NAME = TMP
BOTH_VAR1_LEVELS = P850, P500, P250
BOTH_VAR2_NAME = UGRD
BOTH_VAR2_LEVELS = P850, P500, P250
BOTH_VAR3_NAME = VGRD
BOTH_VAR3_LEVELS = P850, P500, P250
BOTH_VAR4_NAME = PRMSL
BOTH_VAR4_LEVELS = Z0
  3. Run run_metplus.py on this use case
run_metplus.py \
${METPLUS_BUILD_BASE}/parm/use_cases/model_applications/medium_range/GridStat_fcstGFS_obsGFS_climoNCEP_MultiField.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.OUTPUT_BASE=${METPLUS_TUTORIAL_DIR}/output/GridStat_climo
  4. Inspect the output. Note metplus_final.conf indicates the subdirectories under ${METPLUS_TUTORIAL_DIR} where the data were written out.
ls ${METPLUS_TUTORIAL_DIR}/output/GridStat_climo

Instructions: Modify the METplus configuration files to add relative humidity (RH) at pressure levels 500 and 250 (P500 and P250) to the output.

  1. Copy the GridStat_fcstGFS_obsGFS_climoNCEP_MultiField.conf configuration file and rename it to GridStat_add_rh.conf for this exercise.
cp ${METPLUS_BUILD_BASE}/parm/use_cases/model_applications/medium_range/GridStat_fcstGFS_obsGFS_climoNCEP_MultiField.conf ${METPLUS_TUTORIAL_DIR}/user_config/GridStat_add_rh.conf

 

  2. Open GridStat_add_rh.conf with an editor and add the extra information.
vi ${METPLUS_TUTORIAL_DIR}/user_config/GridStat_add_rh.conf

Hint: The variables that you need to add must go under the [config] section.

Hint: Since the RH data has no climatology, you must also add an additional output line type.  Both of the specified output line types (SAL1L2 and VAL1L2) require climatology.  An example of an additional output line type is to add the following:  GRID_STAT_OUTPUT_FLAG_SL1L2 = STAT
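
If you want to check your approach before looking at the solution, here is one possible sketch of the lines to add to the existing [config] section. It keeps the four BOTH_VAR entries shown above and follows the same naming pattern:

BOTH_VAR5_NAME = RH
BOTH_VAR5_LEVELS = P500, P250
GRID_STAT_OUTPUT_FLAG_SL1L2 = STAT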

  3. Rerun METplus passing in your new custom config file for this exercise
run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/GridStat_add_rh.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf \
config.OUTPUT_BASE=${METPLUS_TUTORIAL_DIR}/output/exercises/add_rh
You should see the relative humidity field appear in the log output from grid_stat.
ls -1 ${METPLUS_TUTORIAL_DIR}/output/exercises/add_rh/logs/metplus.log.*

Look for:

DEBUG 2: Processing RH/P500 versus RH/P500, for smoothing method NEAREST(1), over region FULL, using 10512 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P500 versus RH/P500, for smoothing method NEAREST(1), over region NHX, using 3600 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P500 versus RH/P500, for smoothing method NEAREST(1), over region SHX, using 3600 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P500 versus RH/P500, for smoothing method NEAREST(1), over region TRO, using 2448 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P500 versus RH/P500, for smoothing method NEAREST(1), over region PNA, using 1311 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 1: Regridding field RH/P250 to the verification grid.
DEBUG 1: Regridding field RH/P250 to the verification grid.
DEBUG 2:
DEBUG 2: --------------------------------------------------------------------------------
DEBUG 2:
DEBUG 2: Processing RH/P250 versus RH/P250, for smoothing method NEAREST(1), over region FULL, using 10512 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P250 versus RH/P250, for smoothing method NEAREST(1), over region NHX, using 3600 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P250 versus RH/P250, for smoothing method NEAREST(1), over region SHX, using 3600 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P250 versus RH/P250, for smoothing method NEAREST(1), over region TRO, using 2448 pairs.
DEBUG 2: Computing Scalar Partial Sums.
DEBUG 2: Processing RH/P250 versus RH/P250, for smoothing method NEAREST(1), over region PNA, using 1311 pairs.
DEBUG 2: Computing Scalar Partial Sums.
Navigate to the next page for the solution to see if you were right!

Instructions: Modify the METplus configuration files to change the logging settings to see what is available.

  1. Copy the tutorial.conf file to create a new custom configuration file and name it tutorial_logging.conf for this exercise.

cp ${METPLUS_TUTORIAL_DIR}/tutorial.conf ${METPLUS_TUTORIAL_DIR}/user_config/tutorial_logging.conf

In tutorial_logging.conf, set OUTPUT_BASE to a new location so you can keep this output separate from your other runs.

[config]
OUTPUT_BASE = {ENV[METPLUS_TUTORIAL_DIR]}/output/exercises/log_boost

The sections at the bottom of this page describe different logging configurations you can change. Experiment with these settings and see how they affect the log output. You can refer to ${METPLUS_BUILD_BASE}/parm/metplus_config/defaults.conf to see all of the configuration variables that affect logging.
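For example, one quick way to list the logging-related variables defined in that file (illustrative; the search pattern assumes the variable names begin with LOG_):

grep "^LOG_" ${METPLUS_BUILD_BASE}/parm/metplus_config/defaults.conf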

 

  2. Rerun the GridStat.conf use case, passing in your new custom configuration file.

run_metplus.py \
${METPLUS_BUILD_BASE}/parm/use_cases/met_tool_wrapper/GridStat/GridStat.conf \
${METPLUS_TUTORIAL_DIR}/user_config/tutorial_logging.conf

 

  3. Review the log output to see how the settings you changed affect what is logged.

e.g. less ${METPLUS_TUTORIAL_DIR}/output/exercises/log_boost/logs/metplus.log.YYYYMMDDHHMMSS

 

For example, override LOG_METPLUS and add more text to the filename (or even use another METplus config variable).
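A minimal sketch of such an override is shown below; the exact filename is up to you, and the {LOG_DIR} and {LOG_TIMESTAMP_TEMPLATE} references are assumed here to match the variables used by the default LOG_METPLUS setting in defaults.conf:

[config]
LOG_METPLUS = {LOG_DIR}/metplus.my_logging_exercise.log.{LOG_TIMESTAMP_TEMPLATE}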

Log Configurations

Separate METplus Logs from MET Logs

Setting [config] LOG_MET_OUTPUT_TO_METPLUS to no will create a separate log file for each MET application.

[config]
LOG_MET_OUTPUT_TO_METPLUS = no

For this use case, two log files will be created: metplus.log.YYYYMMDDHHMMSS and grid_stat.log.YYYYMMDDHHMMSS. If you don't see two files, make sure you put the LOG_MET_OUTPUT_TO_METPLUS setting AFTER a line with [config] on it.
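To confirm, rerun the use case with this setting and list the log directory (assuming the log_boost OUTPUT_BASE set earlier in this exercise); you should see both files:

ls -1 ${METPLUS_TUTORIAL_DIR}/output/exercises/log_boost/logs/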

Increase Log Output Level for MET Applications

Setting [config] LOG_MET_VERBOSITY to a number between 1 and 10 will change the logging level for the MET applications logs. Increasing the number results in more log output. The default value is 2.

[config]
LOG_MET_VERBOSITY = 3

You can also set [config] LOG_GRID_STAT_VERBOSITY to change the logging level for the GridStat log only. If set, the wrapper-specific value takes precedence over the generic LOG_MET_VERBOSITY value.

[config]
LOG_GRID_STAT_VERBOSITY = 5
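For example, setting both values (a sketch) would run GridStat at verbosity 5 while any other MET applications in the use case would use verbosity 3:

[config]
LOG_MET_VERBOSITY = 3
LOG_GRID_STAT_VERBOSITY = 5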

Increase Log Output Level for METplus Wrappers

Setting [config] LOG_LEVEL will change the logging level for the METplus logs. Valid values are NOTSET, DEBUG, INFO, WARNING, ERROR, CRITICAL. Logs will contain all information of the desired logging level and higher. The default value is INFO.

[config]
LOG_LEVEL = DEBUG

When an error occurs, boosting the log level to DEBUG will provide you with more information to help resolve the issue.
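One way to see the difference is to count DEBUG messages in the METplus log before and after making this change (a convenience check; the path assumes the log_boost OUTPUT_BASE used in this exercise):

grep -c "DEBUG" ${METPLUS_TUTORIAL_DIR}/output/exercises/log_boost/logs/metplus.log.*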

Change Format of Time in Logfile Names

Setting LOG_TIMESTAMP_TEMPLATE to %Y%m%d will remove hours, minutes, and seconds from the log file time. The default value is %Y%m%d%H%M%S, which results in the format YYYYMMDDHHMMSS.

[config]
LOG_TIMESTAMP_TEMPLATE = %Y%m%d

For this use case, the log files will have the format: metplus.log.YYYYMMDD

Use Time of Data Instead of Current Time

Setting LOG_TIMESTAMP_USE_DATATIME to yes will use the first time of your data instead of the current time.

[config]
LOG_TIMESTAMP_USE_DATATIME = yes

For this use case, INIT_BEG = 2005080700, so the log files will have the format metplus.log.20050807 instead of using today's date (assuming LOG_TIMESTAMP_TEMPLATE = %Y%m%d).
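Combining the two settings above (a sketch) names the log files using the date of the data with a date-only format, e.g. metplus.log.20050807 for this use case:

[config]
LOG_TIMESTAMP_TEMPLATE = %Y%m%d
LOG_TIMESTAMP_USE_DATATIME = yes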

Instructions: Review the list of file paths below and configure an Example wrapper use case to loop over all of the files.

File Paths

  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f000
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f001
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f002
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f003
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f000
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f001
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f002
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f003
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f000
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f001
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f002
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f003
  1. Copy the Example_timing.conf file you created in "Modify Timing Control in Example.conf" and modify it with the appropriate settings.

cp ${METPLUS_TUTORIAL_DIR}/user_config/Example_timing.conf \
${METPLUS_TUTORIAL_DIR}/user_config/Example_exercise1.4.conf
vi ${METPLUS_TUTORIAL_DIR}/user_config/Example_exercise1.4.conf
Hints:

  • The use case should loop by initialization time (LOOP_BY = INIT).
  • Filename template tags, e.g. {init?fmt=%Y%m%d}, should not go in EXAMPLE_INPUT_DIR. Alternatively, the entire path can be set in EXAMPLE_INPUT_TEMPLATE and EXAMPLE_INPUT_DIR can be omitted.
  • Identify the text that varies between file paths and replace it with filename template tags.
  • The forecast lead hour in the file names contains 3 digits. Use %3H as the format (fmt) to force at least 3 digits (see the expansion example after these hints).
  • Refer to another use case configuration file or the METplus User's Guide if needed.
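For reference, here is how the relevant filename template tags would expand for one sample time (illustrative only, not the full answer), using an initialization time of 2022011206 and a forecast lead of 2 hours:

{init?fmt=%Y%m%d} -> 20220112
{init?fmt=%H} -> 06
{lead?fmt=%3H} -> 002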
  2. Run the use case and verify that the results are as expected.

run_metplus.py \
${METPLUS_TUTORIAL_DIR}/user_config/Example_exercise1.4.conf \
${METPLUS_TUTORIAL_DIR}/tutorial.conf

Navigate to the next page for the solution to see if you were right!


Answers to Exercises from Session 1

These are the answers to the exercises from the previous page. Feel free to ask a MET representative if you have any questions!

Instructions: Modify the METplus configuration files to add relative humidity (RH) at pressure levels 500 and 250 (P500 and P250) to the output.

Answer: In the GridStat_add_rh.conf configuration file, add the following variables to the [config] section.

BOTH_VAR5_NAME = RH
BOTH_VAR5_LEVELS = P500, P250
GRID_STAT_OUTPUT_FLAG_SL1L2 = STAT
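
After rerunning the use case with these additions, you can verify that SL1L2 lines were written by searching the GridStat output files (a convenience check; find is used because the exact output subdirectory depends on the use case configuration):

find ${METPLUS_TUTORIAL_DIR}/output/exercises/add_rh -name "*.stat" -exec grep -l "SL1L2" {} \;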

Instructions: Review the list of file paths and configure a Example wrapper use case to loop over all of the files.

File Paths (for reference)

  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f000
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f001
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f002
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/00/atmos/gfs.t00z.pgrb2.0p25.f003
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f000
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f001
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f002
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/06/atmos/gfs.t06z.pgrb2.0p25.f003
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f000
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f001
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f002
  • /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.20220112/12/atmos/gfs.t12z.pgrb2.0p25.f003

Answer: The configuration file you created should look something like this:

[config]

PROCESS_LIST = Example

LOOP_BY = INIT

INIT_TIME_FMT = %Y%m%d%H
INIT_BEG = 2022011200
INIT_END = 2022011212
INIT_INCREMENT = 6H

LEAD_SEQ = 0,1,2,3

EXAMPLE_INPUT_DIR = /scratch1/NCEPDEV/rstprod/com/gfs/prod
EXAMPLE_INPUT_TEMPLATE = gfs.{init?fmt=%Y%m%d}/{init?fmt=%H}/atmos/gfs.t{init?fmt=%H}z.pgrb2.0p25.f{lead?fmt=%3H}

The forecast lead sequence could use the begin_end_incr syntax to generate a list of values:
LEAD_SEQ = begin_end_incr(0,3,1)
The full path could be specified in the template variable instead of using the directory variable:
EXAMPLE_INPUT_TEMPLATE = /scratch1/NCEPDEV/rstprod/com/gfs/prod/gfs.{init?fmt=%Y%m%d}/{init?fmt=%H}/atmos/gfs.t{init?fmt=%H}z.pgrb2.0p25.f{lead?fmt=%3H}