The Next Generation Global Prediction System

Summer 2018

The Next Generation Global Prediction System (NGGPS) multi-year Strategic Implementation Plan (SIP) connects federal and academic experts in the numerical weather prediction community to support the end-to-end functionality of the National Centers for Environmental Prediction (NCEP) Production Suite. The goal of the SIP is to help evolve the NGGPS into a Unified Forecast System (UFS) for operations and research.

  • The SIP governance includes UFS Working Groups that represent the essential science, technical, and design capacity of the UFS and span the community of expertise needed to support it.

  • UFS Working Groups consist of subject matter experts from NOAA line offices and laboratories, the Navy, NASA, NCAR, and universities.

  • The SIP Working Groups are always seeking members from the research and development community.

For more information on the SIP process, please visit: https://www.weather.gov/sti/stimodeling_nggps_implementation

 

About Software Containers

Spring 2018

Porting is the ability to move software from one environment to another with minimal changes to the existing code. Unfortunately, this is not easy to achieve when moving software from one Operating System (OS) platform to another, or even among versions of a single OS. Software containers are a solution to this problem. The software and its working environment can be ported from platform to platform, from a laptop to a desktop or HPC environment, as a single, complete unit. The container bundles the software package, configuration files, software libraries, and other binaries needed to run the software. Simply put, containerizing a package and its dependencies creates a portable platform in which differences in infrastructure and variations among OS releases are essentially eliminated.


A VM is called a virtual machine because that's what it is: a server that exists only in the virtual world, enabled by hypervisor software. A container is called a container because it borrows the idea of shipping containers. Image and caption credit: https://www.linkedin.com/pulse/3-drastic-reasons-containers-causing-seismic-shift-mark-shavers
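As an illustrative sketch only (the image name, packages, and paths below are hypothetical, not an actual DTC recipe), a container definition that bundles a tool with its libraries and configuration might look like:

```dockerfile
# Hypothetical container recipe: bundles the tool, its libraries,
# and its configuration files into one portable image.
FROM ubuntu:20.04

# System libraries the tool depends on (illustrative)
RUN apt-get update && apt-get install -y gfortran libnetcdf-dev

# Copy the software package and its configuration into the image
COPY ./my_forecast_tool /opt/my_forecast_tool
COPY ./config /opt/my_forecast_tool/config

# The same executable runs regardless of the host OS
ENTRYPOINT ["/opt/my_forecast_tool/bin/run_forecast"]
```

Once built, the resulting image runs unchanged on a laptop, a desktop, or an HPC system that supports the container runtime.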

A single column model (SCM) can be an easy, quick, and cheap way to test new or updated physics schemes

Autumn 2017

An SCM replaces advection from a dynamical core with forcing that approximates how the state of the atmospheric column changes due to large-scale horizontal winds. An atmospheric physics suite then calculates the impacts on radiation, convection, microphysics, vertical diffusion, and other physical processes as the forcing alters the column state.

The SCM approach is conceptually simple, extremely quick to run (less than a minute on a laptop), and makes interpretation of results less ambiguous because it eliminates feedbacks from the three-dimensional dynamical core. It also makes it relatively straightforward to compare how different physics respond to identical forcing, and can provide evidence or justification for more expensive three-dimensional modeling tests.
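The idea can be sketched in a few lines. The toy model below (all variable names and values are invented for illustration; this is not the GMTB SCM's formulation) applies a prescribed large-scale forcing plus a stand-in "physics" term, here a simple Newtonian relaxation, to a single column of temperatures:

```python
# Toy single-column model sketch (illustrative only):
#   dT/dt = forcing (prescribed large-scale tendency)
#         + physics (Newtonian relaxation standing in for a physics suite)

def run_scm(t0, forcing, t_ref, tau=3600.0, dt=600.0, nsteps=144):
    """Integrate a column of temperatures forward nsteps time steps."""
    temps = list(t0)
    for _ in range(nsteps):
        temps = [
            t + dt * (f + (tr - t) / tau)   # forcing + relaxation "physics"
            for t, f, tr in zip(temps, forcing, t_ref)
        ]
    return temps

# A 5-level column: uniform advective cooling, relaxed toward the initial profile
levels = 5
t0 = [300.0 - 10.0 * k for k in range(levels)]   # initial temperatures (K)
forcing = [-1.0 / 86400.0] * levels              # -1 K/day large-scale cooling
t_ref = t0                                       # relaxation target
final = run_scm(t0, forcing, t_ref)              # one simulated day
print([round(t, 2) for t in final])
```

The column settles toward a balance between the imposed forcing and the relaxation, which is the kind of controlled response that makes SCM results easy to interpret.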

The DTC’s Global Model Test Bed (GMTB) project built an SCM on top of the operational Global Forecast System (GFS) physics suite and used it as part of a physics test harness. It can be considered the simplest tier within a hierarchy of physics testing methods. Recently, it has been used to compare the performance of the operational GFS suite against a suite with an advanced convective parameterization for simulations of maritime and continental deep convection.

The SCM code is available to collaborators on NOAA's VLab, and will be updated periodically to keep pace with changes in the operational FV3-GFS model. Additionally, as the Common Community Physics Package comes online in the near future, the SCM will be compatible with all physics within that framework.

Join Weekly Webinar Discussion on EMC’s Forecast Systems

Winter 2017

You can join weekly webinars to discuss the performance of EMC’s forecast/analysis systems from a synoptic and mesoscale perspective.

  • The weekly webinar is led by EMC’s Model Evaluation Group; participants include model developers, NCEP service centers, NWS regional and field offices, DTC staff, the academic community, and the private sector.
  • Webinars serve as a forum for EMC to reach out to the forecast and user communities

Interested in participating? Contact Glenn White or Geoff Manikin.

A conference abstract by Glenn about this group is also available:
https://ams.confex.com/ams/97Annual/webprogram/Paper313180.html

Documentation is important!

Summer 2017

From a Google search:

Documentation is the Most Valuable Thing You Do (from cyborginstitute.org/projects/administration/documentation/).

Good documentation makes software and tools more valuable and more durable, and although documentation can be a great expense, in the long term it’s almost always worthwhile.


Schematic diagram of the GFS physics calling structure.

As part of its effort to set up a framework for a Common Community Physics Package (CCPP), the DTC’s Global Model Test Bed is documenting the Global Forecast System’s operational physics suite using “Doxygen” software.

Doxygen is a tool for generating documentation from popular programming languages, including Python, Fortran, C, C++, and IDL.

This software parses specially formatted in-line comments and creates navigable and searchable web documents.

For the GFS physics suite, the documentation is written within the Fortran code, so it is relatively easy to keep up to date as new developments are added to the model.
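As a hedged illustration of the style (this is invented example code, not from the GFS suite), specially formatted in-line Fortran comments for Doxygen might look like:

```fortran
!>  \file toy_flux.f90
!!  \brief Illustrative Doxygen-commented Fortran (hypothetical example).

!> \brief Compute a simple kinematic surface heat flux.
!! \param[in]  t_sfc  surface temperature (K)
!! \param[in]  t_air  near-surface air temperature (K)
!! \param[in]  ch     exchange coefficient (m s-1)
!! \param[out] hflx   kinematic heat flux (K m s-1)
subroutine toy_heat_flux(t_sfc, t_air, ch, hflx)
  real, intent(in)  :: t_sfc, t_air, ch
  real, intent(out) :: hflx
  hflx = ch * (t_sfc - t_air)
end subroutine toy_heat_flux
```

Doxygen parses the `!>` and `!!` comment markers and commands such as `\brief` and `\param` to build the navigable web pages.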

www.doxygen.org/


GFS Operational Physics Documentation

Did you know there are suggested topics for Visitor Projects that receive special consideration?

Autumn 2016
  • Advance the forecast skill of the DTC-supported HWRF modeling system through improved physics and/or initialization

  • Advance the analysis capability of the DTC-supported Gridpoint Statistical Interpolation and/or the NOAA Ensemble Kalman Filter (EnKF) Data Assimilation systems through development, testing, and evaluation of advanced data assimilation techniques and the addition of new data types or measurements

  • Transition innovations in atmospheric physical parameterizations to NOAA’s Next-Generation Global Prediction System (NGGPS)

  • Add new capabilities to the Model Evaluation Tools

For more information and to apply, go to http://www.dtcenter.org/visitors/

Research and Operational Communities Gathered for Physics Workshop Recently

Winter 2015

A successful workshop on Parameterization of Moist Processes for Next-Generation Weather Prediction Models was hosted by NOAA and the DTC at the NOAA Center for Weather and Climate Prediction (NCWCP) in College Park, MD, January 27-29, 2015. A large number of participants from NOAA, the international operational community, and the research community gathered to discuss topics including microphysics, sub-grid scale clouds and turbulence, and deep convection. The first day of the workshop included two keynote presentations and several foundational presentations on the state of the science and the current operational status at NCEP for the three topic areas. The second day consisted of breakout discussions allowing for in-depth conversation and idea sharing. A plenary wrap-up session was held on the morning of the third day. A list of the participants, along with the agenda and links to the presentations, is available on the workshop website at:

http://www.dtcenter.org/events/workshops15/moist_phys/


Attendees of the workshop on Parameterization of Moist Processes for Next-Generation Weather Prediction Models

Establishing a Functionally Similar Operational Environment for the Hourly-Updating NAM Forecast System

Summer 2015

As a bridge between the research and operational NWP communities, one of the fundamental purposes of the DTC is to provide the research community with access to functionally similar operational environments. Through this effort, promising new NWP techniques from the research community can be transferred more efficiently to an operational environment. One system that is currently under development and planned for operational implementation is the North American Mesoscale Rapid Refresh (NAMRR) system. The NAMRR is an hourly-updating version of the North American Mesoscale (NAM) forecast system, which is based on the NOAA Environmental Modeling System (NEMS) Nonhydrostatic Multiscale Model on the B-grid (NMMB). In addition to providing deterministic forecast guidance, NAMRR will also be a major contributor to the future Short-Range Ensemble Forecast (SREF) and High Resolution Ensemble Forecast (HREF) systems, along with the Rapid Refresh (RAP) and High Resolution Rapid Refresh (HRRR) systems, which are based on the Weather Research and Forecasting (WRF) model. The NAMRR system is being actively ported to the DTC. The configuration currently includes a 12-km North American parent domain and a 3-km CONUS nest, with plans to add a 3-km Alaska nest (see figure, center of page). Future plans include providing support to the research community to run a functionally similar NAMRR environment.



NOAA TESTBEDS & THE DTC

Winter 2014

During California field exercises of the Hydrometeorology Testbed (HMT), a key objective has been to improve longer-range forecasts of so-called “atmospheric rivers” or ARs (narrow streams of mid- to low-level moisture) and other meteorological patterns that produce very heavy rainfall. In evaluating model forecasts for these exercises, the DTC has explored methods that can provide more meaningful verification than standard scores. One such method represents regions of, say, precipitation in forecast and observed fields as spatial objects and then quantitatively compares attributes of these objects, such as size, location, and geographical overlap. Since the landfall of moisture on the western U.S. coastline is a key factor in AR forecasts, a novel approach for this project has been to define objects within thin domains that follow the coastline (as in the figure), and to use actual moisture transport as the basis for the fields from which objects are defined. The narrow, coastline-hugging domain allows the Method for Object-Based Diagnostic Evaluation (MODE) to focus on the actual landfall of moisture, a key factor in the effort to forecast severe precipitation in California and other regions vulnerable to ARs.
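The object-based idea can be sketched in miniature. The toy example below (invented fields and a simplified connected-component search, not the actual MODE algorithm) thresholds a forecast grid and an observed grid, identifies objects, and compares size, centroid, and overlap:

```python
# Toy object-based verification in the spirit of MODE (illustrative only):
# threshold a field, find connected "objects", and compare attributes.

def find_objects(grid, thresh):
    """Return a list of objects; each object is a set of (row, col) cells."""
    nrows, ncols = len(grid), len(grid[0])
    seen, objects = set(), []
    for r in range(nrows):
        for c in range(ncols):
            if grid[r][c] >= thresh and (r, c) not in seen:
                stack, cells = [(r, c)], set()
                while stack:                      # flood fill one object
                    i, j = stack.pop()
                    if (i, j) in seen:
                        continue
                    seen.add((i, j))
                    cells.add((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < nrows and 0 <= nj < ncols
                                and grid[ni][nj] >= thresh
                                and (ni, nj) not in seen):
                            stack.append((ni, nj))
                objects.append(cells)
    return objects

def centroid(cells):
    """Mean (row, col) location of an object."""
    n = len(cells)
    return (sum(r for r, _ in cells) / n, sum(c for _, c in cells) / n)

# Invented 4x4 "precipitation" fields: forecast object displaced one column east
fcst = [[0, 0, 5, 6], [0, 0, 4, 5], [0, 0, 0, 0], [0, 0, 0, 0]]
obs  = [[0, 5, 6, 0], [0, 4, 5, 0], [0, 0, 0, 0], [0, 0, 0, 0]]

f_objs, o_objs = find_objects(fcst, 1), find_objects(obs, 1)
overlap = len(f_objs[0] & o_objs[0])   # cells shared by the matched objects
print(len(f_objs[0]), len(o_objs[0]), overlap, centroid(f_objs[0]))
```

Even this toy version captures the key point: the forecast and observed objects have identical size but a one-column displacement, information that a gridpoint score would reduce to a double penalty.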

NOAA TESTBEDS & THE DTC

Spring 2013

For several winter seasons, the DTC has worked with the Hydrometeorology Testbed (HMT) to develop effective verification techniques for ensemble forecasts of heavy winter precipitation associated with atmospheric rivers in California. For example, the performance diagram below displays the impact of model resolution. See additional HMT information and links on the DTC website at http://www.dtcenter.org/eval/hmt/2012/.
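A performance diagram plots probability of detection against success ratio, with frequency bias and critical success index appearing as reference lines and curves; all four quantities come from a 2x2 contingency table. A minimal sketch of the computation (the counts are invented for illustration):

```python
# Quantities plotted on a performance diagram, computed from a
# 2x2 contingency table of event forecasts vs. observations.

def performance_stats(hits, misses, false_alarms):
    pod = hits / (hits + misses)                    # probability of detection
    sr = hits / (hits + false_alarms)               # success ratio = 1 - FAR
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    csi = hits / (hits + misses + false_alarms)     # critical success index
    return pod, sr, bias, csi

# Invented counts for a precipitation threshold exceedance
pod, sr, bias, csi = performance_stats(hits=80, misses=20, false_alarms=40)
print(round(pod, 2), round(sr, 3), round(bias, 1), round(csi, 3))
```

A perfect forecast sits at the top-right corner of the diagram (POD = SR = 1), so moving a model's point up and to the right, for example by changing resolution, indicates improvement.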



Did you know....

Summer 2013

Through its Visitor Program, the DTC is currently working with Adam Clark, a scientist at the Cooperative Institute for Mesoscale Meteorological Studies (CIMMS), on a project that involves using the Method for Object-based Diagnostic Evaluation – Time Domain (MODE-TD) for identification, tracking, and visualization of simulated supercells in high-resolution models, which will be applied and tested during annual NOAA/Hazardous Weather Testbed Spring Forecasting Experiments.



Supercells are identified using updraft helicity (UH) extracted at 5-minute intervals from an experimental 4-km grid-spacing version of the WRF model run in real time at the National Severe Storms Laboratory (NSSL), known as the NSSL-WRF. The UH extraction is done using a newly developed technique that minimizes data volume, and supercells are defined based on maximum intensity and longevity criteria applied to UH objects identified by MODE-TD. Geographic information and object attribute information are then combined into the GeoJSON file format and displayed using an experimental web interface, known as the NSSL Experimental Data Explorer, developed in collaboration with Chris Karstens of CIMMS/NSSL. The image above is a screenshot from the Data Explorer showing the path, as depicted by UH, and associated attributes of a simulated supercell over central Oklahoma on 19 May 2013.
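As a rough sketch of the GeoJSON packaging step described above (the coordinates, property names, and values are invented for illustration, not the Data Explorer's actual schema), a tracked object can be written as a GeoJSON feature with standard-library tools:

```python
# Package a hypothetical storm-object track and its attributes as GeoJSON.
import json

track_points = [(-97.6, 35.3), (-97.5, 35.4), (-97.4, 35.5)]  # (lon, lat)

feature = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",  # the object's path through time
        "coordinates": [list(p) for p in track_points],
    },
    "properties": {            # hypothetical object attributes
        "object_id": "supercell_001",
        "max_uh": 250.0,       # peak updraft helicity (m2 s-2)
        "interval_min": 5,     # extraction interval of the source data
    },
}

geojson = json.dumps({"type": "FeatureCollection", "features": [feature]})
print(geojson[:60])
```

Because GeoJSON is a plain-text web standard, any mapping library can render such a file directly, which is what makes it convenient for browser-based displays like the Data Explorer.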

TESTBEDS AND THE DTC

Autumn 2013

The Tropical Cyclone Modeling Team (TCMT) was formed as part of RAL’s Joint Numerical Testbed (JNT) in 2009 to help assess hurricane and tropical storm forecasts from experimental models. As such, its members interact with the DTC in two particular ways: by designing methods and products appropriate for tropical cyclone verification that can be incorporated into software maintained at the DTC, and by providing both real-time and retrospective performance measures for each year’s hurricane forecasts. At a working level, members of the TCMT are often contributing DTC members as well.

The intent of the yearly retrospective evaluations is to provide guidance to the National Hurricane Center as it chooses particular experimental forecast models to use for operational guidance during the upcoming hurricane season. In recent years these retrospective studies have focused on hurricane track and intensity forecasts from suites of comparison models submitted by universities, research laboratories, and national centers. This evaluation is intended to help achieve the goals of NOAA’s Hurricane Forecast Improvement Project (HFIP), a program in which the DTC hurricane task is also involved.

 



The accompanying figure, to the right, illustrates some results from the 2013 Retrospective Exercise (covering hurricane seasons 2010-2012). In the figure, the rank of a single experimental model's hurricane intensity forecast is shown relative to that of the three top-performing models. As a general rule, while hurricane track forecasts have improved dramatically in recent years, better intensity forecasts remain elusive.