Transitions Newsletter

Issue 8 | Summer 2015

Lead Story

NITE: NWP Information Technology Environment

Over the years, the DTC has put in place several mechanisms to facilitate the use of operational models by the general community, mostly by supporting operational codes (for data assimilation, forecasting, postprocessing, etc.) and by organizing workshops and tutorials.

By stimulating the use of operational codes within the research community (universities, NCAR, and government laboratories), the DTC has helped transition several new NWP developments to NCEP operations. However, in spite of this relative success, significant gaps remain in the collaboration between the research and operational groups. The NITE project focuses on infrastructure design elements that can be used to facilitate this collaborative environment.

During the past year, the DTC received funding from NOAA to create a design for an infrastructure to facilitate development of NCEP numerical models by scientists both within and outside of the Environmental Modeling Center (EMC). Requirements for NITE are based on a survey of potential users and developers of NCEP models, information obtained during site visits to EMC, the UK Met Office, and the European Centre for Medium-Range Weather Forecasts (ECMWF), discussions with focus groups, and reviews of various existing model development systems.

The NITE design has been developed with the following goals in mind:

  • make modeling experiments easier to run;
  • provide a single system available to NCEP and its collaborators;
  • produce results relevant for research-to-operations (R2O);
  • ensure reproducibility and keep records of experiments; and
  • remain general enough to apply to any NCEP modeling suite.

The following elements are included in the system design:

Data management and experiment database: Scientists need access to input datasets (model and observations), a mechanism for storing selected output from all experiments, and tools for browsing, interrogating, subsetting, and easily retrieving data. To facilitate information sharing, key aspects of the experiment setup, such as the provenance of source code and scripts, configuration files, and namelist parameters, need to be recorded in a searchable database.
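
One way to picture the searchable experiment database is as a small relational store keyed by experiment ID. The Python sketch below is purely illustrative; the table layout, field names, and values are assumptions made for this example, not part of the NITE design.

    # Minimal sketch of recording experiment provenance in a searchable database.
    # The table layout and field names are illustrative assumptions, not the NITE schema.
    import sqlite3

    conn = sqlite3.connect("experiments.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS experiments (
            exp_id        TEXT PRIMARY KEY,
            code_revision TEXT,  -- provenance of source code and scripts
            config_file   TEXT,  -- configuration file used for the experiment
            namelist      TEXT,  -- namelist parameters, stored as text
            output_path   TEXT   -- where selected output was archived
        )
    """)
    conn.execute(
        "INSERT OR REPLACE INTO experiments VALUES (?, ?, ?, ?, ?)",
        ("exp001", "model@abc1234", "suite.conf", "dt=600", "/archive/exp001"),
    )
    conn.commit()

    # Interrogating the database: find every experiment run from a given code revision.
    for exp_id, output_path in conn.execute(
        "SELECT exp_id, output_path FROM experiments WHERE code_revision = ?",
        ("model@abc1234",),
    ):
        print(exp_id, output_path)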


Source code management and build systems: Source code repositories for all workflow components need to be available and accessible to the community. Fast, parallel build systems should be implemented to efficiently build all workflow components of a suite before experiments are conducted.
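
As a rough illustration of the kind of fast, parallel build step described above, the following Python sketch fetches each workflow component from a repository and builds it concurrently. The component names, repository URLs, and build command are hypothetical placeholders, not actual NCEP locations or tools.

    # Illustrative sketch only: component names, repository URLs, and the
    # build command are hypothetical placeholders.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    COMPONENTS = {
        "model": "https://example.org/repos/model.git",
        "gsi":   "https://example.org/repos/gsi.git",
        "post":  "https://example.org/repos/post.git",
    }

    def fetch_and_build(name, url):
        """Clone one workflow component and run its (placeholder) build script."""
        subprocess.run(["git", "clone", url, name], check=True)
        subprocess.run(["./build.sh"], cwd=name, check=True)
        return name

    # Build all components of the suite in parallel before experiments are run.
    with ThreadPoolExecutor(max_workers=len(COMPONENTS)) as pool:
        for future in [pool.submit(fetch_and_build, n, u) for n, u in COMPONENTS.items()]:
            print("built", future.result())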

Suite definition and configuration tools: All configurable aspects of a suite are abstracted to files that can be edited to create the experiments. Predefined suites are provided as a starting point for creating experiments, with scientists also having the option to compose their own suites.
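
To make the idea of suites abstracted to editable files concrete, the sketch below shows one hypothetical way a predefined suite could be represented and then adjusted for a specific experiment. The keys, component names, and values are illustrative assumptions, not NITE's actual configuration format.

    # Illustrative suite configuration; keys, component names, and values are
    # placeholders rather than an actual NCEP suite definition.
    import copy

    PREDEFINED_SUITES = {
        "global_baseline": {
            "components": ["prep", "data_assimilation", "forecast", "post", "verify"],
            "cycle_hours": 6,
            "namelist": {"dt": 600, "physics_suite": "suiteA"},
        }
    }

    def compose_suite(base_name, **namelist_overrides):
        """Start from a predefined suite and apply experiment-specific edits."""
        suite = copy.deepcopy(PREDEFINED_SUITES[base_name])
        suite["namelist"].update(namelist_overrides)
        return suite

    # A scientist starts from a predefined suite and changes only what the experiment needs.
    experiment = compose_suite("global_baseline", dt=300)
    print(experiment["namelist"])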

Scripts: The scripting is structured so that each workflow component (e.g., data assimilation) is associated with a single script, regardless of which suite is being run.

Workflow automation system: The workflow automation system handles all job submission activity. Hence, the scripts used to run workflow components do not contain job submission commands.
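
The division of labor between component scripts and the workflow automation system might look roughly like the Python sketch below: each component maps to a single function (standing in for its script) that contains no job-submission logic, while a separate driver owns submission. The function names and driver are assumptions for illustration only.

    # Sketch of separating component scripts from workflow automation.
    # Names are illustrative; a real system would hand tasks to a batch scheduler.

    def run_data_assimilation(suite):
        """Single 'script' for the data assimilation component: no job
        submission commands appear here, regardless of which suite is run."""
        print("running data assimilation for suite", suite)

    def run_forecast(suite):
        """Single 'script' for the forecast component."""
        print("running forecast for suite", suite)

    WORKFLOW = [run_data_assimilation, run_forecast]

    def submit(task, suite):
        """The workflow automation layer owns all job submission activity;
        here the task is simply run inline for illustration."""
        task(suite)

    for task in WORKFLOW:
        submit(task, "global_baseline")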

Documentation and training: Documentation and training on all workflow components and suites are readily available through electronic means.

In addition to the elements above, standardized tools for data visualization and forecast verification need to be available to all scientists.

Next steps for NITE: Modernization of the modeling infrastructure at NCEP is essential for community involvement with all NCEP suites, and with the Next Generation Global Prediction System (NGGPS) in particular. The recommended implementation approach for NITE includes several phases to minimize disruption to operational systems and limit implementation costs, while providing useful, incremental capabilities that encourage collaboration. Ongoing discussions between EMC and the DTC, especially in the context of NGGPS infrastructure modernization, will likely lead to NITE implementation in the coming years.

See http://dtcenter.org/eval/NITE

Figure: NITE design, a software infrastructure.
Contributed by L. Carson and L. Bernardet



Director's Corner

Frederick Toepfer

The Next Generation Global Prediction System (NGGPS) project is a National Weather Service initiative to design, develop, and implement a next-generation global prediction system that takes advantage of evolving high-performance computing architectures, continues the push toward the higher resolutions needed to increase forecast accuracy at shorter timeframes, and addresses growing service demands for increased skill at extended ranges (weeks to months). The NGGPS project goals are: (1) expansion and implementation of critical weather forecasting research-to-operations (R2O) capabilities to accelerate the development and implementation of global weather prediction model advances and upgrades; (2) continued improvement in data assimilation techniques; and (3) improved software architecture and system engineering to support broad community interaction with the system.

The introduction of Community Codes for all major components of the NGGPS and the establishment of a Global Modeling Test Bed (GMTB) to oversee community interaction with the codes are significant changes to the National Weather Service business model for advancing numerical weather prediction skill in the US. Over the next five years, contributions from a wide sector of the numerical weather prediction community, including NCEP, NOAA and other agency laboratories, the private sector, and universities, will be incorporated into an upgraded operational system to deliver an NGGPS that meets evolving national prediction requirements.

Major work areas in the NGGPS project include selecting and further developing a new atmospheric dynamic core and improving model physics to better describe phenomena at global to regional scales. Additional work will be conducted to accelerate development and implementation of weather prediction model components, such as ocean, wave, sea ice, land surface, and aerosol models, and to improve coupling among these various components of the model system.

The DTC will play an important role in the NGGPS project. The new GMTB has been established as an extension of the current DTC. The GMTB is initially funded by the NGGPS project to assist in the development and testing of a Common Community Physics Package, as well as provide code management and support for an associated interoperable physics driver. The GMTB will also assist the NGGPS project in the development and testing of a sea ice model.

Active community participation in the development of the NGGPS is considered key to the success of the project. The DTC is perfectly positioned to assist in this aspect of the project. As the NGGPS Project Manager, I am excited about the DTC’s role, through the GMTB, in the project and anticipate a productive and successful relationship going forward.

Contributed by Fred Toepfer



Who's Who

Jeff Beck

Some people feel comfortable staying close to home, but Jeff Beck has been across the country and across the ocean, with more experience already than many people have in a lifetime.

Jeff received his B.S. in Meteorology from Penn State University, then moved on to Texas Tech to obtain his Master’s and Ph.D. in Atmospheric Science, completing his studies in 2009.

After a year working at a wind energy company in Austin, Texas, Jeff moved across the Atlantic to take up a post-doctoral position at Meteo-France. “I knew some basic French at that point, and I thought it would be a good experience for me.” The position lasted three and a half years, after which Jeff returned to the U.S. in September 2014 to work at NOAA’s Global Systems Division.

Jeff quickly moved into duties with the Developmental Testbed Center. He was an integral part of the team that developed the NOAA Environmental Modeling System (NEMS) Nonhydrostatic Multiscale Model on the B-grid (NMMB) Tutorial held at NCWCP in April 2015, taking on tasks ranging from managing the tutorial website and creating an NMMB User Guide to testing the scripts and code for the tutorial’s practical session and helping run the practical on day 2. Jeff is now the lead for the second NMMB tutorial, coming in Spring 2016. He also works on the North American Rapid Refresh Ensemble (NARRE) physics package, developing a stochastic suite intended to eventually outperform the current mixed-physics suite.

When he’s not busy with all his work tasks, Jeff enjoys the outdoors of Colorado, and spends time skiing, hiking and running. So if you happen to meet Jeff on the trails around the Foothills, be sure to say “Bonjour!”



Bridges to Operations

Data Assimilation Study for TC Intensity

The hybrid Ensemble Kalman Filter (EnKF)-Gridpoint Statistical Interpolation (GSI) data assimilation system was implemented at NCEP for its Global Forecast System (GFS) in May 2012.

Schematic illustration of the hybrid EnKF-GSI data assimilation procedure. Dashed line indicates the optional re-centering step in the hybrid system.

This implementation led to significant improvements in global forecasts, including forecasts of tropical storms. Notably, this improvement occurred while most current operational regional applications still use global rather than regional ensembles in their hybrid systems. To bridge this gap, the DTC investigated whether using a regional ensemble in the GSI-hybrid data assimilation system could improve tropical storm intensity forecasts.

A complete hybrid EnKF-GSI for the Hurricane WRF (HWRF) system was developed for the regional ensemble experiments, and results were compared to those obtained with the 2014 HWRF system. A two-way hybrid system was set up based on the GFS data assimilation scheme, using the GSI deterministic analysis to re-center the ensemble members at each analysis time. This re-centering step was found to reduce the ensemble spread for tropical cyclone (TC) center locations and intensity, so a one-way hybrid system that skipped the re-centering step was also developed.
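
As described above, the re-centering step shifts each ensemble member so that the ensemble-mean analysis matches the deterministic GSI analysis; skipping this shift gives the one-way hybrid. The NumPy sketch below is a schematic illustration of that step under simplified assumptions, not the operational HWRF or GFS code.

    # Schematic illustration of re-centering ensemble analyses around the
    # deterministic GSI analysis (toy example, not operational code).
    import numpy as np

    def recenter(ensemble_analyses, gsi_analysis):
        """Shift each member so the ensemble-mean analysis equals the GSI analysis."""
        ens_mean = ensemble_analyses.mean(axis=0)
        return ensemble_analyses - ens_mean + gsi_analysis

    rng = np.random.default_rng(0)
    members = rng.normal(size=(20, 5))   # 20 members, 5 state variables
    gsi = rng.normal(size=5)             # deterministic GSI analysis
    recentered = recenter(members, gsi)
    print(np.allclose(recentered.mean(axis=0), gsi))  # True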

Results showed that the operational system (green in the figure below) had the lowest bias at the analysis time, but over time the bias showed a rapid “spin-down” from stronger-than-observed to weaker-than-observed wind forecasts. (A similar spin-down issue was also noted with the 2015 HWRF system, but with smaller biases.) The one-way hybrid system (red), which used a regional ensemble, performed better than the two-way hybrid system (blue), and also outperformed the 2014 operational configuration and the GSI hybrid system using the GFS ensemble (without vortex initialization; cyan) for TC intensity forecasts beyond the 12-hour lead time.

The DTC also performed experiments to further investigate the initial spin-down issue and found that it is related to an imbalance issue triggered by data assimilation. Experiments show that applying dynamic constraints could help ease such an imbalance. However, more research is required to find an optimal solution that reduces such imbalance-induced noise while still achieving desirable analysis increments.

Bias of (a) maximum surface wind speed and (b) minimum sea level pressure for all forecasts, as a function of forecast lead time.
Contributed by Hui Shao



Did you know?

Establishing a Functionally Similar Operational Environment for the Hourly-Updating NAM Forecast System

As a bridge between the research and operational NWP communities, one of the fundamental purposes of the DTC is to provide the research community access to functionally similar operational environments. Through this effort, promising new NWP techniques from the research community can be transferred more efficiently to an operational environment. One system currently under development and planned for operational implementation is the North American Mesoscale Rapid Refresh (NAMRR) system. The NAMRR is an hourly-updating version of the North American Mesoscale (NAM) forecast system, which is based on the NOAA Environmental Modeling System (NEMS) Nonhydrostatic Multiscale Model on the B-grid (NMMB). In addition to providing deterministic forecast guidance, the NAMRR will also be a major contributor to the future Short-Range Ensemble Forecast (SREF) and High Resolution Ensemble Forecast (HREF) systems, along with the Rapid Refresh (RAP) and High Resolution Rapid Refresh (HRRR) systems, which are based on the Weather Research and Forecasting (WRF) model. The NAMRR system is being actively ported to the DTC. The configuration currently includes a 12-km North American parent domain and a 3-km CONUS nest, with plans to add a 3-km Alaska nest (Figure, center of page). Future plans include providing support to the research community to run a functionally similar NAMRR environment.

Contributed by Jamie Wolff



PROUD Awards

John Opatz, Associate Scientist III, RAL/NCAR

John Opatz is an Associate Scientist III in RAL at NCAR contributing his expertise to two DTC projects: the Enhanced Model Evaluation Tools (METplus) and METplus training.

John is highly skilled and the go-to scientist for reviewing Pull Requests across METplus repositories. He made 210 METplus GitHub contributions in 2022, defining new issues, committing code changes, updating documentation, and reviewing the work of others. John’s Pull Request reviews are detailed, thoughtful, and comprehensive. He often expands his testing beyond the narrow scope of the original issue to assess how the code would handle a variety of user inputs. He is agile in his ability to switch between roles and appreciates the perspectives of software developers, users, and scientists.

John coordinated the online tutorial updates for METplus v5.0 and led the formation of a METplus user advisory group in 2023. John played a significant role in developing valuable content for the METplus 2021 Training Series. He leads various METplus projects aimed at fostering engagement with NOAA/EMC, NOAA/CPC, and NCAR/MMM. He works closely with engineers to develop new METplus use cases to broaden their impact and application. John’s growth in responsibilities within a short span of time is commendable.

His leadership in customer engagement, community support, and use case development is exemplary. He conducts himself in a respectful, professional manner, actively listens to others, and genuinely works to improve their understanding and use of the METplus products.

Gerard Ketefian, Research Scientist III, CU/CIRES and NOAA GSL

Gerard Ketefian works for CU/CIRES at NOAA GSL and contributes to four DTC projects: Unified Forecast System (UFS) Short-Range Weather Application (SRW) software support & community engagement, DTC activities in support of community involvement with the UFS, Agile Rapid Refresh Forecast System (RRFS) prototype Testing and Evaluation (T&E), and Optimizing Ensemble Design for Use in the RRFS.

Gerard led the submission of a substantial METplus-related contribution to the UFS SRW Application repository that revamps the verification portion of the SRW workflow. He took the initiative to learn the nuanced capabilities of METplus and how they align with the workflow, which demands the flexibility to wear both science and technology hats. These improvements, including the capability to time-lag ensembles, will benefit not only internal DTC teams but also the community at large. While this work has been quite an endeavor, Gerard has accomplished it with an infectious positive attitude and has gladly incorporated feedback from other team members.

As a versatile staff member, Gerard has continuously embraced new challenges and opportunities within the DTC. Any project would benefit from his extensive knowledge and energetic work ethic. It is a pleasure to have Gerard as a team member in the DTC.