Community Connections

Expansion of MODE applications

Contributed by Louisa Nance (DTC Director and NSF NCAR) based on information provided by DTC staff and the broader community
Autumn 2024

The Method for Object-based Diagnostic Evaluation, commonly referred to as MODE, was originally developed to address the skill of forecasts of localized episodic phenomena such as rainfall (Davis et al. 2006). The inclusion of this technique in the 2008 initial release of the Model Evaluation Tools (MET), the verification package developed and supported by the DTC, has led to the exploration of an ever-broadening application space for this useful tool. In 2009, the DTC collaborated with the NOAA Hydrometeorology Testbed (HMT) to explore the application of MODE to atmospheric rivers, applying it to observed and forecast fields of integrated water vapor (IWV). Collaborations with HMT, as well as the NOAA Hazardous Weather Testbed, also explored the application of MODE to ensemble forecasts. From there, MODE has been used to examine the spatial and temporal characteristics of cloud-cover forecasts from high-resolution NWP models, with a novel approach by Mittermaier and Bullock (2013) in which they tracked cloud breaks rather than the clouds themselves. Applications related to clouds and precipitation have expanded over the years to include DTC assessments of the Global Synthetic Weather Radar product and of tropical cyclone quantitative precipitation forecasts (Newman et al. 2023, Newman et al. 2024), the use of infrared brightness temperatures to identify clouds (Griffin et al. 2021), and an application to hail identification (Adams-Selin et al. 2023).
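
At its core, MODE identifies objects by smoothing a raw field with a circular convolution filter, thresholding the smoothed field, and treating each connected region above the threshold as an object (Davis et al. 2006). The following is a minimal Python sketch of that idea, not the MET implementation; the radius, threshold, and synthetic precipitation field are purely illustrative.

```python
# Minimal sketch of the convolution-threshold step MODE uses to define objects
# (Davis et al. 2006). The field values, radius, and threshold are illustrative.
import numpy as np
from scipy.ndimage import label
from scipy.signal import fftconvolve

def identify_objects(field, radius=5, threshold=5.0):
    """Smooth a 2-D field with a circular filter, threshold it, and label
    the connected regions that remain as discrete objects."""
    # Build a circular (disk) convolution kernel of the given grid-point radius.
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    kernel = (x**2 + y**2 <= radius**2).astype(float)
    kernel /= kernel.sum()

    # Convolve to suppress small-scale noise, then apply the threshold.
    smoothed = fftconvolve(field, kernel, mode="same")
    mask = smoothed >= threshold

    # Each connected region above the threshold becomes one object.
    objects, n_objects = label(mask)
    return objects, n_objects

# Example: a synthetic precipitation field with two rain areas.
rng = np.random.default_rng(0)
field = rng.gamma(1.0, 1.0, size=(200, 200))
field[40:70, 50:90] += 12.0     # first rain area
field[130:160, 120:170] += 8.0  # second rain area
labels, count = identify_objects(field, radius=5, threshold=5.0)
print(f"Identified {count} objects")
```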

While MODE was originally developed to address the need to evaluate high-resolution NWP forecasts, its application has been expanded to include climate models. In particular, it has been used to characterize the location and predictability of the Intertropical Convergence Zone (ITCZ) in the Community Earth System Model Large Ensemble (CESM-LE), as well as temperature and precipitation anomalies associated with the El Niño-Southern Oscillation (ENSO). In addition to these atmospheric applications, MODE has been applied to sea-ice forecasts and, more recently, to the chlorophyll-a bloom season (Mittermaier et al. 2021) and marine heatwaves (Cohen et al. 2024). While most of the studies mentioned above involved collaborations with MET team members, the application to marine heatwaves came to the team's attention through a recent workshop presentation. Jacob Cohen indicated he chose MODE for his research because of its flexibility, its widespread use across many applications, and its helpful documentation and user support.


Example of MODE application to marine heatwaves. The top panel shows a snapshot of sea surface temperature anomaly, and the bottom panel shows the objects corresponding to those anomalies (from Cohen et al. 2024).

MODE's capabilities have continued to advance since its inception, now including the ability to track objects through time, referred to as MODE Time Domain or MODE-TD, and the ability to define objects based on multiple fields, referred to as Multivariate MODE. MODE-TD is being used to assess sea-ice location and timing as well as fire spread, while Multivariate MODE is being applied to identify drylines, regions that meet red-flag criteria based on relative humidity and winds, and blizzard-like objects based on precipitation type, 10-m winds, and visibility.
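
To make the multivariate idea concrete, the conceptual sketch below intersects simple per-field criteria to form a composite "blizzard-like" object field. The thresholds and synthetic fields are hypothetical placeholders; this is not MET's actual Multivariate MODE implementation or its default settings.

```python
# Conceptual sketch only: Multivariate MODE defines objects from several
# fields at once. Here a "blizzard-like" mask is the intersection of simple
# thresholds on precipitation type, 10-m wind speed, and visibility; the
# synthetic fields and threshold values are hypothetical, not MET defaults.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(1)
shape = (100, 100)
is_snow = rng.random(shape) > 0.6         # precipitation type flagged as snow
wind_10m = rng.uniform(0, 25, shape)      # 10-m wind speed (m/s)
visibility = rng.uniform(0, 5000, shape)  # visibility (m)

# Intersect the per-field criteria to form a single composite event field.
blizzard_mask = is_snow & (wind_10m >= 15.0) & (visibility <= 400.0)

# Connected regions of the composite field become "blizzard-like" objects.
objects, n_objects = label(blizzard_mask)
print(f"{n_objects} blizzard-like objects identified")
```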

The applications described above should not be considered an exhaustive list, given that this information was gathered by soliciting input from DTC staff. The breadth of applications uncovered from our internal knowledge alone suggests that a thorough literature review would likely produce an even broader list!

 

METplus Community Contributes Valuable Feedback For Improving Online Documentation

Contributed by John Opatz and Tara Jenson
Spring 2024

In response to feedback from the DTC’s Science Advisory Board, a METplus focus group was formed from respondents to an email request to METplus community members. They were tasked with increasing the accessibility of METplus across its user base, with a special focus on first-time users and those who may be discovering METplus through academic avenues. Input was gathered from focus-group members via questionnaires, surveys, group discussions, and practice exercises using real-world atmospheric data. These results were then used to assess where best to enhance and improve METplus’ documentation and runtime messages.

Focus-group activities were divided into two phases. The first phase, which began on April 3rd, 2023 and ran for six weeks, asked focus-group members to provide feedback based on weekly activities designed to explore one or more specific areas of METplus documentation. The results from this phase were synthesized and collated into 20 action items. These items targeted the METplus user guides for each component of the software system, as well as the METplus online tutorial and training videos. The work entailed by each action item varied, from restructuring the METplus online tutorial, to shifting the informational emphasis from general applications to more technical aspects accompanied by better graphics, to a new top-level organizational approach for the MET User’s Guide to help users find information more quickly. The METplus team worked on 10 of the 20 action items before proceeding to the second phase of the focus group. These 10 action items were chosen based on how strongly they were supported in the questionnaire responses, as well as their feasibility given the timeframe between phases.


Given the user-defined priority list, the METplus team was able to focus on action items ranked higher on the list, thus optimizing the enhancement time.

The second phase, which began on September 6th, 2023, convened a subset of the same focus-group members, who reviewed the improvements and updates made to address the 10 selected action items. This approach enabled the feedback to be more targeted and gathered over a shorter time period. Phase two produced two significant findings. The first was a priority list for the 20 action items, created by asking each focus-group member to identify their top priorities and combining these individual rankings into an overall ranking. Given the user-defined priority list, the METplus team was able to focus initial work on those action items ranked higher on the list, thus optimizing the enhancement time. The second finding was the value of a focus group in which users could participate directly. Questionnaire responses indicated that users felt the development of METplus documentation through the targeted focus group was highly beneficial and would be well received by the METplus community in the future.

Ultimately, several of the documentation improvements guided by the focus-group input will be available in the MET User’s Guide version 12.0.0, which is expected to be released this summer. These improvements include simplified, rewritten self-installation instructions for using the compile_MET_all.sh script (MET’s current recommended installation method) and newly added Docker and Apptainer (formerly called Singularity) installation methods. Improvements stemming from the valuable focus-group input also extend to the current METplus online tutorial, where a new Statistics 101 tutorial and a session on preparing your programming environment have been added.

Community Use of Common Community Physics Package Single-column Model

Contributed by Weiwei Li
Winter 2024

The Common Community Physics Package (CCPP) single-column model (SCM) is developed and supported by the Developmental Testbed Center (DTC). In addition to periodic public releases, the CCPP SCM and its applications were introduced to the community in 2020 through an AMS Short Course, “Experimentation and Development of Physical Parameterizations for Numerical Weather Prediction Using a Single-Column Model and the Common Community Physics Package,” and a series of workshops and conferences. The DTC has been using the CCPP SCM to facilitate Unified Forecast System (UFS) physics development, testing, and evaluation, as highlighted in previous DTC newsletters. Beyond the numerous applications of the CCPP SCM at the DTC, community applications span physics development, process-level understanding, and participation in model intercomparison projects (MIPs; Table 1).

For years, the UFS community has actively used the CCPP SCM for physics development. In the development of the Grell–Freitas convection scheme, CCPP SCM simulations demonstrated the value of using beta functions to characterize the features associated with three convection modes. For the scale-aware Turbulent Kinetic Energy eddy-diffusivity mass-flux (TKE-EDMF) scheme, the CCPP SCM helped identify the impact of mixing-length formulations and constraints on parameterizing boundary-layer turbulence (Fig. 1). In addition to conventional column physics, the CCPP SCM has been applied to facilitate process-level investigations of topics such as the Noah-MP Land Surface Model, the direct radiative effects of aerosols, land-atmosphere interactions, and microphysics in Arctic systems. The CCPP SCM was also adopted to conduct idealized simulations examining the impacts of deep convective downdrafts and re-evaporation of convective rain on the tropical mean state and variability.


Besides the UFS, the broader community has started exploring the merits of this tool. Recently, the CCPP SCM has been used as a teaching tool by Professor Cristiana Stan of George Mason University, where graduate students can dive into the complexity of an Earth system model. To support Navy model development, it helped test the implementation of aerosol-radiation interaction for the marine atmospheric boundary layer. NSF NCAR, through a Department of Energy project, used the CCPP SCM and observations to advance boundary-layer understanding and parameterizations that are currently used by multiple host models, including the UFS and the NSF NCAR models. The State University of New York at Albany used it to study shallow convective systems in the trades and the direct radiative impact of Saharan dust aerosols on African easterly waves. The tool was also adopted in a few MIPs, including the evaluation of radiation fog in the Local and Non-local Fog Experiment (LANFEX) field campaign and the Cold-Air Outbreaks in the Marine Boundary Layer Experiment (COMBLE) MIP. This effort will help improve the physical representation of various aspects of mixed-phase cloud systems.

As the capabilities of the CCPP SCM flourish, it is foreseeable that the tool will facilitate more robust physics development and scientific investigations, ultimately benefiting the Earth system modeling community. At the same time, it is worth keeping in mind that users need to fully understand the processes to be examined when using an SCM, given its semi-idealized nature.


Table 1. Researchers, institutes, and application areas, with associated publications or presentations, for uses of the CCPP SCM beyond the DTC.

Researchers | Institutes | Areas
Saulo Freitas, Georg Grell, and Haiqin Li | NASA & NOAA/GSL | Cumulus physics development for RAP, HRRR, and RRFS; Freitas et al. (2021)
Edward Strobach | NOAA/EMC | PBL physics development in GFS; Strobach (2022)
Weizhong Zheng, Michael Barlage, and Helin Wei | NOAA/EMC | Noah-MP Land Surface Model development for the UFS; Zheng et al. (2021)
Anning Cheng and Fanglin Yang | NOAA/EMC | Radiative effects of aerosols in the UFS; Cheng and Yang (2022)
Siwei He | Montana State University | Land-atmosphere interactions in the UFS; He et al. (2022)
Amy Solomon | NOAA/PSL | Microphysics associated with forecasts of Arctic systems for the UFS (personal correspondence)
Sasa Gabersek | NRL | Aerosol-radiation interaction; Gabersek et al. (2024)
I-Kuan Hu | NOAA/PSL | Testing and evaluation of cumulus physics (personal correspondence); Hu et al. (2024)
Lulin Xue and Weiwei Li | NSF NCAR | PBL physics in the UFS and the NCAR models; Xue et al. (2023)
Xia Sun | NOAA/GSL | Evaluating CAPE bias in the UFS; Sun et al. (2023)
Christian Wilder Barreto-Schuler | Univ. at Albany - SUNY | Direct radiative impact of Saharan dust aerosols on African easterly waves; Barreto-Schuler et al. (2024)
Jian-wen Bao and Evelyn Grell | NOAA/PSL | LANFEX MIP; Boutle et al. (2022)
Weiwei Li and Lulin Xue | NCAR | COMBLE MIP; Juliano et al. (2022)

The Common Community Physics Package (CCPP) Visioning Workshop

Contributed by Ligia Bernardet and Grant Firl
Autumn 2023

The Common Community Physics Package (CCPP) Visioning Workshop, which took place virtually on August 15-17, 2023, convened discussions focused on the current status and future direction of the CCPP. The workshop was organized by a multi-institutional committee composed of representatives from the DTC (Grant Firl, Lulin Xue, Dustin Swales, Ligia Bernardet), NCAR’s Mesoscale and Microscale Meteorology (Laura Fowler) and Climate and Global Dynamics (Courtney Peverley) Laboratories, the University of Oklahoma Center for Analysis and Prediction of Storms (Ming Xue), and NOAA’s Environmental Modeling Center (Fanglin Yang). Participants from four NCAR labs and six NOAA labs and centers made up approximately two-thirds of the attendees. The remaining participants were from the U.S. Department of Energy Pacific Northwest National Laboratory, the Joint Center for Satellite Data Assimilation, the United States Naval Research Laboratory, the Brazilian National Institute for Space Research, the Stevens Institute of Technology, the University of Maryland, the Central University of Rajasthan, and the Norwegian Meteorological Institute.


The growth of the CCPP community, the ongoing progress in Earth System Model science, and the advancements in computational technology demand continuous CCPP development.

Workshop participants recognized that the CCPP is now a mature product: used operationally at NOAA as part of the Hurricane Analysis and Forecast System v1, slated for all future Unified Forecast System (UFS) application implementations, and in different stages of integration within models under the purview of the U.S. Naval Research Laboratory and the National Center for Atmospheric Research. The growth of the CCPP community, the ongoing progress in Earth System Model science, and the advancements in computational technology demand continuous CCPP development.

High-priority items identified by participants as needed to facilitate exchange and enable collaborative development are: unification of the CCPP Framework development being conducted at NOAA and at NCAR, generalization of existing parameterizations for higher interoperability, and development of a vision for how the multiple sets of CCPP-compliant physics will be managed and served to the community.

Additional recommendations drawn from the workshop can be classified into two categories. The first is the improvement of existing practices. Highlights include changes to the directory structure of the CCPP Physics repository for ease of use, issuing additional tags to record important code snapshots (such as those associated with UFS prototypes for upcoming operational implementations), establishing formal governance for the repository of CCPP Standard Names, and documenting the limitations of schemes and suites with respect to the scales and processes they were developed for and tested on.

The second category is development needed to prepare for the future. Highlights include continuing engagement with the aerosol and chemistry community to devise optimal solutions for the interface between atmospheric chemistry and host models, looking ahead toward three-dimensional physics, developing a module for common atmospheric physics functions (such as saturation vapor pressure) to ensure greater consistency among schemes within a suite, and enabling all schemes to return tendencies in support of studies in physics-dynamics coupling. On the computational front, the use of fine-grain platforms with graphics processing unit (GPU) compute architectures has become a priority.

In summary, the cross-institutional engagement with the CCPP and the openness of the community to collaborate on physics development were evident in this workshop. Next steps include the submission of a meeting summary to the Bulletin of the American Meteorological Society and the prioritization of development topics.

METplus Training Expanded

Contributed by Tara Jensen, Julie Prestopnik, and John Opatz
Summer 2023

The METplus team launched part one of the METplus Advanced Training Series this past spring, which focused on Prototypes in the Cloud, Subseasonal to Seasonal (S2S) Diagnostics, and Coupled Model Components.  The intent of the METplus Advanced Training Series was to extend beyond the Basic Training Series, held during the winter and spring of 2021-2022. The series will resume with Part Two on Wednesday, October 4th, 2023, and will focus on evaluating Fire Weather forecasts and the Seasonal Forecast System (SFS). It will also cover Python Embedding, Ensembles, and Use of Climatologies. The five remaining sessions will run virtually on Wednesday mornings every other week beginning at 9:00 am MST for two hours. Those interested in participating can visit the Registration page to sign up for the fall sessions.

In a related activity, the METplus team has been working on significant improvements to the Online Tutorial and its connections to the Users’ Guides. A focus group that convened during spring 2023 provided many great suggestions for how to make METplus support and training more effective. Over the summer, John Opatz, Brianne Nelson, and Barbara Brown worked tirelessly to implement the most impactful suggestions. They are currently conducting a second focus group to determine whether their effort resulted in marked improvements. Please watch for a roll-out of our updated online training and documentation in the fall.

A Forward-looking Virtual Get-together of the CCPP Community

Contributed by Grant Firl
Spring 2023

As the Common Community Physics Package (CCPP) heads toward operational status within NOAA’s numerical weather prediction (NWP) advancement efforts, the community that helped make it a reality will gather virtually alongside the “CCPP-curious” for the CCPP Visioning Workshop on August 15-17 to discuss plans for its continued improvement over the next 5-10 years. A strong emphasis for the workshop will be placed on accommodating the software coupling needs of the next generation of physics parameterizations. A second emphasis will be placed on discussing the need for additional features to improve ease of development within the framework, the capability to take advantage of fine-grain parallelism within large portions of physics suites, and any changes the current implementation requires to support the needs of the atmospheric composition and chemistry communities, among other forward-looking ideas. These topics are expected to comprise the majority of the agenda for the latter two days of the workshop, while the first day will focus on conveying the current status and capabilities, code management practices, and known issues of the software package. It is hoped that the first part of the workshop can teach the fundamentals to those unfamiliar with how the system works and its development process, while the final two days will serve as a platform for physics developers, model developers interested in physics-dynamics coupling, code managers, and computational scientists to plan the more advanced, forward-looking topics.


The promise of the CCPP – better scientific exchange and understanding within the atmospheric physics community, and the opportunity for the community to more effectively cooperate toward the goal of improving weather forecasts.

Just like the CCPP software, the workshop is intended to be “model-agnostic” and participants working with models such as the Unified Forecast System (UFS), the CCPP Single Column Model (SCM), the Navy Environmental Prediction sysTem Utilizing the NUMA corE (NEPTUNE), the Community Earth System Model (CESM) Community Atmosphere Model-System for Integrated Modeling of the Atmosphere (CAM-SIMA), the Weather Research and Forecasting (WRF) model, the Model for Prediction Across Scales (MPAS) and others are highly encouraged to participate. Also, if there is enough interest in a particular model’s CCPP implementation (ascertained via a pre-workshop survey), breakout groups for model-specific discussions will be considered. A broad range of modeling expertise will help to foster a more productive discussion of best practices for interoperability and collaborative development for model physics, a key tenet of the CCPP. The modeling diversity is reflected by the workshop organizing committee, whose members represent the DTC, NOAA GSL, NOAA EMC, NCAR CGD, NCAR MMM, and academia.


Parameterizations have been contributed by a number of scientists from various organizations, creating an ecosystem in which the CCPP can be used not only by operational centers to produce operational forecasts, but also by the research community to conduct investigation and development.

The goals of the CCPP Visioning Workshop are at least two-fold. First, participants should come away with a common understanding of the current state of the CCPP, how to engage in the development process, and how to find help via documentation or direct questions to CCPP-aligned staff. The second goal is to create a prioritized list of advancements to the CCPP software framework required to keep pace with advancing scientific and technological frontiers over the next 5-10 years. Such a list, together with the continued hard work from DTC staff, community scientists and engineers, and broad organizational support to address its items, should go a long way toward delivering on the original promise of the CCPP – better scientific exchange and understanding within the atmospheric physics community, and the opportunity for the community to more effectively cooperate toward the goal of improving weather forecasts.

To learn more about the workshop and registration process, visit the CCPP Visioning Workshop website.

For more news related to CCPP, see the Lead Story article: CCPP Goes Operational.

JEDI project adopts and contributes to CCPP variable naming standard

Contributed by Steven Vahl (UCAR and JCSDA) and Dominikus Heinzeller (UCAR and JCSDA)
Winter 2023

In September of 2022, the JCSDA (Joint Center for Satellite Data Assimilation) officially adopted the CCPP standard names, originally developed for use with the Common Community Physics Package, as the model variable names to be used within the JEDI (Joint Effort for Data assimilation Integration) software.

The JEDI software is employed by many Earth observing systems and requires agreed-upon names for the quantities being input and computed. It is critical that these names are understood identically between systems to prevent code errors, misuse, and duplication. Within the JEDI software, variables are used in two broad contexts: as variables for Earth observations, and as variables for different Earth system models. Earlier in 2022, a team led by Greg Thompson at JCSDA developed a naming standard for the JEDI observation variables. This team surveyed the existing variable naming standards, found that none of them was adequate for JEDI's needs, and so developed a new observation variable naming standard, name by name.


It is critical that these model variable names are understood identically between systems to prevent code errors, misuse, and duplication.

Later, Steven Vahl (JCSDA) was tasked with organizing the effort to develop or adopt a naming standard for the JEDI model variables. Dom Heinzeller, formerly part of the CCPP development team, brought the variable naming standard developed for use with the CCPP to the team's attention for consideration. Since there were no other known viable model variable naming standards, the decision came down to either adopting the CCPP standard or extending the naming standard begun by the JEDI observation team. A meeting of the JEDI community was held to discuss the proposal to adopt the CCPP naming standard. The advantages of adopting it were that 1) it already contained some of the needed model variable names, 2) it had a list of rules for creating new names, and 3) it had a GitHub-based pull request process for adding names and rules that would hopefully minimize the number of meetings needed. The primary disadvantage of adopting the CCPP naming standard was that the names for some quantities would differ from the standard names already used for the same quantities as JEDI observation variables. Ultimately, it was decided that within the JEDI software there was only one place where these two kinds of variables would be used close to one another, and even there the context (observation or model variable) would be clear, so the appropriate naming standard for the context could be applied. It should be noted that there is a conceptual difference in how the CCPP standard names are used within JEDI: while CCPP makes use of the standard names only in the metadata tables, JEDI uses them directly in the code and configuration files.

Recently, the first few JEDI model variable names were added to the CCPP naming standard via a pull request. More such pull requests will be coming in 2023, and the new standardized names will be adopted universally inside the JEDI code.


XKCD.com by Randall Munroe (https://xkcd.com/927/)

Continuous integration for community engagement in CCPP Physics code management

Contributed by Lulin Xue (NCAR)
Autumn 2022

Continuous integration (CI) is a software development practice in which developers regularly merge their code changes into a central repository, whereupon automated builds and tests are run. The key goals of continuous integration are to quickly find and address bugs, improve software quality, and reduce the time it takes to validate and release new software updates. The DTC Common Community Physics Package (CCPP) team has not only adopted CI for CCPP Physics code updates and releases, but also applied the CI concept to community engagement and support. The CCPP team established bi-weekly CCPP Physics code management meetings in August 2021. Representatives of the organizations regularly involved in CCPP development, including NOAA, NCAR, the Naval Research Laboratory (NRL), and the DTC, actively participate in the meetings to discuss issues, develop solutions, and evaluate outcomes in a productive, collaborative forum.


The concept of CI is useful in software development and is very helpful in connecting communities. The DTC CCPP team will continue to engage the broader community through different platforms and opportunities.

One successful example of the CI approach to community engagement in CCPP Physics code management is the establishment of a code fork referred to as the UFS Fork of CCPP Physics. As more groups started using and developing CCPP-compliant physics and chemistry, an effective methodology for community collaboration on CCPP Physics became increasingly important. After collecting community input in late 2020, the DTC CCPP team developed a proposal for the CCPP Physics repository structure and code management plan in June 2021. The proposal includes a code repository structure consisting of the authoritative CCPP Physics main branch, maintained by the CCPP team, and multiple CCPP Physics forks for different model systems, maintained by the corresponding modeling centers or developers. For individual physics schemes, a similar structure will be adopted, in which the scheme developers are responsible for the authoritative physics repositories and the modeling centers are responsible for their forks and branches. The proposal was discussed and improved iteratively at the CCPP Physics code management meetings for a year, and its recommendations were implemented in October 2022. The DTC CCPP team and EMC are currently co-managing the UFS Fork of CCPP Physics.

CI for community input and engagement in CCPP Physics code management has led to many important outcomes and initiatives in addition to the UFS Fork of CCPP Physics. The engagement of the NCAR Mesoscale & Microscale Meteorology (MMM) model development team led to a new DTC project to test the CCPP compliance of the NCAR MMM physics suite in the CCPP Single Column Model (SCM). This summer, the NRL team identified a backward-incompatibility issue for their Navy Environmental Prediction sysTem Utilizing the NUMA corE (NEPTUNE) model that arises when CCPP Physics updates require corresponding changes in the host model. The DTC CCPP team actively worked with the NRL team to propose a solution to this problem using CCPP Framework functionality and the CI capability offered by GitHub. NCEP EMC representatives recently raised the question of how to simplify and improve the CCPP interstitial schemes (modularized pieces of code that perform data preparation, diagnostics, or other "glue" functions, and allow primary schemes to work together as a suite), a concern echoed by all groups. The DTC CCPP team built an inventory of all existing interstitial schemes and began classifying them based on their host-model specificity. Proposed pathways for addressing this problem were discussed and refined at multiple meetings. This ongoing effort is expected to improve the interoperability of CCPP Physics.

The concept of CI is useful in software development and is very helpful in connecting communities. The DTC CCPP team will continue to engage the broader community through different platforms and opportunities. The planned CCPP visioning workshop in 2023 will include a topic on how to better engage and support the community in using and developing CCPP. 

International Community Participates in First METplus Users’ Workshop

Contributed by John Opatz, Tara Jensen, Keith Searight
Summer 2022

The first DTC METplus Users’ Workshop was held virtually June 27th through June 29th, 2022, covering a multitude of topics and application areas. METplus became an operational software tool for NOAA NCEP Central Operations in 2021; it provides a framework of components that add ease of use and diagnostic capability to the Model Evaluation Tools (MET) package, first released by the DTC in 2008, which originally began through and continues to benefit from the generous support of the Air Force. The last MET Workshop was hosted in 2010 and was focused on the core tools housed within MET. Much has changed since then, and the committee developed the workshop agenda with the goals of building the METplus community and inspiring external contributions to the development of the verification system, with special attention paid to planning future verification and diagnostic frameworks. The workshop garnered tremendous interest, with 250 registered participants. Users were actively involved during the workshop, with presentations on ways they have used METplus in forecast verification and diagnostic activities. Thirty METplus community members from across the globe represented various NOAA centers (EMC, CPC), the UK Met Office, India’s National Centre for Medium Range Weather Forecasting (NCMRWF), the Australian Bureau of Meteorology, as well as NCAR’s AAP and JNT, ICDP (UCAR’s International Capacity Development Program), and NOAA GSL. METplus community members from these groups gave 15- to 20-minute presentations about various METplus tools and product capabilities.

Users of all experience levels were engaged, with the METplus development team providing top-level overviews of the verification system on the opening day and condensing the newest capabilities of METplus in easy-to-understand presentations. Parallel sessions were conducted throughout the workshop to maximize the community presentation time, enabling attendees to select the sessions they were most interested in. For those interested in seeing all of the sessions, recordings of the workshop are available on the Workshop website and the presentation slides are available from the Workshop Google Drive folder.

The workshop also went beyond presentations by engaging all participants directly through the use of online surveys during session breaks to collect and address accessibility issues and outline opportunities for success in future releases of the verification suite. The second and third days of the workshop provided attendees with the unique opportunity to receive one-on-one assistance via a virtual METplus help desk hosted by a METplus team member.


2022 METplus Workshop attendees by affiliation

 

The final day of the workshop culminated in a sneak peek at upcoming and in-development capabilities coming in the next METplus Coordinated Release (version 5.0.0), as well as two rounds of small breakout groups focused on five topics in which attendees provided feedback on current activities and future development. The METplus development team was grateful for the opportunity to engage with such a large part of the verification community and will continue to find ways to build engagement with the community and strengthen the METplus suite to advance its mission.

The GeneralIzed Aerosol/chemistry iNTerface (GIANT)

Workshop and Hackathon

Contributed by Ligia Bernardet (NOAA GSL and DTC), Natalie Mahowald (Cornell University), and Alma Hodzic (NCAR ACOM)
Spring 2022

An Earth System Model (ESM) is a model that represents various domains of the Earth, such as the atmosphere, ocean, and sea ice. A new community effort is underway to devise a method to transfer information easily and consistently between chemistry and aerosol modules and other components of ESMs. Currently, it is difficult to study and quantify feedbacks of chemistry/aerosol parameterizations on weather and climate because of the structurally diverse ways in which chemistry/aerosol modules are integrated in ESMs. This complexity interferes with testing the aerosol/chemistry modules independently of the meteorology. Furthermore, the plethora of different structures used in ESMs to connect the aerosol/chemistry modules makes it impossible to swap modules between ESMs, which makes it difficult to isolate the uncertainties attributable to the aerosol/chemistry module versus the atmospheric model. Improving the interoperability of aerosol/chemistry modules opens the door to addressing important uncertainties in ESM predictions, such as aerosol-cloud interactions, direct aerosol radiative effects, and climate impacts on air quality.

To find avenues for increased interoperability and collaborative research, the virtual Generalized Aerosol/Chemistry Interface Workshop was held on February 16, 2022. The organizing committee was international and multi-institutional: Natalie Mahowald (Cornell University), Alma Hodzic (NCAR), Ligia Bernardet (NOAA Global Systems Laboratory and DTC), Pete Bosler (Sandia National Laboratory), Tom Clune (NASA), Matt Dawson (NCAR), Barron Henderson (Environmental Protection Agency - EPA), Jeff Johnson (Cohere Consulting, LLC), Xiaohong Liu (Texas A&M University), Po-Lun Ma (Pacific Northwest National Laboratory, Naval Research Laboratory), Benjamin Murphy (EPA), Nicole Reimer (University of Illinois), and Michael Schulz (Meteorologisk institutt Norway).

The 76 workshop participants represented NOAA, many US national laboratories (NCAR, the Joint Center for Satellite Data Assimilation, NASA, the Jet Propulsion Laboratory, DOE Pacific Northwest National Laboratory, and Sandia National Laboratory), US universities (California Irvine, Columbia, Cornell, Miami, Michigan, Texas A&M, and New York SUNY), and international institutions (Brazil, China, Finland, France, Germany, Poland, and the United Kingdom); see the chart for the distribution. During the three-hour event, participants worked in breakout groups to discuss the complexities and requirements of specific aerosol/chemistry-related processes, such as the interactions of aerosols with radiation and microphysics, data assimilation, emissions, removal, diagnostics, and many more.


Affiliation of the Feb 2022 workshop participants.

As a follow-up to the workshop, several virtual hackathons were held. A hackathon is a design-sprint-style event in which software engineers and scientists involved in software development collaborate intensively on software projects. These events were conducted to examine the practical aspects of building an interface that can be used across multiple ESMs.

Two hackathon events took place on April 29 and May 6, 2022. During the week between the two events, the hackathon participants worked on the problem individually (as opposed to synchronously, when they all came together virtually as a group). The first hackathon in the series focused on building an interface for computing aerosol optical properties and their feedbacks on radiation, a relatively straightforward but key process for estimating aerosol impacts on weather and climate. Hackathon organizers supplied a single-dimensional (box) host model, dubbed a driver, which was used and modified by the participants to call their aerosol optics code and force it with prescribed aerosol descriptions (properties such as composition and size). The purpose of this first step was to identify the required elements of an aerosol interface definition. Future work will incorporate chemical, microphysical, land, ocean, and other processes. The intention is that participants will represent various modeling centers and community ESMs, and will work on the interoperability of their aerosol optics code with a generic host model. A third hackathon event is planned for May 20.
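
To illustrate the driver-plus-interface pattern described above, here is a minimal, hypothetical Python sketch: a box driver holds a prescribed aerosol description and passes it through one generic interface to a scheme-specific optics routine. All names, data structures, and the toy parameterization are placeholders, not the actual code used at the hackathon.

```python
# Hypothetical sketch of the hackathon's box-driver pattern: a host "driver"
# holds prescribed aerosol descriptions and hands them to a scheme-specific
# optics routine through one generic interface. All names and numbers are
# illustrative placeholders, not the actual GIANT code.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class AerosolDescription:
    """Prescribed aerosol properties supplied by the driver (composition, size)."""
    species: str
    mass_mixing_ratio: float   # kg/kg
    dry_radius: float          # m

@dataclass
class OpticalProperties:
    """What the radiation code needs back from any optics scheme."""
    extinction: float          # 1/m
    single_scatter_albedo: float
    asymmetry_factor: float

# The "interface": any scheme only needs to map a description to optical properties.
OpticsScheme = Callable[[AerosolDescription, float], OpticalProperties]

def toy_optics(aerosol: AerosolDescription, wavelength: float) -> OpticalProperties:
    """Placeholder optics scheme; real schemes would use Mie or band lookups."""
    ext = 1.0e3 * aerosol.mass_mixing_ratio / max(aerosol.dry_radius, 1e-9)
    return OpticalProperties(extinction=ext, single_scatter_albedo=0.9,
                             asymmetry_factor=0.7)

def run_box_driver(scheme: OpticsScheme, aerosols: Dict[str, AerosolDescription],
                   wavelength: float = 550e-9) -> None:
    """The box (single-column, single-layer) host: loop over prescribed aerosols
    and call the plugged-in optics scheme through the common interface."""
    for name, aerosol in aerosols.items():
        props = scheme(aerosol, wavelength)
        print(f"{name}: extinction = {props.extinction:.3e} 1/m")

run_box_driver(toy_optics, {
    "dust": AerosolDescription("dust", 5.0e-8, 1.0e-6),
    "sulfate": AerosolDescription("sulfate", 2.0e-8, 0.2e-6),
})
```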

The goal of this hackathon, along with additional activities planned for the upcoming months, is to further inspire the community to develop a new interface (tentatively called the GeneralIzed Aerosol/chemistry iNTerface, or GIANT), which will support forward-looking studies involving the interaction of aerosols/chemistry with weather/climate and facilitate the exchange of chemistry/aerosol modules used in ESMs.

UFS SRW Application Training: The Wrap Up

Contributed by Michael Kavulich, Jr
Winter 2022

Supporting the development of the Unified Forecast System (UFS) has been a central focus of the DTC for several years now. Many staff have concentrated on support, development, and training of this new state-of-the-art system that bridges the gap between the research and operational communities. In September 2021, we took another significant step in this process when the DTC hosted the first training event for the UFS Short-Range Weather (SRW) Application.

The UFS SRW Application Training Event was held 20–24 September 2021 and was attended virtually by more than 100 students, scientists, software engineers, and other professionals from more than a dozen time zones around the world. The goal was to teach those in attendance not only how to build, run, and customize the App, but also to impart the overall vision of the UFS and to invite them to contribute code and scientific enhancements back to the UFS community.


Screenshot of some attendees and lecturers from the UFS SRW Training Event

 

Over the course of the five-day, Monday through Friday training event, the attendees were treated to a wide suite of lectures from subject-matter experts representing the DTC staff and various NOAA labs and centers, including the Environmental Modeling Center (EMC), Geophysical Fluid Dynamics Laboratory (GFDL), and Global Systems Laboratory (GSL). On the first day, the overall vision of the UFS SRW Application in research and operations was explained, before delving into the basics of installing, configuring, and running the Application. More details about the pre-processing utilities were covered on day 2, along with several in-depth lectures on the atmospheric model and its dynamical core. Day 3 included details about prerequisite libraries, using different physics packages, post-processing and graphics/visualization. More in-depth topics were introduced on day 4, particularly geared towards scientific users and code developers seeking to make changes to the application and potentially contribute those changes back to the community code. Finally, Friday included presentations on future development from operational and research groups in EMC and GSL. 

 


UFS SRW Training Attendees displayed by organizational type

 

In addition to lectures, hands-on sessions were conducted each day for a subset of the attendees to practice the installation and use of the application using the NCAR Cheyenne supercomputer. On days 4 and 5, open forums were held for discussion and questions, exploring topics such as computational performance, customization, and visions for future capabilities and adding new components to the system.

The training event may be over, but you can still benefit from the plentiful training materials. Videos of the lectures, as well as the slides, have been posted online on the agenda page. We invite anyone interested in the UFS SRW Application to take advantage of these materials, as well as other resources such as the Users Guide, Support Forums, and GitHub page.

 

Introducing the Unified Forecast System Case Studies Platform

Contributed by Xia Sun, Dominikus Heinzeller, and Ligia Bernardet
Autumn 2021

The Unified Forecast System (UFS) Case Studies platform offers resources for conducting case studies that represent the evolving forecast challenges of NOAA’s operational Global Forecast System (GFS). This platform is the outcome of a DTC project funded by the 2019 Disaster Related Appropriation Supplemental: Improving Forecasting and Assimilation (DRAS 19 IFAA), also known as NOAA’s Hurricane Supplemental. The platform’s overarching goal is to facilitate the development of model physics innovations and their transition to operations by exercising the Hierarchical Testing Framework (HTF) coupled with the Common Community Physics Package (CCPP). In support of this goal, the project is developing a CCPP capability to output physics tendencies, and is leveraging newly developed CCPP capabilities and the CCPP Single Column Model (SCM) to investigate forecast issues from the UFS Case Studies catalog, demonstrating to the community how to utilize this platform.

Case studies are an integral component of the HTF as they serve as an entry point for model innovations to be carefully exercised before they can be considered for comprehensive tests that use larger samples. The current set of ten cases covers atmospheric phenomena such as local temperature inversions, hurricanes, winter storms, extreme temperature events, and summertime convection. These cases illustrate forecast issues identified for GFS v16, which has been operational since March 2021. The case catalog was established in collaboration with both the Model Evaluation Group (MEG) at NOAA Environmental Modeling Center (EMC) and colleagues at NOAA Global Systems Laboratory (GSL). 


Snapshot of the landing page of the UFS Case Studies Platform

The newly established UFS Case Studies platform includes initial condition (IC) datasets housed on Amazon S3 cloud storage, model configurations, simulation results (including the general synoptic patterns and particular meteorological fields that elucidate model issues), and a suite of data-visualization example scripts written in Python. Cases included in the UFS Case Studies Platform have been incorporated into the physics testing procedure of the UFS Research to Operations (R2O) Project, a multi-year effort that convened NOAA and non-NOAA scientists to work on the operational model.
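
As an illustration of how publicly hosted IC data can be retrieved programmatically, the sketch below uses boto3 with anonymous (unsigned) access; the bucket and object names are hypothetical placeholders, so consult the ufs-case-studies documentation for the actual locations of each case.

```python
# Minimal sketch of fetching a case's initial-condition archive from public
# Amazon S3 storage with anonymous (unsigned) access. The bucket and key
# names below are hypothetical placeholders, not the platform's real paths.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
bucket = "example-ufs-case-studies"            # placeholder bucket name
key = "2018HurricaneMichael/ic_data.tar.gz"    # placeholder object key
s3.download_file(bucket, key, "ic_data.tar.gz")
print("Downloaded ic_data.tar.gz")
```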

This platform is designed to serve both the research community and model developers. For researchers, the platform provides resources based on the public releases of the UFS Medium-Range Weather (MRW) Application (App) and the UFS Short-Range Weather (SRW) App. The MRW App targets global weather predictions on timescales out to about two weeks, while the SRW App targets limited-area predictions on timescales of several days. For developers, the platform helps ensure that the advanced physics schemes implemented in the next version of the operational model outperform earlier ones. The platform provides information and verification results for ongoing model development using codes hosted in the official UFS GitHub repositories. It is possible to test alternative physics suites composed of different sets of physics parameterizations in addition to the suites supported in the UFS public releases. The platform provides information and links to guide interested readers.


Snapshot of the case page for 2018 Hurricane Michael

In summary, the UFS Case Studies Platform bridges the gap between the research and operational communities with resources that foster and advance model development. The platform will continue to evolve as work towards GFS version 17 progresses. The UFS Case Studies Team encourages questions, comments, and feedback from users. These can be posted under the Discussions section of the ufs-case-studies GitHub repository.

DTC Workshop on Integrating Cloud and Container Technologies into University NWP Curriculum

The Recap

Contributed by Kate Fossell, with Michelle Harrold, Mike Kavulich, John Halley Gotway, and Jamie Wolff
Summer 2021

The DTC convened a live, virtual three-day workshop 7-9 June 2021 tailored toward university faculty interested in integrating software container and cloud technologies into new or existing Numerical Weather Prediction (NWP) curricula. Of the nearly 40 interested registrants from over 25 different institutions, approximately 16 active participants attended the three-day event. The majority of attendees were professors of NWP courses or were interested in teaching NWP in the future, along with a few participants from industry and research.

The workshop was inspired by recent partnerships with university programs to incorporate a containerized end-to-end NWP system online tutorial, recently established by the DTC, into their courses. The success of these partnerships prompted an eagerness to solicit further input from faculty to enhance and refine the system based on the needs and constraints of university professors. The goals of the workshop were two-fold: first, to inform the target audience about the tools made available by the DTC and how these technologies have been used in university classrooms; and second, to solicit feedback and facilitate dialogue with faculty about the existing content and how it may be incorporated into their curricula.

The workshop included a variety of session types to provide a broad spectrum of information and experiences and to encourage engagement and thoughtful discussions. The workshop kicked off with an overview of the NWP systems established by the DTC, after which participants were given access to Amazon Web Services (AWS) to practice the online tutorial and become familiar with the containerized NWP system. Additional demonstrations and practice sessions were provided later in the workshop as well, including a focus on modifying and customizing the system. The DTC invited two of the participants, Thomas Guinn from Embry-Riddle Aeronautical University and John Mejia from the Desert Research Institute, to share an overview of their current NWP coursework to set the stage for opportunities and challenges that exist in current curriculum approaches. Following these presentations, Sam Ng from Metropolitan State University - Denver and John Allen from Central Michigan University gave testimonial talks about their recent experience working with the DTC to incorporate the end-to-end containerized NWP system into their classes, which provided practical examples and experiences to seed further discussion. Additionally, a representative from AWS presented information on the AWS Academy as an opportunity for faculty to train in AWS resources.


DTC Workshop Diagram

At the core of the workshop were the breakout group discussions, in which participants had the opportunity to think critically about the feasibility of using this content in their classes, share concerns, limitations, and potential solutions, and pose questions for the DTC to consider for future improvements and enhancements. The feedback was quite positive and encouraging; overall, participants recognized the benefit of containers and felt the integration into the classroom curriculum could foster excitement for NWP! Some of the specific takeaways and suggestions expressed were:

  • Broad interest in using AWS or cloud computing for NWP components, imbuing students with new technical skills (e.g., cloud computing, containers) applicable to non-meteorology jobs as well;
  • Need for more options to better align the detail of NWP components, containers, and cloud computing to the appropriate course level and skill sets;
  • Desire to coordinate with the DTC to implement NWP containers in class curricula, including getting started with AWS and providing guest lectures;
  • Requests for more tutorials, instructional videos, etc., as well as expanded case types;
  • Recognition of potential hurdles, including heterogeneity of operating systems (e.g. Mac, Linux, Windows) used by students and accompanying user instructions, spin-up time for instructors and students to learn about cloud computing and NWP, and costs associated with cloud computing;
  • Excitement for containerized MET and METviewer for easier verification; and
  • Interest in participating in an AMS or AGU education conference or short course.

For anyone interested in learning more about this effort or potential collaboration, please reach out to Kate Fossell at fossell@ucar.edu. The DTC is looking to connect with interested parties to continue the conversations begun at the workshop and facilitate new collaborations.  Those who express interest will be invited  to a newly established Slack workspace.  Workshop presentations are now available on the workshop website.


Syllabus 700 level

DTC Embarks on International Project to Provide Information about Model Physics Uncertainty

Contributed by Michael Ek (DTC, NCAR/RAL/JNT), Ligia Bernardet (DTC, NOAA/GSL)
Spring 2021

The “Model Uncertainty Model Intercomparison Project” (MUMIP) is an international effort to better understand model-physics uncertainty and how to represent it in stochastic physical parameterizations. After all, physical parameterizations provide an approximate solution to physical processes occurring in a grid box and are, as such, a source of forecast-model uncertainty due to a large variety of factors, e.g., unresolved subgrid-scale variability treated as a grid-box mean, unknown parameter values, excluded physical processes, structural errors, incomplete calculations of processes, or inherent process uncertainty.

MUMIP is a joint project of the WMO Working Group on Numerical Experimentation (WGNE) and the World Weather Research Programme (WWRP) Working Group on Predictability, Dynamics and Ensemble Forecasting (PDEF), working groups dedicated to the development of Earth system models for use in weather, climate, water, and environmental prediction on all time scales, and to diagnosing and resolving model shortcomings and uncertainties. Scientists from a number of national and international centers, including the University of Oxford, the University of Reading, the UK Met Office, the European Centre for Medium-Range Weather Forecasts (ECMWF), Météo-France, Deutscher Wetterdienst (DWD, the German Weather Service), NOAA’s Physical Sciences Laboratory (PSL), NCAR’s Mesoscale and Microscale Meteorology Laboratory (MMM), and now the Developmental Testbed Center (DTC), will be contributing to this project.

While stochastic physics schemes are often tuned using ad-hoc methods, objective methods derived from physical constraints can be used to better inform the development and improvement of schemes, which is the focus of the MUMIP project. The uncertainty in parameterizations may be addressed by stochastic methods, which aim to select a random state consistent with the resolved state. The objective methods in MUMIP then inform the development of deterministic and stochastic schemes, i.e., by comparing state variables and tendencies in a convection-permitting high-resolution model simulation against a lower-resolution parameterized-convection model simulation. This is done by “coarse-graining” the high-resolution simulation (i.e., computing spatio-temporal averages) onto the grid of a lower-resolution simulation. The premise is that when the parameterizations work perfectly, the statistics of the state variables in the coarse-grained higher-resolution simulation should match those of the parameterized lower-resolution simulation. In reality, however, discrepancies are often discovered when performing this type of comparison, and such discrepancies can then be used to improve the physical parameterizations. Additionally, the high-resolution distribution offers useful information about the subgrid-scale uncertainty that helps to objectively inform stochastic parameterizations.
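
For a purely spatial illustration of the coarse-graining step, the sketch below block-averages a high-resolution 2-D field onto a coarser grid whose spacing is an integer multiple of the fine grid. The field, grid sizes, and averaging factor are illustrative; real MUMIP workflows also average in time and account for the actual model grids.

```python
# Minimal numpy sketch of spatial coarse-graining: block-average a
# high-resolution 2-D field onto a coarser grid whose spacing is an integer
# multiple of the fine grid spacing. The factor and field are illustrative.
import numpy as np

def coarse_grain(field, factor):
    """Average non-overlapping factor x factor blocks of a 2-D field."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Example: a 300 x 300 "high-resolution" field averaged onto a 30 x 30 grid.
hi_res = np.random.default_rng(2).normal(size=(300, 300))
lo_res = coarse_grain(hi_res, factor=10)
print(hi_res.shape, "->", lo_res.shape)   # (300, 300) -> (30, 30)
```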

MUMIP participants will run an array of approximately 40,000 Single Column Model (SCM) simulations forced by coarse-grained high-resolution model output (Figure 1), initially from the DWD ICON (3-km) model. In what is planned to be a 3-year project, the DTC will use the Common Community Physics Package (CCPP) SCM, as well as a coarse-grained 3-km NOAA Unified Forecast System (UFS) simulation. In order to use this forcing data, the SCMs will ingest forcing fields using the DEPHY format, a standard agreed upon at the International Workshop for SCM/LES comparisons organized and hosted by Météo-France in June 2020. The use of the DEPHY format, which has already been implemented in the CCPP SCM, is key for complementary initiatives towards improvement and tuning of model physics.  In April 2021, DTC staff participated in the workshop on the Improvement and Calibration of Clouds in Models organized by Météo-France, where the focus was to discuss and share the latest improvements in parameterizations. In summary, there are a number of ongoing multi-institutional initiatives related to SCMs and their role in hierarchical model development, and the DTC is taking advantage of these collaborations to pursue improvements in model physics.


 

Figure: High-resolution model output (small grids) is coarse-grained and mapped to a coarser grid (large boxes) to provide column forcing to drive an array of CCPP SCMs.

 

Building a Community through UFS Medium-Range Weather Application Users’ Training

Contributed by Jamie Wolff
Winter 2021

The DTC, in cooperation with subject-matter experts from NOAA's Environmental Modeling Center (EMC) and Geophysical Fluid Dynamics Laboratory (GFDL), as well as NCAR’s Climate and Global Dynamics (CGD) Laboratory, hosted a live, virtual training session for the Unified Forecast System (UFS) Medium-Range Weather (MRW) Application 4-6 November and 9 November 2020.

The UFS MRW Application targets predictions of global atmospheric behavior out to about two weeks. This training was designed to teach community users how to set up and run the latest officially released UFS MRW Application (version 1.1) for their own experiments. The training comprised a wide range of sessions taught by highly experienced experts and developers in the field. Live lectures were presented by experts on the various UFS components, including the CIME-based workflow, the Finite-Volume Cubed-Sphere (FV3) dynamical core and physics suite options, and pre- and post-processing. In addition to lectures, live virtual practice sessions were hosted to broaden experience with building, running, and modifying the system to take full advantage of the supported capabilities for research and forecasting. Throughout the training, participants were able to interact directly with the subject-matter experts to gain a deeper understanding of the system and how to configure it for their purposes. The final day provided an optional "deeper dive" for developers that covered advanced subjects, including code modification, domain configuration, and repository management protocols. The slides from these presentations, as well as recordings, are available on the DTC website.

A total of 34 participants registered for the event, representing eight different time zones! Upwards of 54 participants took part in some sessions, with the average being 40-45 attendees including instructors. We had enthusiastic participation in all of the lectures, and the instructors were able to provide prompt and detailed answers to questions raised during the practical sessions using Google Meet and Slack.

Although a virtual meeting was not the preferred method for teaching the material, it nevertheless was a great success, according to the participants. Feedback included: "I've really appreciated the practicals and the rapid feedback in the Slack channels." and "I personally enjoyed this training very much and learned a lot. The materials really helped me consolidate some of my knowledge and skill for different components and functionalities in the UFS apps."


'Build diagram' discussed at the Unified Forecast System Medium-Range Weather Application Users’ Training

Introducing Undergraduate Students to NWP by Using Software Containers

Contributed by Sam Ng
Autumn 2020

Jamie Wolff, the lead of the software containers and cloud computing task in the DTC, contacted me in the Spring of 2019 about a unique opportunity to collaborate on a course that would offer interested students at Metropolitan State University an opportunity to run and experiment with an end-to-end numerical weather prediction (NWP) system utilizing cloud computing.

The motivation behind this collaboration was to introduce undergraduates in a bachelor's-degree meteorology program to NWP in a stress-free environment, one in which students would not spend unnecessary time configuring, compiling, and optimizing the code and libraries for the NWP system. The DTC containers would streamline running the NWP system so the students could focus on how valuable NWP could be as part of their toolkit. The overarching goal was to empower the students to use and understand NWP models on a fundamental level and prepare them for post-undergraduate positions in the NWP arena.

We decided that an active-learning approach, delivered as a series of workshops and seminars, would be the ideal format for presenting the material.

During the 2019 Fall semester, I taught a Forecasting Lab course that met once a week for 2 hours. The extended class time allowed the DTC team to teach the attendees how to compile and run the Weather Research and Forecasting (WRF) Model on the Amazon Web Services (AWS) cloud. The DTC team met with my students for a total of eight hours (six face-to-face hours and two hours via Google Meet).

The initial design of the WRF Cloud Computing (WCC) course was somewhat challenging because setting up and configuring AWS access from our UNIX-based workstations faced some computing and financial hurdles. Nevertheless, the DTC team created Docker container images for each component of the end-to-end WRF-based NWP system on the AWS Cloud. Docker makes it easier to create, deploy, and run applications by using "containers." A container packages code and all of its dependencies so the application runs quickly and reliably from one computing environment to another (Docker website). The end-to-end system containers were built and configured by the DTC team ahead of classes. With the components accessible in containers, it was possible to efficiently and effectively create a full end-to-end NWP system teaching toolkit for the students.

Next, the DTC team provided step-by-step instructions on how to use those images to access the WRF data and ecosystem and run several case studies through the end-to-end NWP system, from pre-processing through running the model to post-processing, visualization, and verification. The DTC team also recommended installing Docker on the MSU Denver Weather Lab network so the students could run the end-to-end NWP container system locally on our in-house desktops as well.
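For readers curious what running one of these containerized steps looks like in practice, the minimal sketch below launches a hypothetical containerized component with a case directory mounted from the host. The image name, mount path, and entry script are placeholders, not the actual DTC container names; the step-by-step class instructions define the supported workflow.

```python
import subprocess

# Hypothetical image and paths; the real DTC container names and mount
# points are documented in the step-by-step instructions mentioned above.
image = "example/wps-wrf-container:latest"
case_dir = "/home/student/case_data"

# Run one containerized step of the end-to-end system, mounting the case
# data into the container so its output lands back on the host.
subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", f"{case_dir}:/case_data",   # host_dir:container_dir
        image,
        "run_component.sh",               # placeholder entry script
    ],
    check=True,
)
```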


Was the class a success? Here are a few student quotes that suggest it was.

“I thought the NCAR instructors were knowledgeable, patient, and helpful…”

“Was awesome to have the hands-on help”

“I enjoyed the class very much and want to express gratitude to all those from NCAR and Dr Ng for their time and patience with us (or at least me) in learning something foreign and more difficult to grasp.”


Metropolitan State University classroom Forecasting Lab course.

UFS Users’ Workshop

Contributed by Jeff Beck and Weiwei Li
Summer 2020

The first Unified Forecast System (UFS) Users’ Workshop, held on 27-29 July 2020, convened a broad cross-section of the community, despite transitioning from the originally planned in-person format to a virtual format. Organized by a diverse committee from NOAA Global Systems Laboratory (GSL), NOAA Physical Science Laboratory (PSL), NOAA Environmental Modeling Center (EMC), NOAA National Weather Service (NWS), the National Center for Atmospheric Research (NCAR), and George Mason University, the workshop presenters gave one hundred and ten talks to nearly five hundred attendees who registered for the event. While NOAA had a strong presence at the workshop, over 46% of the participants were not affiliated with NOAA, with a strong showing from the academic community (approximately 25%) representing over 44 different academic institutions. The private sector was the third largest contingent, comprising almost 10% of the registrants. NOAA participants were almost equally split between the NWS and NOAA Research. The participants and presenters also represented diverse expertise spanning weather, climate, hydrology, oceanography, space weather, modeling, and high-performance computing.

The workshop kicked off with introductory remarks from Dr. Neil Jacobs, the Assistant Secretary of Commerce for Environmental Observation and Prediction, and an overview of the UFS was presented by the co-leads of the UFS Steering Committee, Dr. Ricky Rood of the University of Michigan and Dr. Hendrik Tolman, the NWS Senior Advisor for Advanced Modeling Systems.  The introductory session was rounded out by presentations by the leads for the six UFS application teams: Medium-Range Weather (MRW) and Subseasonal-to-Seasonal (S2S), Short-Range Weather (SRW), Hurricane, Space Weather, Coastal, and Air Quality.  The remainder of the workshop was a mix of plenary and parallel sessions focused on six topic areas: 

  • UFS Updates, Cloud Computing, Infrastructure, and Computational Performance
  • Model Dynamics, Physics, and Air Quality
  • Data Assimilation, Ensembles, and Predictability
  • Regional Configurations and Extremes: Development and Applications
  • Verification, Evaluation, and Post Processing
  • Earth-System Modeling (Land/Hydrology, Ocean, Sea Ice, Space Weather, Cryosphere)


The participants of the first UFS Users’ Workshop enjoyed lively engagement and discussions, despite the virtual format. To facilitate exchanges that would normally take place over coffee breaks, lunch, or a reception, the organizing committee set up Slack channels where workshop participants could post questions and comments to the speakers, and dialogue was encouraged to continue on this platform following each presentation. Each general topic area was given its own Slack channel to make the forum easier to navigate and to foster dialogue during the parallel sessions.

Find Out More About UFS


Slack, a channel-based messaging platform, was used at the first UFS Users’ Workshop to help participants navigate discussions and foster dialogue during the parallel sessions.

Communication and Outreach in the Unified Forecast System

Contributed by Cecelia DeLuca
Spring 2020

The Unified Forecast System (UFS) is a coupled, comprehensive Earth modeling and data assimilation system that will be used in NOAA operations and by the research community. There are naturally a lot of questions about UFS from potential collaborators, for example: What UFS codes are available to run? How do I find out more information about the project, and how can I get involved?

One of the roles of the UFS Communication and Outreach Working Group (C&O WG, for short) is to communicate the answers to such questions. Members include physical scientists, social scientists, NOAA public affairs representatives, managers, and software developers. The C&O WG also established a “UFS Focus Group,” which is a diverse collection of 50+ people, populated by graduate students, scientists, field officers, and others, who have volunteered their time to review and test UFS products.

The first product generated by the C&O WG was the UFS Portal, at ufscommunity.org. Launched last year following a Focus Group review, the Portal is a one-stop shop for all things UFS: news, upcoming events, highlights of current activities, documents, and plans. Looking for answers to the questions above? Information about the March 2020 release of the UFS, the Medium-Range Weather Application 1.0, is available on the Portal along with a link to support forums. There is an overview of all UFS applications (e.g., short-range weather, subseasonal-to-seasonal prediction, space weather), a description of the UFS governance structure and working groups, and information about how to get involved.

The role of the C&O WG extends beyond the Portal to more general aspects of UFS communications. Making sure that UFS participants have a space where they can work together easily is critical, which is why the C&O team collaborated with the NOAA Environmental Modeling Center (EMC) and their UFS partners to set up a GitHub organization, repositories, and wikis for UFS applications. Making the code available, testing it, and sharing documentation are key aspects of community participation. The ufs-community organization on GitHub, at https://github.com/ufs-community/ufs/wiki, is how UFS releases are being distributed.

The C&O WG is also working with community members to understand and document the usability of UFS code. To do this, the C&O WG has engaged with UFS software developers to prepare and distribute "Graduate Student Tests," or GSTs. The UFS project defines the GST as the ability of a student to easily get code, run code, change code, test code for correct operation, and compare and evaluate results. The most recent GST was released with the Medium-Range Weather Application, and it is open to all, not just graduate students! Evaluators get, build, and run an example, change a physics parameter, rerun, and visually compare results, all in less than six hours. Then they fill out a questionnaire about their experience. Results will be shared on the Portal and the feedback used to improve future releases. Like other C&O WG activities, the GST encourages communication that will help to advance the UFS scientifically and computationally.
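As a rough illustration of the "visually compare results" step, the short sketch below differences a 2-m temperature field from a control run and a rerun with a changed physics parameter. The file names and the variable name are hypothetical; the actual GST instructions specify which files and fields to compare.

```python
import xarray as xr
import matplotlib.pyplot as plt

# Hypothetical output files from the control run and the rerun with a
# modified physics parameter; the GST instructions define the real names.
control = xr.open_dataset("control_run.nc")
experiment = xr.open_dataset("modified_physics_run.nc")

# Difference of a hypothetical 2-m temperature variable at the final time.
diff = (experiment["t2m"] - control["t2m"]).isel(time=-1)

diff.plot(cmap="RdBu_r")  # red/blue map highlighting where the change matters
plt.title("2-m temperature: modified physics minus control")
plt.savefig("gst_comparison.png", dpi=150)
```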

Interested in participating in a GST? See https://ufscommunity.org/index.html#/science/gst

Container & CCPP AMS Short Courses

Contributed by Kate Fossell and Grant Firl
Autumn 2019

One of the primary goals of the DTC is to provide software and infrastructure that aid in transitions between the research and operational communities. The American Meteorological Society (AMS) provides an ideal venue for sharing these tools with the community through the AMS short course offerings at the Annual Meeting. The DTC is looking forward to presenting two short courses at the 100th AMS Annual Meeting in Boston, MA in January 2020.

“Integrating Numerical Weather Prediction (NWP) System Components Using Container Technology and Cloud Services”

The goal of this course is to raise awareness about tools and facilities available to the community for testing and evaluating Numerical Weather Prediction (NWP) innovations, including the emerging set of software tools in reusable containers and cloud computing resources, through hands-on learning. Containerized software bundles the operating system, code, library dependencies, and executables needed to both build and run software packages in any computing environment. The DTC has leveraged this technology to create a portable end-to-end system composed of various NWP components such as the Weather Research and Forecasting model (WRF), Gridpoint Statistical Interpolation (GSI) data assimilation system, Unified Post Processor (UPP), and Model Evaluation Tools (MET). These can be executed on any platform, including in the cloud, without the typical upfront time and frustration of building the software packages from the ground up. The course will introduce the concept of containerized software, provide an overview of the NWP components available from the DTC, and offer a hands-on tutorial that will allow participants to use the containers to complete case study examples using cloud services. While this course may appeal to a wide-reaching audience, this information may be particularly useful to undergraduate and graduate students interested in learning more about NWP and to university faculty who may find software containers and cloud computing to be useful teaching tools to add to their course curricula. The online tutorial for the end-to-end NWP containers is publicly available at: https://dtcenter.org/community-code/numerical-weather-prediction-nwp-containers/tutorial-version-3.


“Experimentation and Development of Physical Parameterizations for Numerical Weather Prediction Using a Single-Column Model and the Common Community Physics Package (CCPP)”

This half-day course will teach participants how to develop and experiment with physics parameterizations within the CCPP framework. CCPP is the mechanism adopted by NOAA to drive atmospheric physics within Unified Forecast System (UFS) applications. NCAR also plans to use CCPP in their modeling systems (e.g., WRF, MPAS, CESM). A single-column model will be used to demonstrate how the CCPP framework works and to expose participants to physics suites available in the CCPP. The use of prepared, observationally-based cases combined with this tool’s computational simplicity will allow participants to grasp relevant concepts related to the CCPP and to conduct basic experiments. Graduate students, physics developers and researchers, as well as those with a general interest in working within NOAA frameworks could benefit from attending this course.


AMS Short Course

100th AMS Meeting

Model Evaluation for Research Innovation Transition (MERIT)

Contributed by Jeff Beck
Spring 2019

The Model Evaluation for Research Innovation Transition (MERIT) project provides a critical framework for physics developers to test innovations within their schemes using selected meteorological cases that have been analyzed in depth.  Comparing their results to baseline MERIT simulations will allow developers to determine whether their innovations address model shortcomings and improve operational numerical weather prediction.

For the DTC’s Annual Operating Plan (AOP) for 2018, three high-impact global FV3 baseline cases were selected for in-depth analysis: the Mid-Atlantic blizzard of January 2016, Hurricane Matthew, and the May 2017 severe weather outbreak in the Southern Plains. These cases were chosen after consultation with the Model Evaluation Group (MEG) at NOAA’s Environmental Modeling Center, as each case exposes known deficiencies in the global configuration of the Finite-Volume Cubed-Sphere (FV3) model. Multiple-day simulations were run using an end-to-end workflow developed to handle the pre-processing of initial conditions, the integration of the model, post-processing with the Unified Post Processor (UPP), and verification with the Model Evaluation Tools (MET). Also, in collaboration with the MEG, the MERIT team has been working on developing and applying unique verification techniques and metrics that will help assess the impact that physics innovations may have on these known FV3 biases. In particular, the progression of certain meteorological features will be assessed through the MET Method for Object-Based Diagnostic Evaluation (MODE) time-domain/storm-relative feature analyses.

The initial focus of MERIT continues to be on existing capabilities available in the global model framework. However, this activity is expected to include high-resolution/convection-allowing modeling as the Stand-Alone Regional (SAR) FV3 becomes available. Providing the research and operational communities with an end-to-end framework will streamline the testing process, leading to more effective and efficient physics development. It will also encourage community engagement and provide an infrastructure that supports R2O and O2R.

See https://dtcenter.org/eval/meso_mod/merit/.


MERIT's three high-impact global FV3 baseline cases.

The DTC helps the research community enhance the GSI/EnKF operational data assimilation system

Winter 2019

The Gridpoint Statistical Interpolation (GSI) and Ensemble Kalman Filter (EnKF) are operational data assimilation systems that are open to contributions from scientists and software engineers in both operations and research. The development and maintenance of the NOAA GSI/EnKF data assimilation systems are coordinated and managed by the Data Assimilation Review Committee (DARC), which incorporates all major GSI/EnKF data assimilation development teams in the United States within a unified community framework. DARC established a code review and transition process for all GSI/EnKF developers, reviews proposals for code commits to the GSI/EnKF repository, and ensures that coding standards are met and required tests are completed. Once DARC approves, the contributed code is committed to the GSI/EnKF code repository and becomes available for operational implementation and public release.

The Developmental Testbed Center (DTC) Data Assimilation (DA) Team serves as a bridge between the research and operational communities by making the operational data assimilation system available as a community resource and by providing a mechanism to commit innovative research to the operational code repository. Prospective code contributors can contact the DTC DA helpdesk to prepare, integrate, and document the expected impact of their code and ensure that any proposed code change meets GSI coding standards. They can also apply to the DTC Visitor Program for their DA research and code transition.

Since the code transition procedures were established, the DTC DA team has helped many researchers contribute code to the repository:

  • NOAA/GSD and NCAR MMM scientists improved chemical initial conditions for WRF-Chem and GOCART forecasts by using WRF-Chem and GOCART as background to analyze surface measurements of fine particulate matter (PM2.5) and MODerate resolution Imaging Spectroradiometer (MODIS) total Aerosol Optical Depth at a wavelength of 550 nm. These functions are available through the DTC GSI release and have been used by many researchers, including Barbara Scherllin-Pirscher from the Central Institute for Meteorology and Geodynamics, Vienna, Austria. Scherllin-Pirscher is a DTC visitor working to further enhance GSI chemical analysis by assimilating vertical light detection and ranging (LIDAR) measurements to improve the vertical aerosol representation in WRF-Chem forecasts.

  • The DTC hosted Mengjuan Liu from the Shanghai Meteorological Service to study how to use GSI to improve the surface data analysis. Liu found that the conventional observation operator can introduce large representativeness errors when surface conditions are inhomogeneous, such as along coastlines. An improved forward model for surface observations along the coastline was developed and added to the GSI repository; a conceptual sketch of this like-surface idea is shown after this list. This new forward observation operator was used in the recent operational Rapid Refresh/High-Resolution Rapid Refresh (RAP/HRRR) update.

  • The DTC Visitor Program hosted Ting-Chi Wu from CIRA/CSU to add the capability to assimilate solid-water content path (SWCP) and liquid-water content path (LWCP), satellite-retrieved hydrometeor observations from the Global Precipitation Measurement (GPM) mission derived with the Goddard PROFiling algorithm (GPROF).

  • The DTC Visitor Program hosted Karina Apodaca from CIRA/CSU to incorporate two new lightning flash rate observation operators suitable for the Geostationary Operational Environmental Satellite (GOES)/Global Lightning Mapper (GLM) instrument in the GSI variational data assimilation framework.  One operator accounts for coarse resolution and simplified cloud microphysics in the global model to evaluate the impact of lightning observations on the large-scale environment around and prior to storm initiation. Another forward operator for use with non-hydrostatic, cloud-resolving models permits the inclusion of precipitating and non-precipitating hydrometeors as analysis control variables.
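The coastline example above can be sketched conceptually: instead of interpolating from the nearest model grid points regardless of surface type, a forward operator can draw only on grid points whose land/sea type matches the observation site, reducing representativeness error where land and water meet. The sketch below is a conceptual illustration only, not the GSI implementation; the land-mask convention and the simple averaging are assumptions made for clarity.

```python
import numpy as np

def surface_forward_operator(model_field, landmask, obs_i, obs_j, obs_is_land):
    """Map a model surface field to an observation location (conceptual sketch).

    Only the surrounding grid points whose land/sea type matches the
    observation site contribute, so a coastal land station is not
    contaminated by neighboring water points (and vice versa).
    """
    # The four grid points surrounding the observation (indices assumed valid).
    neighbors = [(obs_i, obs_j), (obs_i + 1, obs_j),
                 (obs_i, obs_j + 1), (obs_i + 1, obs_j + 1)]

    # Keep only neighbors with the same surface type as the observation.
    matching = [(i, j) for i, j in neighbors if landmask[i, j] == obs_is_land]
    if not matching:            # no like-surface neighbor: fall back to all four
        matching = neighbors

    # A simple average of the like-surface neighbors stands in for a proper
    # distance-weighted interpolation.
    return np.mean([model_field[i, j] for i, j in matching])
```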

The GSI/EnKF code commit procedures established by DARC and the DTC successfully move innovative contributions into the repository. The DTC and its Visitor Program are a great resource for the research community to introduce new techniques and model components to advance numerical weather prediction technology.

Contributed by Ming Hu and Chunhua Zhou.

2018 DTC Community Unified Forecast System Test Plans and Metrics Workshop

July 30 - August 1, 2018

Summer 2018

The 2018 DTC Community Unified Forecast System Test Plans and Metrics Workshop was held at NOAA’s National Center for Weather and Climate Prediction (NCWCP) on July 30 - August 1, 2018. The major goal of this workshop was to work towards a community test plan with common validation and verification metrics for the emerging Unified Forecast System (UFS). The plan will serve as a guide for the Numerical Weather Prediction (NWP) community for testing and evaluating new developments to UFS models and components through standardized hierarchical testing, comparing both historical and real-time forecasts against observations and analyses.

The workshop organization was led by the Developmental Testbed Center (DTC), and the organizing committee was representative of various aspects of the NWP verification and validation (V&V) enterprise, including voices from those working on research, development, transitions, and operations. The membership of the organizing committee was:

  • Curtis Alexander - NOAA/Earth Systems Research Laboratory Global Systems Division (GSD)

  • Ligia Bernardet - CU/Cooperative Institute for Research in Environmental Studies (CIRES) at NOAA/GSD and DTC

  • Tara Jensen - National Center for Atmospheric Research (NCAR) and DTC

  • Jim Kinter - George Mason University/Center for Ocean-Land-Atmosphere Studies (COLA)

  • Sherrie Morris - NOAA Office of Science and Technology Integration (OSTI)

  • Jason Levit - NOAA/NCEP/Environmental Modeling Center (EMC)

  • Ryan Torn - State University of New York at Albany

  • Ivanka Stajner - NOAA OSTI on detail to NOAA/NCEP/EMC

The workshop was attended by approximately 100 participants, cutting across various sectors of the V&V community, including international organizations (the Taiwan Central Weather Bureau and the European Centre for Medium-Range Weather Forecasts (ECMWF)), universities, NASA, the NOAA National Environmental Satellite, Data, and Information Service (NESDIS), research laboratories, the Office of the Federal Coordinator for Meteorological Services and Supporting Research, the National Weather Service, testbeds, the Navy, the U.S. Air Force, and the private sector.

The workshop had a mix of presentations, discussion periods, and working sessions in which participants contributed to the three topic-based breakout sessions: test plans, metrics, and hierarchical testing. Metrics for all spatial and temporal scales of numerical weather prediction were discussed, along with emerging topics such as the verification of convection-allowing models, coupled Earth system models, and ensemble systems.

The last activity in the workshop was a summary of the working sessions’ discussions by their leads, which was presented to workshop participants and members of the UFS Strategic Implementation Plan (SIP) meeting. A report on the progress made during the workshop will be available in the next few months.

A link to the workshop page is here: 2018 DTC Community Unified Forecast System Test Plans and Metrics Workshop

 

2018 DTC Community Unified Forecast System Test Plans and Metrics Workshop Attendees

 


DTC staff host AMS Short Course on Containers

Contributed by Jamie Wolff
Spring 2018

A major hurdle for running new software systems is often building and compiling the necessary code on a particular computer platform. In recent years, the concept of using “containers” has been gaining momentum in the numerical weather prediction (NWP) community. This new container technology allows for the complete software system to be bundled (operating system, libraries, code, etc.) and shipped to users in order to reduce the spin-up time, leading to a more efficient setup process. A core mission of the DTC is to assist the research community in efficiently demonstrating the merits of new model innovations.  The development and support of end-to-end NWP containers is in direct support of that mission.

In recent years, a number of NWP software components (including pre-processing, the model itself, post-processing, graphics generation, and statistics computation and visualization) were implemented into Docker containers to better assist the user community. The work conducted by DTC staff leveraged previous efforts of the Big Weather Web (http://bigweatherweb.org), which initially established software containers for the WRF Pre-Processing System (WPS), Weather Research and Forecasting (WRF) model, and NCAR Command Language (NCL) components. From there, DTC staff expanded the containerized tools to include the Unified Post-Processor (UPP), Model Evaluation Tools (MET), and METViewer. Through this complementary work, a full end-to-end NWP system was established, allowing for verification of the model output and visualization of the statistical output.

Several DTC staff (Kate Fossell, John Halley Gotway, Tara Jensen, and Jamie Wolff) hosted a short course at the 98th AMS Annual Meeting in Austin, TX on 6 January 2018 that offered hands-on experience with the established software containers. In preparation for the short course, an online tutorial was created that can be accessed at: https://dtcenter.org/met/docker-nwp/tutorial/container_nwp_tutorial/index.php. If you are an undergraduate/graduate student, university faculty member, or researcher who is interested in these new tools, please check it out! Participants of the 2018 short course were complimentary of the day-long tutorial, and the DTC plans to offer it again next year at the Annual Meeting in Phoenix, AZ; stay tuned for more information regarding future training opportunities.


DTC staff and trainers

WRF Users' Workshop - June 2017

Autumn 2017

The first Weather Research and Forecasting model (WRF) Users’ Workshop was held in 2000. Since then, eighteen annual workshops have been organized and hosted by the National Center for Atmospheric Research in Boulder, Colorado to provide a platform where developers and users can share new developments, test results, and feedback. This exchange ensures the WRF model continues to progress and remain relevant.

The workshop program has evolved through the years. In 2006, instructional sessions were introduced, with the first focused on the newly developed WRF Pre-processing System (WPS). The number of users has grown since the WRF Version 3 release in 2008, so a lecture series on the fundamentals of physics was introduced in 2010 to train users to better understand and apply the model. Since that time, the series has covered microphysics, planetary boundary layer (PBL) and land surface physics, convection and atmospheric radiation. The series then expanded to address dynamics, modeling system best practices, and computing.

The 18th WRF Users’ Workshop was held June 12-16, 2017. The workshop was attended by 180 users from 20 countries, including 57 first-time attendees, and 130 papers were presented. On the first afternoon of the workshop, four lectures covered the basics of ensemble forecasting, model error, verification, and visualization of ensemble forecast products. The following days included nine sessions on a wide range of WRF model development and applications. On Friday, five mini-tutorials were offered on WRF-Hydro, the Model for Prediction Across Scales (MPAS) for WRF Users, the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers (VAPOR), the NCAR Command Language (NCL), and WRF-Python. All workshop presentations are available from http://www.mmm.ucar.edu/wrf/users/workshops/WS2017/WorkshopPapers.php.

The WRF Modeling System Development session included the annual update, plus status reports on WRF Data Assimilation (WRFDA), WRF-Chem, WRF software, Gridpoint Statistical Interpolation (GSI), Hurricane WRF (HWRF), and WRF-Hydro. A hybrid vertical coordinate was introduced in Version 3.9 of the Advanced Research WRF (ARW) that may improve prediction in the upper-air jet-streak region. Another notable addition to the model is the Predicted Particle Properties (P3) scheme, a new type of microphysics scheme.

Both data assimilation and model physics were improved in the operational application of WRF in the Rapid Refresh (RAP) and High-Resolution Rapid Refresh (HRRR) models. Advances were made in the Grell-Freitas cumulus scheme, Mellor-Yamada-Nakanishi-Niino (MYNN) PBL scheme and the Rapid Update Cycle (RUC) Land Surface Model (LSM). The HWRF operational upgrade included a scale-aware Simplified Arakawa Schubert (SAS) cumulus scheme, new Ferrier-Aligo microphysics schemes, and improved data assimilation. Other development and applications of WRF were also presented. Notably, the large-eddy simulation (LES) capability has been extended to many real-data applications in recent years.

There were two discussions during the workshop devoted to physics suites and model unification. Two suites of pre-selected physics combinations are now available in V3.9 that are verified to work well together for weather prediction applications. The second discussion was about model unification between WRF and the newer MPAS. While the two models remain independent, both are supported by the community and aspects of their development effort can be shared.

The next WRF Users’ Workshop will be held in June 2018.


Science Advisory Board Meeting - Sept 14-15

Winter 2017

The DTC hosted its annual Science Advisory Board (SAB) meeting on September 14-15, 2016. This annual event provides an opportunity for the DTC to present a review of its key accomplishments over the past three years to representatives of the research community and to collect input directed at shaping a strategy for the future.

 “The DTC enables and supports a wide variety of research-to-operations and operations-to-research activities,” said SAB Chair Russ Schumacher. “The SAB discussed ways that these activities can be further strengthened, including through model evaluation and maintaining operational model codes.  A topic of particular discussion was NOAA and NCEP’s goal to move toward a unified modeling approach.  This brings with it some great opportunities to connect the research and operational communities to enhance weather prediction, but will also pose challenges in familiarizing researchers with new modeling systems.”

The agenda included an Operational Partners Session, an opportunity for the new Environmental Modeling Center (EMC) Director Dr. Mike Farrar to present a vision for NOAA’s unified modeling, followed by the Air Force outlook for its modeling suite by Dr. John Zapotocny. DTC task area presentations reviewed key accomplishments with a focus on research to operations, and presented thoughts for possible research to operations activities, risks, and challenges within the coming three years. Break-out task area group discussions were productive, followed by the SAB recommendations session. The recommendations are detailed in the DTC SAB meeting summary, but the following are a few highlights.

With NCEP’s transition to a unified modeling system centered around the Finite-Volume Cubed-Sphere (FV3) dynamic core, the SAB voiced their belief that the community will need the DTC’s leadership in supporting FV3 and the NOAA Environmental Modeling System (NEMS) in future years, and that the DTC should build up internal expertise in advance of these transitions. They noted that no other organization in the U.S. has a core responsibility to be an unbiased evaluator of Numerical Weather Prediction (NWP) models, and advised the DTC not to lose sight of this unique function. As the NGGPS paradigm emerges, the SAB encouraged the DTC leadership to pay close attention to where future DTC funding might be anticipated and to make sure future SAB members have expertise in those funding areas.

The SAB recommended the DTC build an interactive research community to hear from active users with fresh ideas and provide a conduit between research now and possible operations in the future. It would be a good forum for interacting with active scientists in other areas of modeling and would be a promising mechanism for getting users involved in the DTC Visitor Program. They also encouraged further engagement with the global ensemble, convection-allowing ensemble, data assimilation, and verification communities.

To help maintain DTC-supported code and address DTC staffing challenges, the SAB suggested a graduate-student model. This could build and enhance the capabilities of upcoming researchers in areas relevant to the needs of operational models. It would also benefit the university community by supporting students with a gap in funding -- a win-win situation for both the DTC and the university.

The SAB recommended the DTC continue its work to put supported codes into Docker containers; Docker is an open-source project that automates the deployment of applications inside software containers for community use. This significantly reduces the challenges associated with setting up and running code on different platforms and with building the libraries the codes use. They also suggested clarifying the roles of model and code developers, the DTC, and the user community in accessing, supporting, and adding new innovations, to reduce possible confusion and redundancy.

Because all DTC task areas require robust verification tools to achieve their objectives, the SAB recommended strengthening and supporting the collaborations that already exist between the verification task and the other task areas. This may involve increasing the flexibility of the Model Evaluation Tools (MET) to support the needs of both DTC tasks and the broader community. The SAB also indicated that there need to be clear pathways for those who develop new verification tools or methods to have those tools incorporated into MET. They also encouraged the DTC to have more year-to-year continuity in testing and evaluation activities to increase productivity and yield more fruitful outcomes, and to continue to thoughtfully balance these activities on a task-by-task basis.

The external community Science Advisory Board acts as a sounding board to assist the DTC Director, and provides advice on emerging NWP technologies, strategic computer resource issues, selection of code for testing and evaluation, and selection of candidates for the visiting scientist program. Members are nominated by the Management Board, and the Executive Committee provides final approval of SAB nominations for a 3-year term. Current members of the DTC Science Advisory Board can be found at http://dtcenter.org under governance.

NGGPS Atmospheric Physics Workshop

November 2016

Summer 2017

On 8-9 November 2016, more than 80 scientists from a broad cross-section of the physics development community gathered for the Next Generation Global Prediction System (NGGPS) Atmospheric Physics Workshop at the NOAA Center for Weather and Climate Prediction. The workshop provided an opportunity for the NGGPS Physics Team to revisit and refine its near- and long-term priorities for advancing the National Centers for Environmental Prediction (NCEP) global physics suite and identify key areas that need attention. Current plans are to deliver the advanced physics suite by October 2018. 

Workshop participants proposed the following approaches to advance NCEP’s global physics suite:

  • Upgrade the radiation code to RRTMGP, a restructured and modern version of RRTMG (Rapid Radiative Transfer Model for Global Climate Models), to allow more interactions between advanced schemes, such as different radiative processes for separate distributions of cloud ice and snow, and to significantly increase the speed of this computationally costly component of the physics.
  • Determine an approach for selecting a microphysics scheme from a list of strong candidates that meets NGGPS priorities/goals and proceed with an open selection process.  This approach should include both testing metrics and an assessment of how the scheme would set up the physics suite for future advancement.  General consensus of the breakout group discussions was that the Thompson scheme is a strong candidate for NGGPS testing.
  • Convection and boundary-layer schemes would follow the idea of having an evolved suite and an advanced suite that can be compared with suitable metrics. It is recognized that decisions regarding these physics components need to be taken in the context of a suite with other candidate components.
  • The land-surface model will include updated data, and will evolve based on process studies for individual components. The final model will likely come from selected Noah-MP options.


The workshop discussions laid the groundwork for a collaborative framework that will allow the research and operational communities to efficiently and effectively accelerate the advancement of NCEP’s global physics suite.  At the center of this framework is the Interoperable Physics Driver (IPD)/Common Community Physics Package (CCPP) concept and a clearly defined and documented systematic process to select and advance innovations that define the composition of future operational physics suites.

While the workshop discussions made great strides towards defining the key aspects of this collaborative framework, many of the details of the decision-making process still remain to be defined and vetted.  As these final details are worked out, it will be imperative to make sure the process is open and transparent, and takes into account the needs of both the research and operational communities.

Sea Ice Modeling Workshop

Contributed by Ligia Bernardet
Spring 2016

A sea ice modeling workshop was convened on 3-4 February 2016 in Boulder, CO for the purpose of informing NOAA on the inclusion and selection of a community-contributed sea ice model into the future Next-Generation Global Prediction System (NGGPS). Another workshop goal was to identify potential research and development opportunities and gaps.

The workshop was hosted at the National Center for Atmospheric Research (NCAR) by the DTC’s Global Model Test Bed, and counted as sponsors the National Weather Service, the NOAA Office of Atmospheric Research Climate Program Office, and the Office of Naval Research. Sixty-five scientists attended the workshop, representing a broad spectrum of research and operational organizations, such as universities, NOAA, Navy, Department of Energy (DoE), the US National Ice Center, National Snow and Ice Data Center, NCAR, NASA, and DTC. International participation included representatives from the UK Meteorological Office, U. Reading, U. Toronto, and Environment Canada.

NOAA’s NGGPS will be a single fully coupled Earth modeling system with application to forecasts from days to seasons, spanning spatial scales from 1 to 25 km. While the sea ice model to be selected for inclusion in NGGPS needs to have good performance for all forecast applications over these time and space scales, this workshop focused on the short- and medium-term sea ice forecast needs. Workshop participants reviewed several state-of-the-art sea ice modeling efforts, along with various Earth modeling systems, such as the NCEP Keeping Ice’S Simplicity (KISS) model, the DoE Community Ice CodE (CICE) model, the NOAA Geophysical Fluid Dynamics Laboratory Sea Ice Simulator (SIS), the NOAA National Centers for Environmental Prediction Climate Forecast System (CFS), the Canadian Regional Ice-Ocean Prediction System (RIOPS), the ESRL Regional Arctic System Model (RASM) and the Navy Research Laboratory Arctic Cap Nowcast/Forecast System (ACNFS).

It was noted that the majority of advanced sea ice models have similar physical parameterizations, and that the differences among advanced models are smaller than the uncertainty due to initial conditions and external forcing. Given that the use of a community-contributed and community-supported model in NGGPS was raised as a priority for model selection, participants recommended the tentative adoption of the CICE model, pending follow-up testing and the resolution of concerns raised regarding CICE model governance and the potential for introducing numerical artifacts due to differences in staggering between the grids used in the ocean models and in CICE.

The framework for follow-up testing was discussed and important points regarding resolution, domain, coupling, verification metrics, and observations were raised. The importance of the ocean model for ice model performance was also discussed, and will need to be a factor in the test design. These issues will be considered by a tiger team formed to design and conduct the tests. A variety of synergistic project efforts in the scientific community were identified and recommendations for future model development were put forth. Test results will be used to prioritize future model development efforts.

For more information on the NGGPS sea ice modeling workshop and its presentations, please visit http://www.dtcenter.org/events/workshops16/seaice


Sea Ice Workshop group photo

New HWRF Developers Website: R2O for Hurricane Model Development

Contributed by Ligia Bernardet
Winter 2015

The mission of the DTC is to accelerate the rate of transition of new research and development to operational numerical weather prediction models. To that end, the DTC makes NCEP operational models, such as the Hurricane WRF, available to the general community through yearly releases of stable, well-tested, and well-documented codes, which are supported through a help desk.

While the DTC has hundreds of registered HWRF users, only a small subset of them actually contribute innovations, raising questions about the return on the DTC’s investment.



To address this concern, an additional type of support, targeted to this select group of active developers, has been launched by the DTC with support from the Hurricane Forecast Improvement Project (HFIP). Through the HWRF developers’ website (http://www.dtcenter.org/HurrWRF/developers), scientists external to EMC can request access to the HWRF code repositories, giving them access to retrieve and contribute to experimental codes. They can also obtain information about the HWRF code management process and the steps to get their code made available for consideration by EMC. Finally, scientists can get training on advanced HWRF aspects not made available to the general community, such as the HWRF build system and HWRF automation with the Rocoto Workflow Manager System.

This new DTC service, which goes well beyond what is provided to the general community through public releases, has been extensively used by many HWRF developers, and has been particularly helpful to the principal investigators funded by HFIP.

An HWRF Tutorial in Taiwan

Contributed by Ligia Bernardet, photos by Bill Kuo
Summer 2014
Tim Brown (DTC), Qingfu Liu (EMC), Yong Kwon (formerly of EMC), Ligia Bernardet (DTC), Vijay Tallapragada (EMC), and Sam Trahan (EMC), some of the HWRF instructors, May 2014, Taipei, Taiwan.

The Hurricane Weather Research and Forecasting model (HWRF) is a U.S. operational hurricane prediction model used by the National Hurricane Center for tropical cyclone track and intensity forecasts in its basins of responsibility: the North Atlantic and Eastern North Pacific. However, HWRF can be employed in any basin. In 2013, the HWRF real-time runs conducted by the NOAA Environmental Modeling Center (EMC) for the West Pacific basin were found to be very valuable by the Joint Typhoon Warning Center (JTWC). Because of the demonstrated skill of the HWRF model and its advanced capabilities, there has been strong interest in HWRF from the research community as well as from the international weather centers responsible for tropical cyclone forecasting. Currently, there are more than 1000 registered users of HWRF.

With the goal of encouraging participation of the international research and operational communities in the development and application of HWRF, the NOAA Hurricane Forecast Improvement Project (HFIP) sponsored an HWRF tutorial in Taipei, Taiwan, 22-23 May 2014. The HWRF Tutorial was held immediately following the Workshop on Numerical Prediction of Tropical Cyclones, 20-21 May 2014, which was attended by about 60 scientists from Taiwan, the U.S., China, Japan, S. Korea, India, Vietnam, Thailand, the Philippines, and Malaysia. Fred Toepfer, HFIP Program Director, gave a keynote speech at the workshop. The HWRF Tutorial was organized jointly by the DTC, EMC, HFIP, Taiwan’s Central Weather Bureau (CWB), and the Taiwan Typhoon and Flood Research Institute (TTFRI). Twenty-six students from Malaysia, the UK, Thailand, Vietnam, the USA, Singapore, and Taiwan participated. The tutorial instructors included Robert Gall of HFIP; Vijay Tallapragada, Young Kwon, Sam Trahan, Qingfu Liu, and Chanh Kieu of EMC; and Timothy Brown and Ligia Bernardet of the DTC.

The feedback from the students was overwhelmingly positive, in spite of the torrential rain of 14 inches in 24 hours that fell in Taipei during the event! We anticipate increased use of HWRF in the West Pacific typhoon community in the years to come, which will lead to valuable collaboration on the continued development of HWRF.


Students in the classroom

Vijay Tallapragada lecturing

Community Software Maintenance and Support

Contributed by Laurie Carson
Winter 2014

One function of the DTC has been to archive and maintain important model-related code, and to make it available to the operational and research segments of the meteorological community. As Laurie Carson describes it, the code maintenance and support function has important objectives in both the O2R and R2O arenas: for the former, providing operational software to the research community, and for the latter, facilitating transfer of research capabilities to operational software packages. The DTC's approach is based on a philosophy that community software is a resource shared with a broad community of (distributed) developers, specifically including the capabilities of operational systems. Two keys to its success are periodic public releases that include new capabilities and techniques, and effective user support.

The chart summarizes present and planned DTC software support activities in five principal areas: WRF model updates and support, data assimilation (GSI) code releases and support, the end-to-end operational hurricane forecast system (HWRF), verification package maintenance and support (MET), and planning for a future community package of the NOAA Environmental Modeling System (NEMS) that includes the NMMB model. Some community code now supported in this way has derived from DTC visitor projects; an example is the field alignment technique described in the 2012 visitor project of Sai Ravela (summary available at http://www.dtcenter.org/visitors/year_archive/2012/). For further description of the DTC community software efforts, see http://www.dtcenter.org/code/.

 



“The DTC software maintenance task has both O2R and R2O objectives.”

As the chart indicates, another community outreach-related DTC activity involves arranging and contributing to workshops and tutorials to facilitate use of these community model and analysis packages. A future issue of Transitions will summarize recent and upcoming events of this kind.

The 2014 GSI Community Tutorial

GSI review Committee Meeting

Autumn 2014

The DTC hosted the 5th Community Gridpoint Statistical Interpolation (GSI) Tutorial on July 14-16 of this summer at the NCAR Foothills Laboratory in Boulder, Colorado. One of several outreach events sponsored recently by the DTC, this tutorial was held in collaboration with other major GSI development teams from around the United States. With an ultimate goal of providing operational capabilities to the research community, this series of tutorials has become a primary training resource whereby both operational and research users can gain knowledge essential to running and further developing GSI.

The tutorial this year was a three-day venture that included both invited lectures and practical hands-on sessions relevant to GSI. Within the program were lectures designed to cover both fundamental (e.g., compilation, execution, and diagnostics) and advanced (pre-processing, radiance and radar data assimilation, hybrid techniques, and GSI infrastructure) topics.

Lecturers and practical session instructors were invited from major GSI development/support teams, including NCEP/EMC, NASA/GMAO, NOAA/ESRL, and NCAR/MMM, along with DTC members from NOAA/ESRL and NCAR/RAL. The principal guest speaker from the university community this year was Dr. Milija Zupanski from Colorado State University. Attended by 41 students from U.S. and international agencies and universities, the tutorial easily reached maximum capacity.

Tutorial presentations and lectures are posted at http://www.dtcenter.org/com-GSI/users/docs/index.php. For more information about the GSI system itself and its community support, please visit: http://www.dtcenter.org/com-GSI/users/index.php.



On July 17, after the Community Gridpoint Statistical Interpolation (GSI) Tutorial (summarized above), the GSI Review Committee also met at the NCAR Foothills Laboratory in Boulder. Established in 2010, this committee continues to coordinate GSI development from both the operational and research communities, and is also responsible for reviewing and approving GSI code updates. During a general review of ongoing GSI development efforts and future plans for GSI, the committee specifically discussed potential community support of the NOAA Ensemble Kalman Filter (EnKF) system, which is currently a part of the GSI-based hybrid ensemble-variational system of the NOAA Global Forecast System (GFS) and a potential candidate for other operational applications. The decision was then made to establish code management for this EnKF system that follows the existing GSI code management protocol. As a consequence, the GSI Review Committee effectively becomes a joint review committee for both GSI and EnKF, and new members (NOAA/ESRL, and the University of Maryland as a deputy member) were approved to represent the EnKF development effort. This new DA review committee thus includes members from NCEP/EMC, NOAA/ESRL, NASA/GMAO, NESDIS, AFWA, NCAR, the University of Maryland, and the DTC.


Object-based Verification at WPC

Contributed by Faye Barthild, NCEP WPC
Summer 2013

The Weather Prediction Center (WPC) at NCEP has been using MODE to supplement its traditional verification techniques since April 2010. The Method for Object-based Diagnostic Evaluation (MODE), a utility that is part of the MET verification package, has been developed with substantial support from the DTC. Both are systematically expanded and maintained for specific DTC tasks and an array of outside users. MODE output is available to WPC forecasters in real time through an internal website that displays graphical verification for forecasts of 24 hr precipitation valid at 1200 UTC (see the figure). Forecasters can select the forecast lead time (Day 1 – 36 hr, Day 2 – 60 hr, or Day 3 – 84 hr) and precipitation threshold (0.50 in, 1.0 in, and 2.0 in), then view the corresponding verification for WPC forecasts and 9 numerical models.


“Two things that seem to resonate with our forecasters the most are the real time aspect of the website and the visual nature of the comparison.”

The graphical nature of the MODE verification allows for a quick comparison of forecasts in a way that goes beyond traditional threat scores and bias values to consider other measures of forecast quality (distance between forecast and observed objects, differences in angle of orientation and object size, etc.). The most recent update to the website attempts to better quantify these qualities by adding statistical comparisons of the interest value and the displacement distance between matched objects to complement the traditional graphical comparisons. Future plans include additional statistical information on the website, including longer term summaries (monthly, annually, etc.), and making the website available to the public.
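For readers unfamiliar with how object attributes like these are produced, the sketch below mimics the basic MODE procedure on a 24-h precipitation field: smooth, threshold (here 0.5 in, one of the thresholds listed above), label contiguous objects, and compute centroids so forecast and observed objects can be compared by displacement distance. It is a conceptual illustration using generic image-processing tools on made-up data, not the MET/MODE code or its configuration.

```python
import numpy as np
from scipy import ndimage

def find_objects(precip_in, threshold=0.5, smooth_pts=5):
    """Identify precipitation objects in a 2-D field of 24-h totals (inches).

    Mimics the basic MODE steps: smooth, threshold, and label contiguous
    areas. Returns the label array and the object centroids.
    """
    smoothed = ndimage.uniform_filter(precip_in, size=smooth_pts)
    mask = smoothed >= threshold
    labels, nobj = ndimage.label(mask)
    centroids = ndimage.center_of_mass(mask, labels, range(1, nobj + 1))
    return labels, centroids

# Hypothetical forecast and observed 24-h precipitation grids (inches).
fcst = np.random.gamma(shape=0.3, scale=1.0, size=(200, 200))
obs = np.random.gamma(shape=0.3, scale=1.0, size=(200, 200))

_, fcst_centroids = find_objects(fcst)
_, obs_centroids = find_objects(obs)

# Displacement distance (in grid points) between one pair of objects.
if fcst_centroids and obs_centroids:
    d = np.hypot(*(np.array(fcst_centroids[0]) - np.array(obs_centroids[0])))
    print(f"centroid displacement: {d:.1f} grid points")
```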

Running MODE on a national scale at an operational center like WPC can present some unique challenges, since MODE must be able to correctly identify precipitation objects from meteorological phenomena ranging from cool-season synoptic-scale storms to warm-season convection. Determining the ideal configuration is still a work in progress, but it is an essential piece of the puzzle in building forecaster confidence in the utility of object-based verification.


DTC Science Advisory Board Meeting

Contributed by Mark Stoelinga (Chair of DTC SAB) and Bill Kuo (DTC Director)
Autumn 2013

Given its mission to facilitate the research to operations transition in numerical weather prediction, the DTC has a mandate to stay connected with both the research and operational NWP communities.

As a means to that end, the DTC Science Advisory Board (SAB) was established to provide (i) advice on strategic directions, (ii) recommendations for new code or new NWP innovations for testing and evaluation, and (iii) reviews of DTC Visitor Program proposals and recommendations for selection.

The third meeting of the SAB (the first with the new members announced in an earlier DTC Newsletter) was held recently (25-27 September 2013) in Boulder. To stimulate communication between the research and operational NWP communities, the DTC invited Geoff DiMego and Vijay Tallapragada to present future plans for mesoscale and hurricane modeling at the National Centers for Environmental Prediction’s (NCEP) Environmental Modeling Center (EMC), and Mike Horner to give an Air Force Weather Agency (AFWA) modeling update. These informative presentations stimulated considerable discussion.

John Murphy, Chair of the DTC Executive Committee (EC), presented NWS’s view on research to operations and the important role of the DTC. Col. John Egentowich, also a member of DTC EC, spoke positively of the contributions of the DTC to Air Force weather prediction modeling. During summary and discussion periods, SAB members provided many valuable recommendations for the DTC to consider during their planning for the new DTC Annual Operating Plan (AOP) for 2014. These recommendations will be detailed in a DTC SAB Meeting Summary. Of these, several were of particular interest.

The SAB recommended that the DTC and EMC develop a community model-testing infrastructure at NCEP/EMC. The goal for such an infrastructure would be to allow visiting scientists easy access to EMC operational models, enabling them to collaborate with EMC scientists to perform model experiments using alternative approaches. Such an infrastructure would hopefully facilitate accelerated R2O transition in NWP.

The SAB voiced their belief that operational centers will have significantly more computing resources in the near future, putting nationwide high-resolution mesoscale ensemble forecasting within reach. Given that possibility, they recommended that the DTC help facilitate the transition to cloud-permitting-scale ensemble forecasting with multiple physics suites. The current members of the DTC Science Advisory Board can be found at http://www.dtcenter.org under governance.