The climate graduate programs at George Mason University offer an Earth System Modeling course, divided into two components: theory and practicum. The theory sessions offer lectures introducing students to the physical and dynamical components of an Earth system model, their interactions, and how these components are used to predict the behavior of weather and climate. When I became the instructor of the Earth System Modeling course, I added a module to the practicum that gives students the technical skills to contribute to the development of an Earth system model. For this module, I opted for the single-column model (SCM) approach. Given the number of Earth system models developed in the U.S. alone, and our aim to familiarize students with more than one model, I selected the Earth system model and the SCM from two different modeling groups.
As a researcher, I have experience running various Earth system models, yet had never had the opportunity to work with an SCM. My decision to select this model, developed by the DTC as a simple host model for the Common Community Physics Package (CCPP), was influenced by my current work with the NOAA Unified Forecast System (UFS), which uses the CCPP for the majority of the physical parameterizations in its atmospheric component. I was further motivated by the detailed user and technical guide that accompanies the public release of the CCPP-SCM code.
I was cautiously optimistic about successfully porting the code to the Mason high-performance computing (HPC) clusters, which are not among the preconfigured platforms on which the CCPP-SCM code has been tested. If even one step in the instructions fails, it can have a domino effect on the subsequent steps. To my surprise, step after step completed successfully. The biggest challenge in porting the code was building the three libraries that are part of the UFS hpc-stack package. Thankfully, the developers of the UFS hpc-stack have done an excellent job of providing a system for building the software stack. Building the libraries took an entire day of suspense, yet their successful completion was well worth the wait.
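For readers who want to attempt a similar port, the sketch below shows how one might script the hpc-stack build on a local cluster. The helper script names and flags follow the NOAA-EMC hpc-stack documentation as I recall it, and every path and configuration name is a site-specific placeholder rather than a value from my actual setup.

```python
#!/usr/bin/env python3
"""Minimal sketch of driving the UFS hpc-stack build from Python.

Helper script names and flags follow the NOAA-EMC hpc-stack
documentation as recalled; all paths and config names are placeholders.
"""
import os
import subprocess
from pathlib import Path

HPC_STACK = Path.home() / "hpc-stack"  # clone of the NOAA-EMC/hpc-stack repository (assumed path)
PREFIX = f"/scratch/{os.environ['USER']}/hpc-stack-install"  # install prefix (placeholder)
CONFIG = "config/config_custom.sh"  # compiler/MPI settings for the local cluster (placeholder)
STACK = "stack/stack_custom.yaml"   # list of libraries to build (placeholder)

def run(script: str, *args: str) -> None:
    """Run one hpc-stack helper script, stopping at the first failure."""
    subprocess.run(["bash", script, *args], cwd=HPC_STACK, check=True)

# Generate modulefiles for the chosen compiler/MPI combination, then build
# the libraries themselves; on our cluster the second step ran for most of a day.
run("setup_modules.sh", "-p", PREFIX, "-c", CONFIG)
run("build_stack.sh", "-p", PREFIX, "-c", CONFIG, "-y", STACK, "-m")
```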
In addition to the relatively easy process of porting and compiling the code, other elements of the CCPP-SCM framework expand its appeal as a teaching tool. It offers a relatively large library of scientifically validated physical parameterizations (organized into physics suites) and provides a variety of pre-processed forcing data. Together, these allow students to design experiments that probe the behavior of physical parameterizations in different environments and to explore the limitations of the single-column approach.
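As an illustration, the sketch below sweeps two physics suites over two forcing cases. The run script name, its flags, and the case and suite names are taken from the CCPP-SCM release I worked with and should be treated as assumptions; the run script's help output in your own copy will list the exact options.

```python
#!/usr/bin/env python3
"""Minimal sketch of a suite-by-case sweep with the CCPP-SCM run script.

Case names, suite names, the script name, and the directory layout are
assumptions based on one CCPP-SCM release; verify them in your copy.
"""
import itertools
import subprocess

CASES = ["twpice", "arm_sgp_summer_1997"]  # maritime vs. continental forcing (names assumed)
SUITES = ["SCM_GFS_v16", "SCM_RAP"]        # two of the validated physics suites (names assumed)

for case, suite in itertools.product(CASES, SUITES):
    # Each run writes its output to its own directory under scm/run/ (assumed layout).
    subprocess.run(
        ["./run_scm.py", "-c", case, "-s", suite],
        cwd="scm/bin",  # the run script lives here after the build (assumed)
        check=True,
    )
```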
Following the developer instructions, which I adapted to work with Mason’s HPC cluster, students quickly installed their own copies of the CCPP-SCM and were ready to work on the practical application. The goal of the assignment was to understand the similarities and differences in the behavior of a cloud parameterization scheme when tested under land and ocean environmental conditions. The variety of observations included with the package allows students to focus on the science without spending time finding the data sets required to drive the SCM.
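A minimal analysis along the lines of the assignment might look like the sketch below, which compares time-mean cloud water profiles from a land run and an ocean run. The output paths and the variable names (`qc`, `pres`) are assumptions from the release I used; students should inspect their own output files (for example with `ncdump -h`) to confirm names before adapting it.

```python
#!/usr/bin/env python3
"""Minimal sketch: compare cloud water profiles from two SCM runs.

Output paths and variable names are assumptions based on one
CCPP-SCM release; check your own output file before adapting this.
"""
import matplotlib.pyplot as plt
import xarray as xr

RUNS = {
    "ocean (twpice)": "scm/run/output_twpice_SCM_GFS_v16/output.nc",
    "land (arm_sgp_summer_1997)": "scm/run/output_arm_sgp_summer_1997_SCM_GFS_v16/output.nc",
}

fig, ax = plt.subplots()
for label, path in RUNS.items():
    ds = xr.open_dataset(path)
    qc = ds["qc"].mean(dim="time")            # time-mean cloud liquid water (variable name assumed)
    pres = ds["pres"].mean(dim="time") / 100  # layer pressure, Pa -> hPa (variable name assumed)
    ax.plot(qc, pres, label=label)

ax.invert_yaxis()  # put the surface at the bottom of the plot
ax.set_xlabel("cloud liquid water mixing ratio (kg/kg)")
ax.set_ylabel("pressure (hPa)")
ax.legend()
plt.show()
```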
The outcome of the assignment exceeded my expectations: students set up their own numerical experiments without any help from me. This was a rewarding experience for me as the instructor and for the students, who gained confidence that they could master a model that allows them to zoom in on the complexity of an Earth system model. Next, students will learn how to run a full Earth system model; the NCAR CESM will be used for that purpose.