
Director's Corner: Russ S. Schumacher

Autumn 2017

Faculty and graduate students at universities typically conduct basic research to better understand the fundamental workings of their area of interest, which in our field is the atmosphere. Transitioning these findings into practical applications, including operational weather forecasting, is then done by national labs and their cooperative institutes. Yet in many cases, university researchers are working on problems that are directly relevant to operations and have the potential (with a little help) to be considered for transition into the operational environment. How to cross the many hurdles associated with this transition, however, is not well understood in the academic lab setting, where a project may be developed by a faculty member and one or two graduate students.

Participants in the Flash Flood and Intense Rainfall experiment during a forecast discussion at the Weather Prediction Center and Hydrometeorology Testbed, summer 2017.

I've collaborated closely with forecasters and forecast centers in the past, mainly on what might be called "operationally relevant" research -- work that can inform the forecast process but isn't immediately applicable.  This summer, I had my first real experience as a faculty member in formally testing a product that could be considered for operational transition. With support from NOAA's Joint Technology Transfer Initiative, we tested my Ph.D. student’s heavy rainfall forecasts at the Flash Flood and Intense Rainfall (FFaIR) experiment at the Weather Prediction Center (WPC) and Hydrometeorology Testbed during June and July of 2017.

Preparing for the experiment raised several scientific and technical issues that I was not accustomed to thinking about in an academic setting. Some were fairly mundane, like "how do we generate files in the proper format for the operational computers to read them?" But others were more conceptual and philosophical: "How should a forecaster use this product in their forecast process? What should the relationship be between the forecast probabilities and observed rainfall/flooding? How can we quantify flooding rainfall in a consistent way to use in evaluating the forecasts?"

So why do I bring all of these experiences up in the “DTC Transitions” newsletter? Because one of the DTC’s key roles is to facilitate these types of research-to-operations activities for the broader community, including universities as well as research labs. One particular contribution the DTC makes to this effort is the Model Evaluation Tools (MET), a robust, standardized set of codes for evaluating numerical model forecasts in a variety of ways. For new forecast systems or tools to be accepted into operational use, they should demonstrate superior performance over existing systems, and the only way to establish this is through thorough evaluation of their forecasts. Careful evaluation can also point to areas for additional research that can lead to further model improvements. The DTC also sponsors a visitor program that supports university faculty and graduate students in working toward operational implementation of their research findings.
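To make the idea of forecast evaluation concrete, one common way to compare a new probabilistic product (such as heavy-rainfall exceedance probabilities) against a reference is the Brier score and its associated skill score; MET provides this kind of verification, and much more, in a standardized framework. The short Python sketch below is purely illustrative and is not MET code; the probabilities, observations, and function names are hypothetical.

```python
import numpy as np

def brier_score(forecast_prob, observed):
    """Mean squared error of probabilistic forecasts against binary outcomes.

    forecast_prob: forecast probabilities (0-1), e.g. probability that 24-h
                   rainfall exceeds a flash-flood threshold at each point.
    observed:      0/1 outcomes (1 where the event was observed).
    """
    forecast_prob = np.asarray(forecast_prob, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return np.mean((forecast_prob - observed) ** 2)

def brier_skill_score(forecast_prob, observed):
    """Skill relative to a climatological (base-rate) reference forecast;
    positive values indicate improvement over the reference."""
    observed = np.asarray(observed, dtype=float)
    bs = brier_score(forecast_prob, observed)
    reference = np.full_like(observed, observed.mean())
    bs_ref = brier_score(reference, observed)
    return 1.0 - bs / bs_ref

# Hypothetical example: probabilities from a new forecast system at ten points
new_probs = [0.9, 0.7, 0.2, 0.1, 0.8, 0.3, 0.05, 0.6, 0.4, 0.15]
events    = [1,   1,   0,   0,   1,   0,   0,    1,   0,   0]
print(f"Brier score: {brier_score(new_probs, events):.3f}")
print(f"Brier skill score vs. climatology: {brier_skill_score(new_probs, events):.3f}")
```

In practice, of course, an operational evaluation involves far more than a single score, which is exactly why a standardized, well-tested package like MET is so valuable.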

Conducting research-to-operations activities in an academic setting will certainly fall outside the comfort zone of many university researchers. Furthermore, we should be sure not to lose our focus on basic research, which is often best suited to academia. But the fruits of that basic research are also often ready to take the next step to direct application and broader use, and I encourage fellow academics to try taking that step, especially with the support and tools offered by the DTC.