Accurate assessment of flood risk is critical to protecting lives and property worldwide. The design and safe operation of dams, levees, culverts, bridges, storm drainage infrastructure, and many nuclear facilities are informed by estimates of an “upper bound” of possible precipitation. In particular, dams and nuclear facilities in populated areas are often referred to as “critical” or “high hazard” due to the risk to life and property a failure presents. These structures are therefore often designed to withstand the most extreme storm or flood considered possible at that location.
In engineering practice, this concept is called Probable Maximum Precipitation (PMP), and it is defined as the “theoretical maximum precipitation for a given duration under modern meteorological conditions” (WMO 2009). In the United States, PMP is generally estimated using a deterministic “moisture maximization method” (also referred to as the storm-based approach), which combines observations of historical extreme precipitation events in regions relevant to the location of interest with storm maximization assumptions.
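The core of the moisture maximization method can be sketched as a simple scaling: observed storm rainfall is multiplied by the ratio of climatological maximum precipitable water to the precipitable water present during the storm. The function below is a minimal illustration of that ratio, not the full storm-based procedure; the values and the ratio cap are hypothetical.

```python
# Illustrative sketch of the moisture maximization ratio used in
# storm-based PMP estimation. Observed rainfall is scaled by the ratio
# of climatological maximum precipitable water (PW) to the PW observed
# during the storm. All values here are hypothetical.

def moisture_maximized_rainfall(observed_mm, pw_storm_mm, pw_max_mm,
                                ratio_cap=1.5):
    """Scale observed rainfall by the moisture maximization ratio.

    A cap on the ratio (a value near 1.5 is sometimes applied in
    practice) limits unrealistic extrapolation when storm moisture
    was far below the climatological maximum.
    """
    ratio = min(pw_max_mm / pw_storm_mm, ratio_cap)
    return observed_mm * ratio

# Hypothetical storm: 300 mm observed, storm PW 40 mm, max PW 55 mm
print(moisture_maximized_rainfall(300.0, 40.0, 55.0))  # 412.5
```

In the full method, this maximized value would then be transposed and enveloped across storms relevant to the location of interest.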
Recent advancements in high-resolution weather modeling, particularly the ability to simulate convection explicitly, offer new possibilities for modernizing PMP estimation. Dynamical weather models produce spatially and temporally continuous precipitation estimates, often at considerably higher resolution than observations or historical reanalysis datasets. Because these data are produced by solving physical equations of the atmosphere (in contrast to interpolation methods historically employed to make up for limited observations), dynamical model representation of storm physics and evolution also reduces reliance on the spatial, temporal, and physical assumptions that currently underpin PMP estimation.
This article describes a method for generating and applying ensembles of high-resolution, state-of-the-art numerical model simulations of historical extreme precipitation events to meet contemporary stakeholder needs. The method was designed as part of a research-to-application partnership project to update state dam safety rules in Colorado and New Mexico. The results delivered multiple stakeholder and user benefits that were applied directly to storm analyses used for extreme rainfall estimation, and the diagnostics developed were ultimately used to update Colorado state dam safety rules, officially adopted in January 2020.
Advances in Reanalysis Datasets for Historical Event Modeling
Upper-air data are particularly important to initial and lateral boundary conditions in a numerical weather simulation, but are exceedingly rare prior to radiosonde launches becoming routine in the 1940s (Durre et al. 2006). Thus, many historical reanalysis products begin during or after the 1940s, when rawinsonde data coverage became more established. This presents a particular problem for investigating extreme events that occurred prior to the 1940s.
Advances in data assimilation and the innovative use of historic surface observations have allowed reconstruction of three-dimensional atmospheric states in products such as the National Oceanic and Atmospheric Administration (NOAA)/Cooperative Institute for Research in Environmental Sciences (CIRES) Twentieth Century Reanalysis (20CR) project as far back as the mid-nineteenth century (Compo et al. 2011). Reanalyses are of great value in their own right, providing the ability to examine long-past events, but their potential to serve as initial and boundary conditions also makes possible high-resolution mesoscale model simulations of historic storms.
The 20CR version 2c (20CRv2c) was used for most of the simulations conducted in this study, as it was the most recent version of the 20CR at the time of the Colorado-New Mexico Regional Extreme Precipitation Study (CO-NM REPS) project (2016-2018). The 20CRv2c is a 56-member reanalysis product utilizing the NCEP Global Forecast System modeling framework in combination with ensemble Kalman filter data assimilation techniques to incorporate surface observations from 1851 to 2014.
Downscaling Historical Extreme Precipitation Events
Extreme precipitation events were selected based on the needs of the CO-NM REPS study, aimed at updating and improving PMP estimates for Colorado and New Mexico. The storms chosen are listed in Table 1, and were chosen based on their historical importance in defining PMP depths for the region.
All historical event model simulations employ the Advanced Research Weather Research and Forecasting (WRF-ARW) modeling system, version 3.7.1 (Skamarock et al. 2008). Convection-permitting models are necessary to simulate heavy precipitation with acceptable fidelity, especially at subdaily scales, as sufficiently high model resolution (generally ≤4 km) permits explicit simulation of deep convection, which is often critical to generating the types of extreme rainfall that define PMP-type events.
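In WRF-ARW, the convection-permitting configuration is expressed in the model namelist. The fragment below is an illustrative sketch only (the actual CO-NM REPS settings are not reproduced here), showing ~4-km grid spacing with the cumulus scheme disabled so that deep convection is simulated explicitly:

```
&domains
 dx = 4000,
 dy = 4000,
/
&physics
 mp_physics = 8,
 cu_physics = 0,
/
```

Here `cu_physics = 0` disables the cumulus parameterization on the convection-permitting domain, and `mp_physics = 8` selects the Thompson microphysics scheme as one common (purely illustrative) choice.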
The original CO-NM REPS plan scoped four high-resolution simulations (initialized using four different members of the 20CR ensemble) for each historical event. However, initial WRF ensemble results sometimes raised more questions than answers; for example, impractically large downscaled ensemble spread, or WRF simulations so starkly different from the existing historical analysis as to be deemed unusable by practitioners. In these situations, additional simulations beyond the standard initial four were performed.
Table 1: The seven selected historical extreme precipitation events used in the CO-NM REPS study.
| Year | Location | Max precipitation | Duration |
|---|---|---|---|
| 1909 | Rattlesnake, ID | 409 mm | 18-24 Nov (~7 days) |
| 1923 | Savageton, WY | 432 mm | 48 h |
| 1941 | Houghton, NM | 351 mm | 12 h |
| 1965 | Red River, NM | 367 mm | 24 h |
| 1972 | Black Hills, SD | 353 mm | 6 h |
| 1976 | Rapid City, SD | 304 mm | 6 h |
| 1978 | Isleta, NM | 330 mm | 6 h |
The 1909 Rattlesnake, Idaho, record rainfall event was the result of a week-long series of inland-penetrating atmospheric rivers, which produced a reported 16.12 in. (409 mm) of precipitation over the period 18-24 November 1909. For this case, a small ensemble of four WRF simulations produced notable internal consistency and agreed closely with available historical observations (CODNR and NMOSE 2018).
The WRF Model precipitation output fields were incorporated into PMP calculation methods first as an improved precipitation “base map” (from which PMP estimation begins), offering a starting point for the spatial distribution of precipitation values between observational data points. WRF simulation output was next used to inform a more robust method to delineate regions of rain versus snow (relative to coarser, temperature-based approximations). Updating the preexisting storm analysis (generated by interpolating sparse observations) with high-resolution, rainfall-only WRF information made significant differences (upward of ~75 mm, or ~3 in.) distributed across the domain.
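The contrast between a coarse temperature-threshold rain/snow split and a model-informed partition can be sketched as follows. All grids and the liquid-fraction formula here are hypothetical stand-ins; a real workflow would read WRF output fields such as 2-m temperature and accumulated precipitation.

```python
import numpy as np

# Hypothetical grids (deg C and mm) standing in for WRF output fields.
t2m = np.array([[ 3.0,  1.5, -0.5],
                [ 0.8, -1.2, -3.0]])
precip = np.array([[20.0, 15.0, 12.0],
                   [18.0, 10.0,  8.0]])

# Simple temperature-threshold partition (the coarse approximation):
# everything warmer than 0 C counted as rain, all-or-nothing per point.
rain_threshold = np.where(t2m > 0.0, precip, 0.0)

# A model-informed partition can instead assign a liquid fraction per
# grid point (the linear ramp below is purely illustrative).
liquid_fraction = np.clip((t2m + 1.0) / 2.0, 0.0, 1.0)
rain_model = precip * liquid_fraction

print(rain_threshold.sum(), rain_model.sum())
```

The per-point fractional treatment is what allows a rainfall-only analysis to differ materially from a hard temperature cutoff, particularly near the rain/snow transition zone.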
In contrast, the 1923 Savageton, Wyoming, rainfall event posed more of a challenge. This storm has historically controlled PMP depths for many regional studies but contains tremendous uncertainty related to the storm center rainfall amount and spatial accumulation patterns. Despite testing more than 15 WRF configurations and initializations, the largest precipitation generated was an ensemble-maximum single-gridpoint value of ~50 mm (~2 in.), versus the historical observation of 17.1 in. (~434 mm).
The meteorological description for this case provided by historical reports classifies the event as a midlatitude synoptic cyclone with moisture sourced from the Gulf of Mexico. To respect time constraints and not impede the larger CO-NM REPS process, practitioners chose to move on from this case, concluding that the WRF reanalysis did not demonstrate sufficient skill in replicating the historical observation.
Lessons Learned and Opportunities for Improvement
The ensemble downscaling framework yields critical uncertainty information, but also new questions regarding optimal use of the multimember output. The incorporation of individual model member fields versus ensemble diagnostics (e.g., mean, max, spread) was explored via ongoing collaborative discussion with CO-NM REPS practitioners. Individual simulations retain the model-derived benefit of internal physical consistency, while ensemble diagnostics provide useful analytic insight. In this study, it was ultimately uncommon for data from a single, individual model simulation to be deemed robust, reliable, or as useful (relative to the entire ensemble) in isolation. Instead, ensemble diagnostics and intra-ensemble, member-to-member comparisons were key to gaining acceptance and building confidence among PMP practitioners.
For the unique challenge of PMP, the ensemble maximum (“ensemble max”) product in particular seems to be an appropriate diagnostic selection. The ensemble max grid retains the maximum event-total precipitation produced at each grid point and thus demonstrates how intense the event was simulated to be, grid point by grid point, across all event ensemble members. In the CO-NM REPS project, individual member model output for each event, along with an ensemble max precipitation grid, was provided to be considered for possible input into the PMP analysis. The ensemble max grid was ultimately selected as the product from which to reevaluate or modify existing precipitation base maps.
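Computing the ensemble max grid (alongside the mean and spread diagnostics mentioned above) reduces to a pointwise reduction across the member dimension. The sketch below uses small hypothetical arrays in place of real WRF event-total precipitation grids.

```python
import numpy as np

# Stack of event-total precipitation grids (mm), one per ensemble
# member; values are hypothetical stand-ins for WRF output.
members = np.array([
    [[120.0,  80.0], [60.0, 40.0]],   # member 1
    [[ 90.0, 150.0], [70.0, 30.0]],   # member 2
    [[100.0, 110.0], [95.0, 55.0]],   # member 3
])

# Ensemble diagnostics: the ensemble max grid keeps, at each grid
# point, the largest event total produced by any member; mean and
# spread (standard deviation) summarize central tendency and
# member-to-member variability.
ens_max = members.max(axis=0)
ens_mean = members.mean(axis=0)
ens_spread = members.std(axis=0)

print(ens_max)   # [[120. 150.]
                 #  [ 95.  55.]]
```

Note that the ensemble max grid is not a physically consistent single realization; it deliberately mixes members to bound how intense the event was simulated to be at each point, which is precisely why it suits the upper-bound framing of PMP.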
Going forward, dynamical model approaches to simulating historical storms for applications such as PMP should establish, a priori, a more structured and exhaustive experimental design, which includes clear standards for the governance of possible application of model results. The 1909 Rattlesnake, Idaho, WRF simulations demonstrated that reconstruction of major historical events via numerical modeling may beneficially supplement existing storm analyses and also improve spatial, temporal, and physical assumptions (e.g., precipitation type) made with very limited observational data. This event (in combination with and compared with others) highlights the role of topography in producing more constrained simulations that may be deemed more valuable to practitioners.
Conversely, for cases where model simulations did not yield the expected, historical observation-indicated precipitation, model data might instead be considered as a tool in flagging potentially erroneous, or at least unacceptably uncertain, observational data. Adding long-term value toward achieving an objective, NWP-generated upper bound of precipitation will require additional work, but the utility of high-resolution model data has been demonstrated for dam safety and flood risk management applications in case-specific efforts through exploratory prototypes.
Conclusion: Toward Improved Flood Control and Dam Safety
The results of this study corroborate prior historical weather event “reconstruction” work, advocating for a complementary approach in which traditional and numerical methods are combined. We further posit that the introduction of gridded, small-time-step numerical model data may well alter our foundational understanding of, and perspectives on, historical extreme precipitation events.
Though case study-focused herein, the approach further informs future applications requiring high-resolution, spatially consistent, and/or long-term gridded data. Decision-maker acceptance of the historical downscaling approach demonstrates an increasing appetite for including dynamical modeling more broadly, for example, with respect to addressing nonstationarity in PMP estimation. Complementing the larger, forward-looking movement to improve extreme event risk assessment for hydro-engineering applications, opportunities such as historical event downscaling can offer improvements to current estimates, and in turn better serve society now.