Hurricane Katrina was a pivotal event that dramatically reshaped policies and practice with respect to natural hazards. The massive loss of life and economic damage focused attention on what went wrong and why, which in turn produced dramatic shifts in both engineering practice and the policies that guide it. Katrina also accelerated national flood safety assessment and future strategy in both the United States and the Netherlands. It is instructive to compare natural hazard policies before Katrina with those that arose after it. How well have these changes prepared us for the future? It is becoming clear that the past is no longer prologue. Now, two decades later, our attention is turning to growing uncertainty, including non-stationarity in both natural and social processes. Are policy and practice evolving fast enough? How well are we postured to deal with the spectrum of potential futures?

  • Current water infrastructure is planned using probabilistic approaches.

  • Katrina sparked the beginning of ‘flood risk reduction.’

  • Katrina influenced others to re-evaluate flood risk. Of particular significance is the Netherlands.

  • Scenarios representing a range of future conditions provide a means to understand sensitivity to change.

  • Adaptation provides a means to implement solutions as conditions change.

Hurricane Katrina was a ‘bellwether’ event, exposing the nation's state of preparation and readiness for major natural hazards. It was not the first alarm; earlier events had produced large losses. But perhaps none carried the weight of the massive destruction of a city whose vulnerability had been known for decades. The protective measures in place were too little and uncoordinated, and New Orleans became the poster child for a national malaise in dealing with nature and for dangerous land-use policies.

A lesson learned from Katrina, and from a recurring series of ever more extreme disasters such as Superstorm Sandy, Hurricane Harvey, flooding on the mainstem Mississippi, and a national epidemic of urban pluvial flooding, is that our current approach to planning and preparing for disasters is being overtaken by natural and societal change. A fundamentally new approach to planning needs to be envisaged, and we need to ask: are we ready for the future?

The stimulus for improved hurricane protection in Southeast Louisiana was Hurricane Betsy in 1965, the first billion-dollar disaster in the United States. The effort to better protect New Orleans was authorized by Congress post-Betsy and slowly evolved through two decades of analysis and debate. The debate was complicated by the then-new policy of cost-sharing between the federal government and local sponsors.

Criteria for planning and design were rooted in deterministic approaches: the hazard was characterized using the ‘Standard Project Hurricane’ (SPH), a historically based event considered to be the most severe storm that could reasonably be expected. Pre-Katrina modeling anticipated surge levels using multiple storm tracks, and the maximum surge at any location among the calculated model runs was used as the basis for design. Structures were sized using then-current criteria modified with a ‘factor of safety’ to handle uncertainty. Unfortunately, the criteria did not address how the system would perform once overtopped, which was the primary cause of failure during Katrina for all but four of the major breaches. Nor did they consider the primary failure mode responsible for those four floodwall failures. In fact, during Katrina the existing structures, apart from the four noted above, performed as designed until they were overtopped, and then failed. The overtopping vulnerability was exacerbated by the fact that many sections of the hurricane protection system (HPS) were below their authorized elevations (in some cases by 2–3 feet) as a result of an out-of-date geodetic datum and a misinterpretation of that datum versus local mean sea level. Post-Katrina analysis of the existing system characterized the mean time between failures of the HPS structures as roughly 40 years, based on a hazard characterization using the joint probability method (JPM).

Funding the HPS was complicated by the Reagan Administration's policy of ‘cost-sharing,’ under which one-third of the cost was to be provided by New Orleans and the State of Louisiana. This had a significant influence on the approaches ultimately chosen for the HPS. Even when funding was appropriated, it came sporadically, resulting in a slow pace of construction. When Katrina struck, 40 years after the initial authorization, the protection system was only 60–70% complete, and the information and criteria on which it was based were significantly out of date; completion of the initial HPS structures was not scheduled until 2015. While considerable information was available demonstrating New Orleans' vulnerability, little action was taken to accelerate or update the protection measures. A USACE report on the hurricane protection decision chronology provides an instructive insight into the maneuvering that produced these gaps (Woolley & Shabman 2008).

Katrina spurred the formation of the Interagency Performance Evaluation Task Force (IPET) (USACE 2009) to examine what went wrong and why, and to form the scientific basis for more effective flood mitigation in the future. While IPET did the forensics, the US Army Corps of Engineers (USACE) established Task Force Guardian to reconstitute protection for the city by creating temporary structures where the major breaches had occurred, and Task Force Hope to plan and construct a future risk reduction system. The efforts of IPET and Guardian were challenged by the need to re-establish some protection around New Orleans before the following 2006 hurricane season. This resulted in the distribution of an initial draft of the IPET report in June 2006, the key results of which had already been implemented in the temporary protection created by Task Force Guardian.

The results of the IPET analyses were documented in a 9-volume report, ‘Performance Evaluation of the New Orleans and Southeast Louisiana Hurricane Protection System,’ June 2009 (https://biotech.law.lsu.edu/katrina/IPET). These were conducted under the scrutiny of an external review by the American Society of Civil Engineers (ASCE 2007) and by a committee of the National Research Council (NRC 2009).

Katrina sparked a major change in philosophy: the demise of ‘Flood Protection’ and the beginning of ‘Flood Risk Reduction.’ Probabilistic hazard analyses replaced the deterministic definitions of the SPH and the Standard Project Flood in USACE doctrine. The new hazard analysis applied JPM concepts using hundreds of hypothetical storms along a wide variety of potential paths. The hypothetical storms drove integrated surge and wave models to generate a comprehensive hurricane hazard, a spatio-temporal portrait of surge and wave elevations across Southeast Louisiana. This provided a customized hazard definition for each reach around New Orleans, together with a quantification of the uncertainty in the hazard estimates (Figure 1).
Fig. 1. New Orleans hurricane surge (100-year still water elevations) map following Katrina. Courtesy of USACE, in the public domain (USACE 2007).
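As a minimal, hedged sketch of the JPM idea described above (the storm rates and surge values below are illustrative, not IPET data): each synthetic storm carries an annual recurrence rate, and the annual exceedance probability of a given surge level at a location follows from summing the rates of the storms whose modeled surge exceeds that level.

```python
import numpy as np

# Illustrative only: a handful of synthetic storms, each with an annual
# recurrence rate (storms of this type per year) and a modeled peak surge (m)
# at one location. Real JPM applications use hundreds of storms and full
# surge/wave model output for every reach.
storm_rate = np.array([0.020, 0.012, 0.008, 0.004, 0.002, 0.001])  # per year
peak_surge = np.array([2.1, 2.8, 3.4, 4.0, 4.7, 5.5])              # metres

def annual_exceedance_probability(level_m: float) -> float:
    """AEP of surge exceeding level_m, treating storm arrivals as Poisson."""
    rate = storm_rate[peak_surge > level_m].sum()  # combined rate of exceeding storms
    return 1.0 - np.exp(-rate)                     # annual rate -> annual probability

for level in (2.5, 3.5, 4.5):
    print(f"surge > {level:.1f} m: AEP ~ {annual_exceedance_probability(level):.4f}")
```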

By integrating the hazard data with the geometry and structural characteristics of each reach, the potential for breaching and flooding was assessed for the entire system. Modeling structural performance using fragility curves (the probabilities of breach conditioned on wave and surge levels) allowed for robust failure analysis and a credible means to determine the volumes of floodwater. It also permitted probabilistic design and the application of Monte Carlo simulation to examine overtopping potential reach by reach. Collectively, the probabilistic approach facilitated a more rigorous evaluation of the HPS structures and a more reasoned design of a risk reduction system for the future. A new feature of this approach was the ability to apply location-specific uncertainty in water elevation and wave estimates as a substitute for deterministic freeboard and factors of safety.
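In the same spirit, a hedged illustration of how a fragility curve can be combined with Monte Carlo sampling of annual-maximum surge to estimate a reach's annual breach probability; the Gumbel surge model, crest elevation, and lognormal-CDF fragility below are assumptions for illustration, not the IPET models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def sample_annual_peak_surge(n: int) -> np.ndarray:
    """Hypothetical annual-maximum surge (m) at one reach (Gumbel, illustrative)."""
    return stats.gumbel_r.rvs(loc=1.8, scale=0.6, size=n, random_state=rng)

def breach_fragility(surge_m: np.ndarray, crest_m: float = 4.0) -> np.ndarray:
    """P(breach | surge level): small below the crest, rising steeply once the
    surge overtops it. A lognormal CDF in surge is one common fragility form."""
    s = np.maximum(surge_m, 1e-6)                  # guard against non-physical values
    return stats.norm.cdf((np.log(s) - np.log(crest_m)) / 0.08)

n = 100_000
surge = sample_annual_peak_surge(n)                 # Monte Carlo hazard sample
breached = rng.random(n) < breach_fragility(surge)  # Monte Carlo breach outcomes

print(f"Annual breach probability      ~ {breached.mean():.4f}")
print(f"Annual overtopping probability ~ {(surge > 4.0).mean():.4f}")
```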

Flooding volumes were routed within each drainage basin to enable estimates of flood elevations and subsequent losses. The capacity and status of interior drainage systems and associated pumping stations were included in the simulations. Distributed databases for structures and populations at the census block level were used in concert with these data to estimate losses, economic damages, and potential loss of life using demographic models. The flooding and loss information was calibrated to the losses observed in Katrina.

The capstone of the IPET efforts was the first risk and reliability analysis of a large spatially distributed flood protection system. This effort resulted in a means to ascertain the source of the vulnerability of the old HPS and a stark picture of what was required of the new risk reduction system. The risk data and associated maps became a means to communicate vulnerability to the public, local officials, and ultimately Congress. The design process was able to include location-specific estimates of other important factors such as sea level rise and subsidence.

USACE was able to expand on the IPET hazard and structural analyses to create a comprehensive design for the new flood risk reduction system. Congress appropriated full funding for the project and construction began in 2007. By 2011, the new system was complete, with upgrades to the drainage and pumping systems, major barriers across the Inner Harbor Navigation Canal at Seabrook and at Lake Borgne, gated closures at the three interior drainage canals leading to Lake Pontchartrain, and the West Bank Closure Complex at the Algiers and Harvey Canals. New Orleans finally had a risk-informed system of systems.

Katrina influenced others to re-evaluate flood risk. Of particular significance was the Netherlands, a country with 40% of its land below sea level. The Netherlands' elaborate network of drainage canals and dikes has allowed it to reclaim and sustain large land areas for habitation and agriculture. Its safety initiatives have included major engineering structures that reduce the vulnerability of the coast by dramatically shortening the length of shoreline exposed to North Sea storms. It has also subscribed to rigorous criteria for coastal and riverine dikes, prescribing structure elevations that protect against 1:2,500-year (p = 0.04% annual chance) water elevations and, in many cases, 1:10,000-year (p = 0.01%) levels, depending on the associated land uses. At the time, the Netherlands was engaged in risk assessments and considering even more stringent safety standards, such as protection against 1:100,000-year (p = 0.001%) hazards. Katrina accelerated that effort, and the Netherlands' national risk assessment of 2006 demonstrated that considerable portions of its existing safety structures were below the new standards. The response was the initiation of the Delta Program, which included formulation of the Delta Model.
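The standards quoted above follow from the elementary relation between a return period \(T\) (in years) and the annual exceedance probability \(p\); the percentages in parentheses are simply this conversion:

\[ p = \frac{1}{T}: \qquad T = 2{,}500 \Rightarrow p = 0.04\%, \quad T = 10{,}000 \Rightarrow p = 0.01\%, \quad T = 100{,}000 \Rightarrow p = 0.001\%. \]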

The Netherlands Delta Program is an initiative to address both current needs and long-term climate changes in the Netherlands. The technical foundation for the program, the Delta Model, involved the creation of a fully integrated set of models that address both the hydrological and social regimes. The model system integrates many of the Netherlands' existing model packages to facilitate the use of common databases for consistency and efficiency.

A unique aspect of the assessment was the use of scenarios, each representing potential future water and land use conditions (Figure 2). None of the scenarios was intended to represent the most likely future; instead, they provided a means to understand the capacity of the water infrastructure to manage a range of hypothetical water conditions and associated land use distributions. Because the scenarios included land use and socio-economic factors, the modeling results are relevant to losses and effects associated with land uses and practices. The result was a sensitivity analysis across a range of possible future conditions and a foundation for incremental adaptation of the current system.
Fig. 2. The Netherlands flood protection map; numbers refer to polder identification. Source: NL Flood Defense Act of 1996 (Tsimopoulou et al. 2015).

The paradigm shift is significant. Rather than specifying a solution for 50 or 100 years into the future, the approach adopted a strategy of making moderate changes when physical and societal changes demand a new level of capability. Changes to infrastructure and flood risk mitigation occur as conditions dictate, not according to a long-term forecast of when those changes will occur. Traditional project life constraints are abandoned, as is designing to a single probabilistic threshold. While long-term possible conditions are certainly considered, specific mitigation steps are configured as a series of related measures that can be added if and when needed.
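A minimal sketch of how scenario sensitivity and incremental adaptation can be framed, under purely illustrative assumptions about freeboard and relative water-level rise: each scenario is a hypothetical trajectory, deliberately not ranked by likelihood, and the question asked of each is when, if it unfolded, the next adaptation step would be triggered.

```python
# Illustrative scenario-and-adaptation sketch. Each scenario is a hypothetical
# mean rate of relative water-level rise (sea level rise plus subsidence),
# deliberately not ranked by likelihood. The question asked of each is:
# "if this unfolded, when would the next adaptation step be needed?"
CURRENT_FREEBOARD_M = 0.9   # assumed margin above today's design water level
TRIGGER_MARGIN_M = 0.3      # assumed margin at which the next measure is built

scenarios_m_per_decade = {"slow": 0.03, "moderate": 0.08, "fast": 0.15}

def trigger_year(rise_per_decade: float, start_year: int = 2025) -> int:
    """Year in which the remaining freeboard first erodes to the trigger margin."""
    usable_margin = CURRENT_FREEBOARD_M - TRIGGER_MARGIN_M
    decades = usable_margin / rise_per_decade
    return start_year + int(decades * 10)

for name, rate in scenarios_m_per_decade.items():
    print(f"{name:>8} scenario: next adaptation step needed around {trigger_year(rate)}")
```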

The Katrina experience reinforced the need for up-to-date data and analytical tools, a system-of-systems approach, and adequate resources to enable action within a time frame commensurate with the threat to society. The value of comprehensive hazard and risk information for both understanding vulnerability and designing appropriate risk mitigation was demonstrated. The new Hurricane and Storm Damage Risk Reduction System (HSDRRS) was a game changer for the future viability of New Orleans. In the implementation of the HSDRRS, the USACE upgraded a majority of its engineering and technical guidance documents to include the new tools of risk and resilience in its overall processes.

While these changes signaled a new regime for natural hazard mitigation, legacy concepts persist that may be problematic for the future. First, the concept of a specified project life. The HSDRRS assumed a 50-year project life. Designing for a 50-year project life requires the ability to understand the conditions that will exist in 50 years, a requirement that is becoming ever more difficult with the non-stationarity of hazards and changes in land use and demographics (Stakhiv & Hiroki 2021). The currently proposed ‘Coastal Texas’ project, a plan to reduce risk in the critical Galveston-Houston area, faces the same constraint: designing for a 50-year project life when, given current funding, the project may not even be completed within 50 years.

In its appropriation of funds to build the HSDRRS, Congress stipulated that the new system be built for a 100-year storm, that is, a hazard with a p = 0.01 chance of being exceeded in any year. This is the criterion adopted by the (US) National Flood Insurance Program, which uses a ‘base flood’ as the yardstick for judging hazards (Robinson 2004). While this may be appropriate for insurance purposes, it has flaws. First, flood insurance maps and rates are based on current conditions, giving little consideration to the ability of a mitigation system to perform over its project life. Second, the standard does not consider the residual risk faced by a community. Many environmental justice advocates consider this criterion too hazardous, and some states and local jurisdictions, such as California, now require a more conservative design (DWR 2007).
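A worked figure helps show why a ‘base flood’ standard and a fixed project life sit uneasily together. Under the stationary assumption, the probability of at least one exceedance of the 1% annual-chance (100-year) event during a 50-year project life is

\[ 1 - (1 - 0.01)^{50} \approx 0.39, \]

roughly a four-in-ten chance over the life of the project, before any allowance for non-stationarity or residual risk.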

While life safety has become more prominent in flood risk discussions, other societal factors are also growing in legal standing. The original use of benefit–cost analysis as a criterion for investment (flood protection measures should provide more benefits than their cost) has become complicated by the desire to include a greater number of non-monetary factors. Benefits now include (1) national economic development (monetized benefits and cost), (2) regional economic development, (3) environmental justice and other social effects, and (4) environmental quality. The challenge is in integrating these four accounts into coherent decision-making.

The last, and perhaps most nefarious, issue is non-stationarity. The growing uncertainty about sea level rise, storm frequency, and the performance of ageing infrastructure are all Achilles heels for standard probabilistic tools, which assume that past data are a prelude to the future. That long-held assumption is in question as the non-stationarity of climatic processes unfolds. However, natural processes are not the only sources of non-stationarity. We have long coped with compensating for how changes in land use (urbanization) and demographics have modified watershed responses to weather events, and such changes may have as large an effect on risk as climate change (Wing et al. 2022).

The quest is on for a practical approach to non-stationarity. USACE published a report in 2018, Floods and Non-Stationarity, A Review, CWTS 2018-01. Based on a 2010 multi-agency workshop, the report examined approaches to extending the basic concepts of frequency, risk, and reliability to non-stationary data. Most past efforts have focused on normalizing historical data to current conditions so that standard statistical tools can be applied. Expanded approaches considered assuming a sequence of different equilibrium states, each analyzed as a stationary population.
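A hedged sketch of the ‘normalize to current conditions’ idea: a hypothetical annual-maximum record with a linear trend is re-referenced to the last year of record, after which a standard stationary fit can be applied. The data, the linear trend model, and the Gumbel distribution are all illustrative assumptions, not prescribed methods.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical 60-year annual-maximum stage record (m) with an upward trend,
# standing in for a non-stationary series driven by land-use or climate change.
years = np.arange(1965, 2025)
trend = 0.01 * (years - years[0])  # assumed +1 cm per year
stages = stats.gumbel_r.rvs(loc=3.0, scale=0.4, size=years.size, random_state=rng) + trend

# "Normalize" the record to current conditions: fit a linear trend and
# re-reference every observation to the final year of record.
fit = stats.linregress(years, stages)
normalized = stages + fit.slope * (years[-1] - years)

# A stationary Gumbel fit to the normalized series then yields a
# current-conditions estimate of, e.g., the 1% annual-chance stage.
loc, scale = stats.gumbel_r.fit(normalized)
stage_100yr = stats.gumbel_r.ppf(0.99, loc=loc, scale=scale)
print(f"Fitted trend ~ {fit.slope * 100:.1f} cm/yr; "
      f"1% annual-chance stage (current conditions) ~ {stage_100yr:.2f} m")
```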

Similar methods can certainly be applied to examine future conditions, that is, normalizing a future situation to current conditions; however, assigning a likelihood to the future scenario is problematic. As Niels Bohr said, ‘Forecasting is difficult, especially if it involves the future.’ Forecasting futures becomes increasingly questionable as the relevance of past trends (the usual baseline for forecasting) diminishes. When going beyond changes in the physical environment (i.e., weather), the complexities multiply. Projecting future land use changes is not easy, nor is forecasting property values, population density, and demographics.

The potential for significant increases in non-stationarity across both natural and societal factors casts a cloud over the routine use of traditional probabilistic tools in long-term decision-making. However, this does not diminish the value of examining future scenarios and using them as a decision aid. Understanding sensitivity to a spectrum of possible changes can provide a foundation for planning. The viability improves if one abandons a fixed project life in favor of incremental adaptation, pinning decisions to implement step-by-step adaptive measures to physical or societal conditions as they arise. This, however, is disruptive with respect to our current planning and project implementation processes.

The post-Katrina regime brought probabilistic methods and sophisticated analytical tools to bear and considerably upped the game for analysis and forecasting. Quantified uncertainty replaced deterministic metrics for dealing with unknown and extreme events. The need for more frequent updating of standards and guidance was obvious, and many were reconstituted to include risk concepts as a component of decision-making. Funding for major projects, at least for the Katrina recovery effort and the reconstitution of flood risk reduction in New Orleans, was revolutionized. Similar changes occurred in seismic hazard analysis and risk assessment, including nationwide analyses and information that formed the basis of new design criteria and standards. Following Sandy, USACE began a series of regional risk and resilience analyses, starting with the North Atlantic Coast Comprehensive Study. There is nothing like a disaster to evolve analysis and practice.

Since Katrina, we have entered a regime in which probabilistic methods have been expanded to provide comprehensive data sets for hazard events, enabling a more sophisticated understanding of the vulnerability of communities and a better foundation for mitigating risk, at least in the near term. Non-stationarity has raised its head, threatening to undermine the validity of traditional statistical approaches. Particularly for areas threatened by coastal storms and sea level rise, there may be a new ball game on the horizon. The past is no longer prologue; in fact, the past may become just one scenario among many used to understand sensitivity to change and to plan for what might happen. The Netherlands pioneered a scenario-based analysis that explores a variety of possible conditions, leading to a strategy focused on incremental adaptation as change happens rather than committing to a solution based on a ‘most likely’ project-life projection. Scenario-based adaptation may be a more realistic approach to coping with long-term uncertainty and short-term funding habits.

Communities also face multi-hazard risk. While major hazards such as floods, earthquakes, and high winds draw the most attention, others, like heat and drought, are more difficult to characterize and prepare for. It has become clear that most hazard events are, in reality, compound events, with conditions and impacts shaped by multiple sources. Houston's flooding losses during Harvey in 2017 were influenced by storm surge, extreme rainfall, years of minimal land use management, and the lack of an integrated plan or response capability.

Yet as we move into the future, our funding strategies seem to be regressing. The massive Coastal Texas project currently proposed to Congress is wrestling with funding spread over a period of decades rather than the full up-front appropriation of the post-Katrina model. As in New Orleans, Congress has continued to place a 100-year ceiling on the design of the new system and has retained the project-life construct, with only limited consideration of adaptation.

Given the apparent inevitability of non-stationarity becoming an increasing challenge, we are faced with transforming analysis and practice to accommodate reduced dependence on probabilistic approaches and on forecasting the likelihood of rare events. With a decline in the viability of probabilistic forecasting, we would need to set new decision criteria, since any ‘risk’ appraisal will contain more and more uncertainty. It may make more sense to define tolerable losses and focus on the degree of mitigation necessary to prevent that level of loss, regardless of its likelihood. This was the approach adopted by Sir Hermann Bondi in his 1966 report to the UK Government appraising the need for a Thames River Barrier (Baxter 2005):

The stakes are very high indeed and loss of life could be considerable even if only a small part of the operation failed to function as intended. As well as the risk of drowning, [a] really large disaster has effects that can go on for decades, that can give such a jolt to the whole economy that the loss in national income is quite strangling and can, through the death of a sufficiently large number of highly qualified people, immeasurably impoverish the life of everyone in the country. Such a disaster could well be considered unbearable in the sense that we would be foolish to contemplate letting it come to pass without taking every reasonable avoiding action. In the nature of things, disasters of this kind are unique. I think it is just as incumbent on the government to prevent such a disaster as it is, for example, to prevent an enormous outbreak of smallpox. The precise probability of the disaster occurring then becomes a relatively unimportant matter.

The bottom line seems to be that we have not substantially crawled out of the comfortable rut of short-term process optimization, investments, and actions. While significant advances are being made in R&D, the efforts are mostly improving or enhancing the approaches in practice. Far too little attention is being paid to potential disruptive changes looming in the future.

Nationally, we lack a sense of urgency for establishing robust plans and investments to shift the emphasis from recovery to resilience. Coastal Texas is one major initiative, but without rigorous consideration and inclusion of the uncertainty that the project may face, it may well become a first cousin of the Venice Barrier project – a massive, but under-designed, project to protect the Venetian Lagoon from high tides and rising sea levels. Perhaps the most significant barrier is not in development or engineering, it is the inconsistency of policy for the future. Policy changes can no longer be steered via the rear-view mirror. The challenges of the future can only be effectively addressed if policy is an enabler of the changes in knowledge and practice necessary to deal with what may come as well as what is here.

Perhaps the most difficult paradigm shift is synchronizing the evolution of practice and policy to meet the spectrum of future challenges. Routine use of scenarios, sensitivity analysis, and adaptation that embraces the physical, social, and environmental aspects of communities appears to be one viable approach to coping with this challenge. However, the transition to incremental design, construction, funding, and governance will be a major disruptive force, given the penchant for short-term thinking, election-cycle investments, and reluctance to change. Perhaps the time has come for the long-needed but ignored concept of a national strategy for dealing with natural hazards.

Given reasonable estimates of the likelihood of events and the magnitude of associated losses, risk evaluated for scenarios provides a valuable metric for planning. The challenge is the availability of ‘accurate’ information. Even the most comprehensive data and models contain uncertainty, some from incompleteness, some from questions of model validity, and some from inherent variability in natural processes. A significant component of the post-Katrina risk analysis of New Orleans was a quantitative assessment of uncertainty. The results were startling: risk estimates at a given location spanned more than an order of magnitude of uncertainty (USACE 2008). Given the level of effort in the New Orleans analysis, the uncertainty in estimates from less comprehensive studies is concerning.
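To make the scale of that uncertainty concrete, a hedged sketch: expected annual damage (EAD) is the area under the damage versus annual-exceedance-probability curve, and sampling an uncertain multiplier on the damage curve propagates uncertainty into the risk estimate. The curve and the spread of the multiplier below are illustrative assumptions, not the IPET values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative damage-exceedance curve: damages ($M) at fixed annual
# exceedance probabilities, as might come from depth-damage modelling.
aep    = np.array([0.10, 0.04, 0.02, 0.01, 0.004, 0.002])   # descending
damage = np.array([  5.,  40., 120., 400., 900., 1500.])    # $M, assumed

def expected_annual_damage(damages: np.ndarray) -> float:
    """EAD = area under the damage vs. exceedance-probability curve (trapezoids)."""
    widths = aep[:-1] - aep[1:]                     # probability increments
    mid_damage = (damages[:-1] + damages[1:]) / 2.0
    return float(np.sum(widths * mid_damage))

best_estimate = expected_annual_damage(damage)

# Propagate uncertainty: scale the damage curve by a lognormal factor whose
# 5th-95th percentile range spans roughly an order of magnitude (sigma chosen
# for illustration), then recompute the EAD for each sample.
factors = rng.lognormal(mean=0.0, sigma=0.7, size=10_000)
ead_samples = np.array([expected_annual_damage(damage * f) for f in factors])
lo, hi = np.percentile(ead_samples, [5, 95])

print(f"EAD best estimate ~ ${best_estimate:.0f}M per year; "
      f"5th-95th percentile ~ ${lo:.0f}M-${hi:.0f}M")
```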

When the processes themselves, both natural and societal, morph over time, we are faced with a greater issue, non-stationarity, under which the assumption that the past is a viable basis for forecasting the future no longer holds. While we are reasonably adept at ‘normalizing’ past data to current conditions, our toolbox is nearly empty when it comes to forecasting under the deep uncertainty that lies ahead (Cox 2012).

When risk estimates for the future are no longer relevant, we need different methods. Scenarios representing a range of potential future conditions provide a means to understand the sensitivity to different types and magnitudes of change. A new twist is not labeling scenarios with an expected timeframe or ranking them with respect to probability. Adaptation provides a means to implement solutions as conditions change, essentially just enough and just in time. Gone is the need to forecast long-term futures and fund grand schemes with current budgets. The third piece is policy. Governance must be supportive of new approaches, including the mechanisms for planning and funding ‘projects’. Policy and guidance need to evolve toward regional assessments and integrated planning that considers change not just in the physical world, but also in society. In short, we need an integrated natural hazards strategy that anticipates change and enables adaptation.

All relevant data are included in the paper or its Supplementary Information.

The authors declare there is no conflict of interest.

ASCE (2007) The New Orleans Hurricane Protection System: What Went Wrong and Why; A Report. Reston, VA: American Society of Civil Engineers.

Baxter, P. J. (2005) The east coast Big Flood, 31 January–1 February 1953: a summary of the human disaster, Philosophical Transactions of the Royal Society A, 363, 1293–1312.

DWR (2007) A California Challenge – Flooding in the Central Valley. Sacramento, CA: State of California, Department of Water Resources.

NRC (2009) The New Orleans Hurricane Protection System: Assessing Pre-Katrina Vulnerability and Improving Mitigation and Preparedness. Washington, DC: National Academies Press.

Robinson, M. F. (2004) History of the 1% Chance Flood Standard. Background Paper, Gilbert F. White National Flood Policy Forum, September 21–22, 2004. Washington, DC: National Academies Press.

Stakhiv, G. Z. & Hiroki, K. (2021) Special issue for UN HELP: ‘Water infrastructure planning, management and design under climate uncertainty’, Water Policy, 23, 1–9.

Tsimopoulou, V., Kok, M. & Vrijling, J. K. (2015) Economic optimization of flood prevention systems in The Netherlands, Mitigation and Adaptation Strategies for Global Change, 20, 891–912.

USACE (2007) Performance Evaluation of the New Orleans and Southeast Louisiana Hurricane Protection System, v1 – Executive Summary and Overview. Washington, DC: US Army Corps of Engineers.

USACE (2008) Performance Evaluation of the New Orleans and Southeast Louisiana Hurricane Protection System, v8 – Engineering and Operational Risk and Reliability Analysis. Washington, DC: US Army Corps of Engineers.

USACE (2009) A General Description of Vulnerability to Flooding and Risk for New Orleans and Vicinity: Past, Present, and Future. Washington, DC: US Army Corps of Engineers.

Wing, O. E. J., Lehman, W., Bates, P. D., Sampson, C. C., Quinn, N., Smith, A. M., Neal, J. C., Porter, J. R. & Kousky, C. (2022) Inequitable patterns of US flood risk in the Anthropocene, Nature Climate Change, 12(2), 156–162.

Woolley, D. & Shabman, L. (2008) Decision-Making Chronology for the Lake Pontchartrain & Vicinity Hurricane Protection Project. Final Report to Headquarters. Washington, DC: US Army Corps of Engineers.