Abstract
Global assessments show profound impacts of human activities on freshwater systems that, without action, are expected to reach crisis point in the 2030s. By then, the capacity of natural systems to meet rising demands for water, food, and energy could be hampered by emerging signals of anthropogenic climate change. The hydrological community has always been solution-orientated, but our generation faces perhaps the greatest array of water challenges in human history. Ambitious programmes of research are needed to fill critical data, knowledge, and skills gaps. Priorities include filling data sparse places, predicting peak water, understanding the physical drivers of mega droughts, evaluating hyper-resolution models, managing compound hazards, and adjusting water infrastructure designs to climate change. Despite the opportunities presented by big data, we must not lose sight of the deep uncertainties affecting both our raw input data and hydrological models, nor neglect the human dimensions of water system change. Community-scale projects and international research partnerships are needed to connect new hydrological knowledge with most vulnerable communities as well as to achieve more integrated and grounded solutions. With these elements in place, we will be better equipped to meet the global hydrological challenges of the 2030s and beyond.
INTRODUCTION
The future is much closer than it used to be. A decade ago, warnings about an approaching global food, energy, and water crisis were a cloud on the distant horizon. At that time, projections of economic and population growth by the 2030s suggested a 50% rise in the demand for food and energy, with a 30% increase in water requirements, while having to contend with the early signs of anthropogenic climate change (Beddington 2009). Other drivers of change to national security, social cohesion, economic prosperity, and environmental sustainability are also emerging. For instance, ‘megatrends’ in public debt, information and communication technology, urbanization, and the rise of the individual, all shape global patterns of natural resource consumption and create new vulnerabilities (KPMG 2014; Wilby 2017). Two thirds of the human population could experience progressive increases in drought conditions under global warming (Naumann et al. 2018). With greater interconnectedness of economies and supply chains, we are all potentially vulnerable, whether directly or indirectly, to extreme hydrological events.
Meanwhile, numerous global assessments are cataloguing the profound impacts of human activities on freshwater systems. These include measurable groundwater depletion across swathes of North Africa, North America, the Middle East, and South Asia (Rodell et al. 2018); widespread river regulation, watershed disturbance and contamination (Vörösmarty et al. 2010). Evolving and/or intensifying threats to freshwater biodiversity also come from changing climates; global internet commerce and spread of non-native species; infectious diseases; harmful algal blooms; expanding hydropower; emerging contaminants; engineered nanomaterials; microplastic pollution; light and noise; freshwater salinization; declining calcium; and cumulative stressors (Reid et al. 2019: 849).
As the ‘perfect storm’ of the 2030s forecasted by Beddington (2009) closes in, there appears to be greater urgency about tackling the climate–food–energy–water nexus through policy and planning at various levels (e.g. Bazilian et al. 2011; Conway et al. 2015; Yang et al. 2016). In 2015, the United Nations Sustainable Development Goals succeeded the Millennium Development Goals. The new global vision set targets for 2030 based on 17 themes that are highly interdependent and intrinsically water-related (Harmancioglu 2017). Goal 6 is to ensure availability and sustainable management of water and sanitation for all. Yet, more impetus has been added by the realization that global mean warming of 1.5 °C could be reached as soon as 2030 with concomitant risks to human and natural systems (IPCC 2018). Governments and institutions now find themselves under pressure to acknowledge a ‘climate emergency’ and to commit to much more ambitious mitigation pathways to achieve net zero emissions.
How should the hydrological community respond to such major imperatives? What new information and research are needed to support a concerted global effort to manage the water challenges of the 2030s and beyond? Traditionally, we have arranged our knowledge within thematic silos that have changed little over the last 25 years (Table 1). However, greater emphasis is now needed on preparing for hydro-system changes that could fall outside historic variability (Wagener et al. 2010). This Comment Paper makes a case for a more integrated, solution-orientated approach to the global water-related challenges identified above (following Carpenter et al. 2009; Pahl-Wostl et al. 2013; Green et al. 2017a). These thoughts were shared in a closing address to the British Hydrological Society National Meeting in September 2018, then updated to reflect later developments.
Major themes addressed by the fourth (1993) and thirteenth (2018) British Hydrological Society National Symposia
4th BHS National Symposium 1993 | 13th BHS National Symposium 2018
---|---
Water Resources | Droughts, Low Flows and Resources
Environmental Aspects | Advances in Hydroecology
– | Advances in Sediments and Habitats
Information Systems | Advances in Hydrometric Data
Floods | Flood Hydrology
Modelling Developments | Advances in Modelling
Research opportunities surrounding six solution sets are identified. These are based on the personal reflections of an applied hydroclimatologist with the good fortune of working in five continents over the last 30 years. A call is made for an agenda aimed at (1) filling data sparse places; (2) predicting peak water; (3) understanding the physical drivers of mega droughts; (4) evaluating hyper-resolution models; (5) managing compound hazards; and (6) adjusting engineering standards (under non-stationary conditions). These are followed by a section that reflects on ways of working together that strengthen multidisciplinary integration and research impact. Throughout, the emphasis is on supporting the most vulnerable communities. It is hoped that these suggestions will inform wider discourse about the future direction and priorities for hydrological science, including associated training needs. There is also scope for the closer alignment of our research and development programmes with complementary disciplines.
FILLING DATA SPARSE PLACES
Quality assured hydrological data are needed to adaptively manage resources, calibrate remotely sensed observations, and build and test models. Nonetheless, there are vast tracts of Earth lacking ground measurements of fundamental water balance components (i.e. precipitation, evapotranspiration and changes in ice, lake/wetland, soil, or groundwater storage). Network densities are particularly sparse in the Arctic, sub-Saharan Africa (apart from South Africa), central Asia, the Pacific Islands, and South America. High-altitude regions and fragile states are especially under-represented. Even where there are data, records may be incomplete due to lack of resources for personnel and equipment or because of conflict; there may be homogeneity issues due to site, instrument, or observer changes; data may be corrupted at any point in the information flow or held in inaccessible formats (Wilby et al. 2017). For example, temporal variations in the rating curves used to relate channel cross-section, flow depth, and discharge are a source of uncertainty even for well-resourced agencies (see Slater et al. 2019).
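As a simple illustration of how rating-curve behaviour propagates into discharge estimates, the sketch below fits a power-law rating curve to hypothetical stage–discharge gaugings and then extrapolates beyond the highest gauging, where uncertainty is largest; all values, parameter names, and error magnitudes are assumptions for demonstration only.

```python
# Minimal sketch: fitting a power-law rating curve Q = a * (h - h0)^b to
# hypothetical stage-discharge gaugings, to illustrate how shifts in the
# fitted parameters translate into discharge uncertainty.
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, a, b, h0):
    """Power-law rating curve relating stage h (m) to discharge Q (m^3/s)."""
    return a * np.maximum(h - h0, 0.0) ** b

# Hypothetical gaugings (stage in m, discharge in m^3/s) with ~5% gauging error
rng = np.random.default_rng(42)
stage = np.linspace(0.5, 3.0, 12)
true_q = rating_curve(stage, a=15.0, b=1.8, h0=0.2)
gauged_q = true_q * rng.normal(1.0, 0.05, stage.size)

params, cov = curve_fit(rating_curve, stage, gauged_q, p0=[10.0, 1.5, 0.1])
a_fit, b_fit, h0_fit = params

# Extrapolating beyond the highest gauging is where rating uncertainty bites
print(f"Fitted a={a_fit:.1f}, b={b_fit:.2f}, h0={h0_fit:.2f}")
print(f"Estimated discharge at 4.0 m stage: {rating_curve(4.0, *params):.0f} m^3/s")
```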
A range of initiatives are needed to fill data gaps. Equipment can be installed in strategically important places – so-called sentinel locations – where hydrological changes are most likely to be detected (Fowler & Wilby 2010). For instance, there is evidence that high elevation sites are warming more rapidly than the global mean (Pepin et al. 2015), but this may only hold true up to 5000 m (Gao et al. 2018). A spring 2019 expedition established automatic weather stations on Mount Everest, including the highest station in the world (at 8,430 m on the Balcony of the Southeast Ridge) (Wilkinson 2019). Clearly, the logistical, physical, and technical challenges that must be faced in such environments are extraordinary. However, more of these data are urgently needed to deepen understanding of the behaviour of the Hindu Kush Himalaya (HKH) water towers that ultimately support the livelihoods, water, and energy needs of more than 2 billion people (Immerzeel et al. 2010).
It is generally accepted that some of the most data sparse regions are also the most vulnerable to hydroclimatic hazards – globally, data are least available where they are most needed. Temporal scaling can extrapolate sub-daily, even sub-hourly, precipitation intensities for engineering design, where daily data exist (e.g. Courty et al. 2019). Alternatively, geostatistical techniques might be used to blend fragments of in situ data with remotely sensed information (e.g. Wilby & Yu 2013), including proxies for hydrological variables (e.g. Najmaddin et al. 2017) to run conceptual and/or distributed hydrological models (e.g. Samaniego et al. 2011). Basic ‘hot-spot’ sensitivity analyses may then identify communities most at risk or inform the design of future hydrometeorological network expansions. Comprehensive assessments are also needed to compare temporal and spatial variations in the skill of various remotely sensed, re-analysis, model- and ground-based hydro-meteorological data (e.g. Sun et al. 2018; Pritchard et al. 2019). Even then, post-processing and bias correction may be required to convert gridded products to point data. The development of high-resolution gravity-based methods could literally ‘weigh’ water changes at whole catchment scales, thereby perhaps even supplanting traditional river gauging techniques (e.g. Gouweleeuw et al. 2018). Community-wide efforts are needed to build hyper-resolution (∼1 km) models for monitoring terrestrial water, energy, and biogeochemical cycles (Wood et al. 2011).
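As one illustration of such blending and post-processing, the sketch below applies empirical quantile mapping to bias-correct a synthetic gridded daily precipitation series against a co-located gauge record; the gamma-distributed series and the quantile resolution are assumptions for demonstration only.

```python
# Minimal sketch: empirical quantile mapping to bias-correct a gridded daily
# precipitation product against a co-located gauge record (illustrative data).
import numpy as np

def quantile_map(gridded, gauge, target):
    """Map 'target' values from the gridded distribution onto the gauge distribution."""
    quantiles = np.linspace(0.01, 0.99, 99)
    grid_q = np.quantile(gridded, quantiles)
    gauge_q = np.quantile(gauge, quantiles)
    # Interpolate each target value to its quantile in the gridded record,
    # then read off the corresponding gauge value.
    return np.interp(target, grid_q, gauge_q)

rng = np.random.default_rng(0)
gauge = rng.gamma(shape=0.7, scale=8.0, size=3650)    # hypothetical gauge (mm/day)
gridded = rng.gamma(shape=0.9, scale=5.0, size=3650)  # hypothetical gridded product

corrected = quantile_map(gridded, gauge, gridded)
print(f"Gauge mean {gauge.mean():.2f}, gridded {gridded.mean():.2f}, corrected {corrected.mean():.2f}")
```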
Considerable data assets are still held in paper archives that have yet to be digitized. Recent projects demonstrate how well-coordinated public engagement (‘citizen scientists’) can recover otherwise dormant data. For example, the digitization of 20 years of nineteenth century hourly weather data from the summit of Ben Nevis revealed near-zero atmospheric humidity episodes at high altitudes (Burt & Hawkins 2019). Similarly, the rediscovery of a very long sleet and snowfall record for the greater London area by Murphy et al. (2019) showed that a widely accepted narrative of wetter winters in England and Wales since 1766 is, most likely, an artefact of snowfall under-catch in the pre-1860s. The upward winter precipitation trend becomes insignificant when corrected for snowfall. Hence, even in data-rich regions, old data viewed through a modern lens can sometimes challenge deeply held convictions about hydrological change (or not).
PREDICTING PEAK WATER
There are widespread concerns about growing water scarcity and associated risks to global food security (Falkenmark 2013), including the vulnerability of the international trade in staple crops to hydroclimatic shocks (d'Amour et al. 2016). Peak non-renewable water occurs in systems where rates of groundwater pumping or contamination lead to a maximum of production followed by a decline (analogous to the concept of peak oil). This stage has already been passed in the aquifers of Ogallala and California's Central Valley, the North China Plains, and in Andhra Pradesh, Rajasthan, and Tamil Nadu, India (Gleick & Palaniappan 2010). One estimate suggests that present global groundwater withdrawals are equivalent to a footprint that is 3.5 times the actual area of the aquifers and that 1.7 billion people live in regions where groundwater resources are under threat (Gleeson et al. 2012: 197).
One third of humanity depends on water supplies from large glacierized catchments (Huss & Hock 2018). Hence, global shrinkage of glaciers and associated changes in meltwater and runoff will have profound implications for water supplies, irrigation, and the energy security of downstream communities. In parts of the Tien Shan, Central Asia, meltwater from glacial stores could culminate in the 2020s (Sorg et al. 2014). However, there is uncertainty about the exact timing of peak meltwater in most regions. This partly stems from a lack of long-term data as noted above. Peak water also depends on the size of the catchment and relative contributions of ice, snow, thawing permafrost, and rainfall to total flow. Climate model and emissions scenario uncertainty matter too, especially by the end of the century when projections diverge according to the mitigation pathway followed (e.g. Lutz et al. 2013; Shannon et al. 2019).
Overall, peak meltwater is expected to occur later in larger catchments (Huss & Hock 2018) and on the eastern (rather than western) slopes of the Tien Shan-Pamir-Karakoram mountain complex (Luo et al. 2018). Dynamical ablation models tend to show peak meltwater in later decades than mass balance models, because they can sustain snow and ice flux from higher elevations to the melt zone for longer, rather than simply ablating as a static block (Kure et al. 2013; Ohara et al. 2014). However, both types of ablation model are confounded by large deficiencies in climate model representations of physical processes over mountainous regions, including temperature lapse rates, land surface feedbacks, and precipitation (Dobler et al. 2012). More research is needed to reconcile differences between glacier melt models within the significant constraints of input data and parameter uncertainty.
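For readers unfamiliar with the melt-model family discussed here, the sketch below implements its simplest member, a temperature-index (degree-day) scheme; the degree-day factors, lapse rate, and example values are illustrative assumptions and do not represent either of the cited model classes.

```python
# Minimal sketch: a temperature-index (degree-day) ablation model, the simplest
# member of the melt-model family discussed above. Degree-day factors and the
# lapse rate are illustrative assumptions, not calibrated values.
DDF_SNOW, DDF_ICE = 4.0, 7.0   # assumed degree-day factors (mm w.e. per deg C per day)
LAPSE_RATE = -0.0065            # deg C per m, assumed linear lapse rate

def daily_melt(t_station, station_elev, cell_elev, snow_depth_mm):
    """Return melt (mm w.e.) for one grid cell and one day."""
    t_cell = t_station + LAPSE_RATE * (cell_elev - station_elev)
    if t_cell <= 0.0:
        return 0.0
    potential = DDF_SNOW * t_cell
    if snow_depth_mm >= potential:
        return potential                      # melt drawn entirely from snow
    # exhaust the snowpack, then melt exposed ice at the higher factor
    remaining_degree_days = t_cell - snow_depth_mm / DDF_SNOW
    return snow_depth_mm + DDF_ICE * remaining_degree_days

# Example: a warm day at a cell 800 m above the reference station
print(f"{daily_melt(t_station=12.0, station_elev=3600, cell_elev=4400, snow_depth_mm=10.0):.1f} mm w.e.")
```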
Although observed ice loss in the western Himalayas has accelerated over the last 40 years, it is too simplistic to attribute such trends entirely to rising temperatures; atmospheric deposition of black carbon (sooty particulates) and regional precipitation changes are also important drivers (Lutz et al. 2014; Maurer et al. 2019). Reliable projections of meltwater contributions from individual basins are hindered by the lack of information on seasonal snowpack, ice and hydrometeorological changes at high elevations, a partial understanding of the significance of black carbon and debris cover to ice melt, and by low confidence in climate model projections of precipitation changes, especially over the HKH. There are opportunities to develop robust data gathering techniques aimed at calibrating models, such as extracting snowline observations from remote imagery as a mass balance indicator (e.g. Barandun et al. 2015). More information on precipitation changes above 5,000 m is urgently needed, as well as parsimonious ablation models that include important processes such as sublimation (e.g. Wimmer et al. 2009).
The timing of peak water may be uncertain, but earlier, more rapid melt of snowpack followed by diminished summer flows is widely observed and consistently produced by model simulations under rising air temperatures. Seasonal changes in the timing and volume of meltwater have implications for the safety and operation of downstream hydropower facilities. Hence, procedures are being developed to seasonally forecast snowpack runoff to support the management of reservoir storage (Archer & Fowler 2008; Dixon & Wilby 2016; Apel et al. 2018). Forecast skill rests on links between leading modes of climate variability, such as ENSO, and regional hydrological hazards (Emerton et al. 2017). Lagged relationships between teleconnection patterns, winter precipitation, and spring snowmelt can then be exploited for inflow forecasting (e.g. Dixon & Wilby 2019). Generally, modest levels of skill mean that such techniques are best regarded as outlooks of above, near, or below average conditions, rather than precise forecasts. Further work is needed to account for non-stationarity in predictability, due to Arctic amplification causing poleward movement of the jet stream; expansion of the Hadley cell; interactions between climate modes; or variations in peak amplitude, duration, timing, and spatial patterns of sea surface temperature (SST) anomalies.
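A minimal sketch of such a lagged outlook is given below: spring inflow is regressed on a winter climate index and forecasts are issued as tercile categories rather than point values; the index, lag, and synthetic data are assumptions for illustration only.

```python
# Minimal sketch: a lagged-regression outlook of spring inflow from a winter
# teleconnection index (e.g. an ENSO or SST index). Data are synthetic; in
# practice the index, lag, and skill would be established from observations.
import numpy as np

rng = np.random.default_rng(1)
n_years = 40
winter_index = rng.normal(0, 1, n_years)                               # hypothetical climate index
spring_inflow = 100 + 15 * winter_index + rng.normal(0, 10, n_years)   # synthetic inflows

# Fit the lagged linear relationship on a training period
slope, intercept = np.polyfit(winter_index[:30], spring_inflow[:30], 1)

# Issue categorical outlooks (below / near / above normal) rather than point forecasts
terciles = np.quantile(spring_inflow[:30], [1/3, 2/3])
forecast = intercept + slope * winter_index[30:]
category = np.digitize(forecast, terciles)   # 0 = below, 1 = near, 2 = above normal
print("Forecast categories for the verification years:", category)
```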
UNDERSTANDING THE PHYSICAL DRIVERS OF MEGA DROUGHTS
During a recent water management class, an undergraduate student asked why the Atlantic Multidecadal Oscillation (AMO) varies with a typical periodicity of 60–80 years. [If anyone has the answer, please let me know.] The question is perceptive because the AMO is linked to multi-decadal variations in the climate of northwest Europe (Sutton & Dong 2012). During warm (cold) phases, summer rainfall is higher (lower) than average. Hence, anticipation of the AMO suggests foresight of rainfall anomalies – an expected return to cooler Atlantic SSTs in the 2020s and 2030s could imply a greater likelihood of severe European summer droughts (as in the 1970s and 1980s). Such information is highly pertinent to the UK water industry, given that the latest guidance requires companies to evaluate contingencies for a challenging but ‘plausible worst case’ drought that could exceed the coping capacity of existing and planned water supply systems (Environment Agency and Natural Resources Wales 2018).
Specification of a plausible extreme (mega) drought for resilience testing – here with a notional 200-year return period – is a technically demanding task. Although data on mega drought occurrence have improved thanks to multi-centennial reconstructions (e.g. Cook et al. 2015; Wilby et al. 2015), higher-resolution information is needed for water resource systems modelling. This gap is being filled by stochastic weather generators and extreme value analysis, but there are concerns about the physical realism of the underlying models. The physical drivers of drought form a continuum from the seasonal dry-spell (storm track variation), through the multi-season episode (blocking), to the multi-year severe event (ENSO) and the drought-rich decade (AMO) (Pulwarty & Sivakumar 2014). So, is it reasonable to assume that a 2-year return period event can be drawn from the same population as a 200-year mega-drought?
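To make the extrapolation problem concrete, the sketch below fits a GEV distribution to 50 years of synthetic annual maximum rainfall deficits and reads off a notional 1-in-200-year severity, which lies well beyond the largest value in the sample; the data and the distribution choice are illustrative only.

```python
# Minimal sketch: extrapolating a notional 200-year drought severity by fitting
# a GEV distribution to (synthetic) annual maximum rainfall deficits. With ~50
# years of data, the 200-year level lies far outside the sample, which is
# exactly the concern raised above about physical realism.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
annual_max_deficit = rng.gumbel(loc=150, scale=40, size=50)   # synthetic deficits (mm)

shape, loc, scale = genextreme.fit(annual_max_deficit)
deficit_200yr = genextreme.ppf(1 - 1/200, shape, loc=loc, scale=scale)

print(f"Largest observed deficit: {annual_max_deficit.max():.0f} mm")
print(f"Fitted 1-in-200-year deficit: {deficit_200yr:.0f} mm")
```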
Studies of persistent droughts in Australia suggest different processes behind the Federation (∼1895–1902), World War II (1937–1945) and Millennium (2001–2009) events (Verdon-Kidd & Kiem 2009). Moreover, the various causes of drought could interact in ways that yield even more severe events (van Dijk et al. 2013). Multiple (not just single) modes of climate variability could also be driving severe droughts across Europe. For instance, Ionita et al. (2012) link inter-annual to decadal variability in European summer drought to three modes of SST anomalies; a long-term trend associated with warming over all oceans; interactions between the inter-annual ENSO and the Pacific Decadal Oscillation; and the AMO. Similarly, Folland et al. (2015) suggest that multi-annual droughts in the English lowlands could be linked to combinations of ENSO, North Atlantic SSTs, the Quasi-Biennial Oscillation, solar and volcanic forcings, and the AMO. More dynamical ocean–atmosphere modelling of multiple, interacting drivers is needed to uncover the causes of persistent rainfall deficits in the past and to explore how these might change in the future. Once a multi-year drought ends, even the rainfall-runoff relationship can change (Saft et al. 2015), implying that historical yields will not be reliable indicators of future water availability under climate change (Saft et al. 2016).
Hydrologists and climatologists also have shared interests in how soil-moisture feedbacks might amplify the severity of heatwaves and droughts under climate change. It is recognized that precipitation/soil moisture biases in climate models can lead to unrealistic partitioning of surface energy fluxes between land and atmosphere. For example, a dry bias in summer rainfall would lead to overly strong warming of extreme temperatures. Hence, by constraining climate model ensembles with observed correlations between summer rainfall and temperature, it is possible to narrow the range of uncertainty in future projections by excluding improbable models (Vogel et al. 2018). Elimination of such outlier (hot) models also reduces the projected multi-model median. Various global constraints on future changes in the hydrologic cycle have been considered, including tropospheric humidity and both equilibrium and transient mean precipitation (Allen & Ingram 2002), as well as the assumed mass balance between evaporation and precipitation (Liepert & Previdi 2012; Liepert & Lo 2013). Further work is needed to identify physically consistent, observation-based constraints on uncertainty in regional hydrological projections that are analogous to tests of climate model realism (e.g. McSweeney et al. 2015). A deeper understanding of land surface feedbacks could also be deployed in the design of green and blue infrastructure to counter urban heatwaves (e.g. Žuvela-Aloise et al. 2016; Gunawardena et al. 2017).
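To make the constraint logic concrete, the sketch below retains only those ensemble members whose simulated summer rainfall–temperature correlation falls within an assumed observed range; the synthetic ensemble and the acceptance range are illustrative, and the sketch shows the filtering idea rather than the method of Vogel et al. (2018).

```python
# Minimal sketch of the constraint logic described above: retain only ensemble
# members whose simulated summer rainfall-temperature correlation falls within
# an observed plausible range. Data and the acceptance range are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_models, n_years = 20, 40

# Synthetic ensemble: each member has its own rainfall-temperature coupling strength
coupling = rng.uniform(-0.9, -0.1, n_models)
rain = rng.normal(0, 1, (n_models, n_years))
temp = coupling[:, None] * rain + rng.normal(0, 0.5, (n_models, n_years))

sim_corr = np.array([np.corrcoef(rain[m], temp[m])[0, 1] for m in range(n_models)])
obs_corr_range = (-0.7, -0.3)   # assumed observational constraint

retained = (sim_corr >= obs_corr_range[0]) & (sim_corr <= obs_corr_range[1])
print(f"{retained.sum()} of {n_models} members retained by the constraint")
```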
EVALUATING HYPER-RESOLUTION MODELS
The term ‘hyper-resolution’ was used earlier in relation to global water cycle monitoring at km-scales. The resolution of hydrological models is improving across all environments, assisted by progress in remote sensing, including low cost satellite systems and drone-based technologies (National Academies of Sciences Engineering and Medicine 2018). Here, for illustrative purposes, the focus is on urban flood simulation models operating at ∼1–2 m horizontal and ±0.25 m vertical resolution at city-scales (e.g. Yin et al. 2015; Green et al. 2017b). Such models use rainfall forecasts to generate street-level maps of surface water flooding (Henonin et al. 2013) or to identify hotspots of vulnerability in terms of poor accessibility by emergency responders during flood episodes (e.g. Coles et al. 2017). Nowcasting applications assimilate rainfall forecasts every few minutes to update simulations of flood depth and extent. This raises a demanding question: how can distributed models be appropriately tested, given the large uncertainties in initial conditions, inputs, parameters, and outputs? Such challenges are common to all models of natural systems (Oreskes et al. 1994).
Beven (2019a, 2019b) proposes that model evaluation should be approached as a form of hypothesis testing. However, rather than test in a formal statistical sense, a case is made for assessing ‘fitness-for-purpose’. For an operational flood forecasting system, this means appraising model behaviour under assumed boundary conditions. The fitness of a property-level forecast can be evaluated against a wide range of data gathered post hoc (e.g. social media reports of flooding, recorded flood heights on buildings, street furniture or vehicles, insurance claims, remotely sensed data, and aerial surveys). For instance, Yu et al. (2019) compare distributions of simulated pluvial flooding across London with emergency responder data on the day of the EU referendum floods. Likewise, Muthusamy et al. (2019) combine high-resolution inundation data collected by a drone with property-level damage data for Cockermouth, Cumbria, to evaluate fluvial plus pluvial flood simulations of Storm Desmond in 2015. This required special permission from the Civil Aviation Authority for emergency response flights over congested areas, beyond the visual line of sight, and under extreme weather conditions. Understandably, such data sets are relatively rare, so there is scope to build libraries of data for model evaluation that could be shared among research teams.
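As a simple illustration of such fitness-for-purpose checks, the sketch below computes standard contingency-table scores (hit rate, false alarm ratio, and critical success index) for a simulated flood extent against post-event observations rasterized to the model grid; the data, grid size, and disagreement rate are synthetic assumptions.

```python
# Minimal sketch: contingency-table skill scores for judging whether a simulated
# flood extent is fit for purpose against post-event observations (e.g. geocoded
# responder call-outs or surveyed flood outlines) rasterized to the model grid.
import numpy as np

rng = np.random.default_rng(5)
observed = rng.random((200, 200)) < 0.08                 # synthetic observed flooded cells
simulated = observed ^ (rng.random((200, 200)) < 0.03)   # simulation with some disagreement

hits = np.sum(simulated & observed)
false_alarms = np.sum(simulated & ~observed)
misses = np.sum(~simulated & observed)

hit_rate = hits / (hits + misses)
false_alarm_ratio = false_alarms / (hits + false_alarms)
csi = hits / (hits + misses + false_alarms)              # critical success index

print(f"Hit rate {hit_rate:.2f}, false alarm ratio {false_alarm_ratio:.2f}, CSI {csi:.2f}")
```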
Regardless of the domain of hyper-resolution hydrologic modelling (whether global- or street-level), there will always be a need for scale-dependent parameterization of sub-grid processes (Beven et al. 2015). Even at the scale of a high-resolution flood model, there is sub-grid heterogeneity in topography, soil properties, land cover, and vegetation that determine local infiltration rates, storage volumes, and flow pathways. In addition, precipitation amounts are typically interpolated between gauges to the grid-scale, whereas buildings and roads are taken as a snapshot in time. Hence, knowledge of these non-continuous land-surface and meteorological properties is subject to considerable epistemic uncertainty. How best to represent these variations at the catchment scale remains a fundamental research area.
In the meantime, highly localized model output can be made available to stakeholders who have the resident knowledge needed to identify errors (Beven et al. 2015). For example, emergency responders will no doubt be aware of the parts of the road network that regularly flood. Confidence star ratings can be applied to model representations of flood depth/area along with critical segments. Combined model uncertainty may also be represented as probabilistic visualizations of maximum inundation area for specified annual exceedance probabilities (Leedal et al. 2010). Given the danger of inconsistent analysis and messaging, there have been repeated calls for Codes of Practice to formalize the treatment of uncertainty (Pappenberger & Beven 2006) and the communication of flood (Demeritt & Nobert 2014) and drought risk (Climate Outreach 2016). Research is still required on how best to convey highly uncertain model outputs to different stakeholders, in consistent ways, recognizing that preferred formats will be context- and decision-maker-dependent.
MANAGING COMPOUND HYDRO-HAZARDS
Some of the most deadly and costly hydrological catastrophes are due to the coincidence of hazards in space or time. Traditional risk assessments viewed hazards one at a time; now, it is recognized that the likelihood of very high-impact events arising from combinations of climate drivers and/or hazards may have been underestimated (Hillier et al. 2015). For example, summer atmospheric blocking systems near Europe can produce extreme heat, drought, wildfire, and poor air quality as witnessed during the deadly 2010 Russian heatwave (Shaposhnikov et al. 2014). The joint occurrence of storm surge and fluvial and coastal flooding is perhaps the most familiar compound hydro-hazard as suffered during Hurricane Sandy, USA 2012, Typhoon Haiyan, Philippines 2013, and Cyclone Idai, Mozambique 2019. Other threats from hitherto rare hazard combinations, such as extreme heat following a hurricane, are only beginning to be investigated (Matthews et al. 2019). Whether through rising ambient temperatures, higher mean sea levels, more intense rainfall, or severe storms, climate change is generally expected to increase the likelihood of compound hazards (Wahl et al. 2015; Zscheischler et al. 2018).
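To illustrate why such combinations may have been underestimated, the sketch below compares the joint exceedance probability of two correlated hazard drivers with the value implied by assuming independence; the drivers, correlation, and thresholds are purely illustrative.

```python
# Minimal sketch: how dependence between two hazard drivers (e.g. surge and river
# flow) inflates the probability of joint extremes relative to the independence
# assumption used in single-hazard analyses. Data are synthetic and correlated.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]
surge, flow = rng.multivariate_normal([0, 0], cov, size=n).T

surge_threshold = np.quantile(surge, 0.99)   # marginal 1% exceedance levels
flow_threshold = np.quantile(flow, 0.99)

p_independent = 0.01 * 0.01
p_joint = np.mean((surge > surge_threshold) & (flow > flow_threshold))

print(f"Independence assumption: {p_independent:.6f}")
print(f"Empirical joint exceedance: {p_joint:.6f}  (~{p_joint / p_independent:.0f}x larger)")
```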
Compound hazard definitions also include events with near temporal coincidence but manifested at separate locations. In addition, there are cascading disasters that arise from a single hazard which then triggers a chain of events resulting in large-scale impacts on lives and livelihoods (Cutter 2018). An example of the former is widespread, multi-basin fluvial flooding as witnessed in England and Wales during autumn 2000. At the height of the floods, nearly 20% of the drained area of these countries was recording near simultaneous annual maximum flows (De Luca et al. 2017). Moreover, peak flows tend to follow very severe gales, resulting in spatially distributed yet near coincident wind and flood damage. An example of a cascading hazard would be the extreme rainfall intensities in summer 2016 which led to flash-flooding and severe debris flow impacts on the small rural town of Braunsbach, Germany (Bronstert et al. 2018). This study highlighted the challenges of quantifying the multiple and cascading drivers behind the extreme event. Another illustration would be the link between heavy rainfall, urban surface water flooding, and cholera outbreaks in Senegal (de Magny et al. 2012).
Compound hazard analysis suggests that worst case years used for insurance purposes in catastrophe models are likely to be more costly than previously thought. There may also be ‘hot-spots’ where different types of hydrological extreme tend to recur at the same place (Collet et al. 2018). The UK National Flood Resilience Review and the UK Climate Change Risk Assessment (both in 2016) called for improved resilience to failures of interdependent critical networks (e.g. electricity, information technology, and transport) in high-risk locations. Taken together, there is a need to deepen understanding of the interrelationships between hazards and natural/human-induced environmental change. For instance, Gill & Malamud (2017) systematically identify 18 anthropogenic process types with 64 interactions that could potentially trigger/influence hazards. Hydrological examples include vegetation removal or road construction increasing the susceptibility of slopes to landslides after heavy rain, and groundwater abstraction leading to more depressed river flows during a drought. Their matrices show the value of integrating anthropogenic processes within multi-hazard frameworks for more holistic, location-specific screening of factors and, thereby, improved disaster risk reduction.
As recognized by a 2019 call by UK Research and Innovation, there are many open questions around (1) the underlying drivers of compound hazards, (2) mechanisms of the cascade (or ‘risk contagion’) between drivers and receptors, as well as (3) options for improving resilience and managing multi-hazards. More research needs to be directed at low- and middle-income countries where there are relatively high systemic risks, and socio-economic vulnerabilities to compound hazards. To support national assessments, new spatially consistent analytical frameworks will be required to account for variations in hydroclimatic modelling uncertainty (e.g. Visser-Quinn et al. 2019). Ultimately, such developments should lead to improved hazard forecasting, civil contingency planning, and avoided damages.
ADJUSTING ENGINEERING STANDARDS
Climate change is expected to intensify extreme rainfall and raise global mean sea level so, without adaptation, damages from fluvial, pluvial, and coastal flooding are expected to rise (Hirabayashi et al. 2013; Hallegatte et al. 2013). There are many enabling measures for managing these risks, such as forecasting, contingency planning for disasters, insurance, and land use zoning to reduce exposure. Site-specific interventions include new flood defence assets, upgrading resistance and resilience of existing infrastructure, modifying operating rules of flood control reservoirs, retreating from hazardous areas, periodic review, and adaptive management (Hallegatte 2009; Wilby & Keenan 2012). Perhaps the most technically demanding option surrounds the adjustment of engineering standards to reflect evolving and projected hydrological conditions. This is especially contentious because of the methods of economic discounting applied to costs and benefits, as well as the low confidence in regional climate projections over the design lifetimes of new infrastructure (Kundzewicz & Stakhiv 2010). New ways of working with non-stationary information must also be deployed (Serinaldi & Kilsby 2015).
Nonetheless, a few agencies are already providing look-up tables and guidance for engineers that reflect climate change (e.g. New York City Panel on Climate Change 2013; United States Army Corps of Engineers 2013; Netherlands Ministry of Infrastructure and Environment 2014; Asian Development Bank 2018; International Hydropower Association 2019). Some refer to ‘adjustments’, others to ‘allowances,’ or ‘flood-risk reduction standards’. Some guides are intended to shape asset design, others for sensitivity (or ‘stress’) testing the performance of options. Latest advice on adapting to climate change in England provides tables of upper, central, and lower allowances for extreme rainfall intensity, peak river flow, and relative mean sea level, by region and period (2020s, 2050s, and 2080s) (Environment Agency 2016). Others advocate climate change allowances based on catchment type (Broderick et al. 2019) or precipitation mechanism/duration (Fowler & Wilby 2010). Ideally, global standards would emerge such that new infrastructure is built using consistent methodologies while respecting the deep uncertainties and regional variations in hydroclimatic and geotechnical risks. Even then, there may be a reluctance to adopt guidance depending on local appetite for risk and/or availability of resources.
However, translation of climate model information into engineering standards is a non-trivial matter and can introduce considerable methodological uncertainty. First, judgements must be made about which emission(s) scenario, climate model ensemble(s) and part(s) of the ensemble range should be used. This depends on how precautionary the design must be, which eventually affects the cost of a project. Second, design variables must be extracted from climate model archives and then post-processed to give the severity, duration, and return period of the design index, as mandated by national design standards for specified structures. This involves decisions about whether to apply bias corrections to the climate model information to better match local measurements, as well as choices about the extreme value distribution. More elaborate analysis is required to derive sub-daily statistics from daily climate model output (e.g. Herath et al. 2016). Third, the resulting ‘change factors’ have to be expressed with reference to an agreed baseline, then aggregated spatially and rounded mathematically. This is necessary to avoid the impression of undue precision. Finally, guidance material and worked examples must be developed to help practitioners apply the tables correctly and consistently, while allowing audit by competent authorities. Ideally, the whole framework of activities would be transparent and subject to periodic review, in line with evolving knowledge.
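A minimal sketch of the change-factor step is given below, assuming synthetic annual maxima for the climate-model baseline, the climate-model future, and local observations; the GEV fit, the 5% rounding increment, and the return period are illustrative choices rather than any agency's mandated procedure.

```python
# Minimal sketch of the change-factor step described above: a percentage uplift
# is derived from baseline and future climate-model design rainfall, rounded to
# avoid spurious precision, and applied to the locally observed design value.
import numpy as np
from scipy.stats import genextreme

def design_rainfall(annual_maxima, return_period):
    """Fit a GEV to annual maxima and return the T-year design rainfall."""
    c, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.ppf(1 - 1 / return_period, c, loc=loc, scale=scale)

rng = np.random.default_rng(9)
baseline_model = rng.gumbel(50, 12, 30)   # synthetic model baseline annual maxima (mm)
future_model = rng.gumbel(58, 14, 30)     # synthetic model future annual maxima (mm)
observed = rng.gumbel(48, 11, 50)         # synthetic local observations (mm)

T = 100
uplift = design_rainfall(future_model, T) / design_rainfall(baseline_model, T) - 1
uplift = round(uplift * 100 / 5) * 5 / 100   # round to nearest 5% to avoid false precision

adjusted_design = design_rainfall(observed, T) * (1 + uplift)
print(f"Change factor: {uplift:+.0%}; adjusted 1-in-{T}-year design rainfall: {adjusted_design:.0f} mm")
```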
The above description is an abridged version of a typical protocol for adjusting individual project designs. Ideally, these elements are arranged within an adaptive management framework that comprises a mix of hard, soft, and environmental measures. Regional- to national-scale programmes – such as the Thames Estuary 2100 (Ranger et al. 2013) and the Netherlands Delta Commission (Katsman et al. 2011) – involve lengthy, deliberative processes that engage with multiple stakeholder interests. Other examples of adaptive management pathways, include sequencing of public water supply augmentation options in Adelaide (Beh et al. 2015) and long-term water management in the Rhine Delta (Haasnoot et al. 2013). Although this strategy is appealing, highly contested socio-ecological trade-offs can be encountered, such as how to reallocate scarce water from irrigators to environmental flows. This is because the required amount of water to achieve an agreed benefit is uncertain even for individual species under present climate conditions, let alone for an entire freshwater ecosystem under climate change (Gell et al. 2019). A yet greater challenge is how to implement allowances for climate change where there are limited baseline data, low technical capacities, and/or weak governance.
WORKING TOGETHER
What we study is important, but so too are how, where, and with whom we research. This penultimate section offers a few suggestions about ways of working together that strengthen the multidisciplinary integration and impact of hydrological research.
To begin with, it is helpful to examine motivations. As elaborated before, there are many grand challenges, so every hydrologist has a stake in what happens next. However, there is a need for circumspection and realism about what individuals can contribute to solving significant global issues. Adams et al. (2015: 52) share helpful advice about ethical ways of working that include principles of integrity, transparency, humility, and collaboration. Their definition of humility is modified slightly to presenting [ourselves] as no more or less than [we are], not promising more than can be delivered, nor obscuring an underlying reality of uncertainty. Jennings et al. (2009) similarly provide views on the moral values and obligations set before everyone working in water resource management. Their ethical code stems from a sense of shared purpose, of working with nature, and of balance between traditional and new technologies. Their motivations are driven by care for the security, safety, and shared interests of both people and the environment.
Calls for closer integration of hydrology with other disciplines (especially social sciences and ecology) have been made before (e.g. Hannah et al. 2007; Krause et al. 2011; Sivapalan et al. 2012; Baldassarre et al. 2013; Montanari et al. 2013; Bierkens 2015; Di Baldassarre et al. 2015). We work in a dynamic landscape of research questions that are tackled by a variety of disciplines, and in which scientific findings are forging new disciplinary configurations (Vugteveen et al. 2014). For example, some claim that the ecological degradation of urban streams can only be reversed by integrating ecological research with social, behavioural, and economic investigation (Walsh et al. 2005). Likewise, hydroclimatology is an emerging co-discipline that has brought valuable insights about the atmospheric drivers and land-surface conditions behind extreme hydrological events and their long-term behaviour at catchment to global scales (McGregor 2019). An alliance of hydroclimatology with hydrogeomorphology further offers a framework for interpreting links between modes of climate variability, river channel, and fluvial habitat changes (Slater et al. 2019). This land–water–eco-management nexus has stimulated much research on systems modelling and decision support. However, there are still opportunities for deeper integration of human–water dimensions in thematic areas such as sustainable development, hazards management, public–private ownership, and governance (Bakker 2010; Xu et al. 2018; Gell et al. 2019). In short, solution-orientated hydrology must treat human activities as endogenous to water system dynamics (Gober & Wheater 2015).
Multidisciplinary water research and knowledge exchange can take various forms, but genuinely co-productive research design is always a good starting point. In this way, the nascent research team negotiates a set of research questions that are meaningful to the practitioner and scientist alike – ideally, the communities served have a voice too. For instance, community champions can be invaluable sources of local knowledge for investigating flood impacts and coping strategies within low-income neighbourhoods (Gough et al. 2019). Early participation of such parties is essential for effective dissemination and uptake of research results as well. Some refer to this style of working as a bottom-up, resource-based vulnerability perspective (Pielke et al. 2012). The emphasis is very much on understanding contextual vulnerability and the quest for effective risk communication and low-cost prevention strategies. Conversely, top-down assessments typically involve macro-scale modelling of physical systems, considering multiple scenarios and some (but never all) dimensions of uncertainty. These are not mutually exclusive paradigms – the latter often frames the risk and international dimensions within which the former local solutions must ultimately reside (Conway et al. 2019).
Peer-to-peer collaborations can be highly productive in terms of two-way knowledge exchange and personal development. However, international partnerships face a range of barriers including narrow funding rules set by national agencies, inconsistent review processes, restricted access to facilities and/or data, intellectual property rights, and issues around cyber-security (Suresh 2012). Ideally, research council policies and incentives would promote mobility and interconnectedness of (early career) researchers across borders. The benefits could be wider participation in a global enterprise of hydrological problem-solving, regardless of uneven patterns of national science funding or research capacity.
Finally, there really is no substitute for fieldwork, whether the purpose is to better formulate a research question, to observe hydrological phenomena, or to collect new data. Unfortunately, in an age of open data, we can all become divorced from the processes involved in the gathering and scrutiny of primary information. As data sets are amalgamated, assimilated and/or post-processed, any errors and biases may become harder to detect (Wilby et al. 2017). Others are concerned about the opportunity costs of collecting data – when this time and resource could be devoted to analysing ‘free’ data and writing papers. The ‘global perspectives’ or ‘global relevance’ emphasized by some top journals may further disincentivise the pursuit of local field studies in hydrology. Hence, there may be a tension between the needs of the individual researcher and those of the wider community for new observations (Allen & Berghuijs 2018). Likewise, capacity development and knowledge exchange activities can be highly resource intensive for academic hydrologists, but such services are an essential part of building more informed communities of practice (Watts 2015). Without such attention, there is a danger that researchers in countries with less capacity will be ‘left behind’ as new global datasets, remote sensing products, and hyper-resolution models become increasingly mainstream (Conway, personal communication). In summary, developing research collaborations, gathering new data, and investing time in people are all ways in which we can contribute to the hydrological mission.
CONCLUSIONS
This Comment Paper calls for a hydrological research agenda that is focused on solving the mounting challenges of global water, food, and energy security. The priority areas inevitably reflect the views of the author, so some critical knowledge gaps are likely under-represented. For instance, more weight could be given to tackling the global pandemic of arsenic poisoning by naturally contaminated groundwater (Chakraborti et al. 2003); or to the manifold threats faced by freshwater biodiversity (Reid et al. 2019), especially in headwaters (Riley et al. 2018); or to the uncertain regional hydrological ramifications of deforestation/afforestation combined with CO2 fertilization (Prudhomme et al. 2014); or to stresses on transboundary water security (Siegfried et al. 2012), including the disruption of international flows of embodied water in commodities (Hunt et al. 2014). Some regard excessive nutrient enrichment and poor water quality as critical threats to ecosystem functioning (Woodward et al. 2012). Others are more intent on solving scientific problems in hydrology (e.g. Blöschl et al. 2019).
Additionally, there are the non-trivial technical challenges associated with the early detection and attribution of hydrological change. As well as reliable information about co-drivers of change, we also need conceptual and modelling frameworks that allow robust testing of multiple working hypotheses, such as the relative roles played by climate, land use, and water management (e.g. Merz et al. 2012; Harrigan et al. 2014). Even then, statistically significant hydrological trends may not be detectable for decades, or even centuries, to come in environments with marked variability yet relatively weak signals of change (Ziegler et al. 2005; Wilby 2006). How might this be achieved in places with little or no data? How much change is practically (as opposed to statistically) significant? How does the likelihood of detection and attribution vary with catchment characteristics and/or choice of hydrological index? How should we respond when a credible signal emerges? Such cross-cutting questions warrant thematic programmes of inquiry in their own right.
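As a simple illustration of the detection problem, the sketch below uses Monte Carlo simulation to estimate the probability that a Kendall's tau test against time (the core of the Mann-Kendall test) detects an assumed weak linear trend in a noisy annual series as record length grows; the trend magnitude, variability, and significance level are illustrative assumptions.

```python
# Minimal sketch: Monte Carlo estimate of the chance of detecting a weak linear
# trend in a noisy annual series using Kendall's tau against time, as a function
# of record length. Trend and variability are illustrative values.
import numpy as np
from scipy.stats import kendalltau

def detection_probability(n_years, trend_per_year=0.5, sigma=10.0, n_sim=500, alpha=0.05):
    rng = np.random.default_rng(13)
    t = np.arange(n_years)
    detected = 0
    for _ in range(n_sim):
        series = trend_per_year * t + rng.normal(0, sigma, n_years)
        _, p_value = kendalltau(t, series)
        detected += p_value < alpha
    return detected / n_sim

for n in (20, 40, 80):
    print(f"{n:3d} years: detection probability {detection_probability(n):.2f}")
```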
The research agenda offered here is a multidisciplinary endeavour involving the expertise of hydrologists, climatologists, social scientists, ecologists, and many others, alongside stakeholders (Wheater & Gober 2015). There are also synergies with the science challenges identified by related communities (e.g. BGS 2019). Six priority themes were proposed here: (1) filling data sparse places; (2) predicting peak water; (3) understanding the physical drivers of mega droughts; (4) evaluating hyper-resolution models; (5) managing compound hazards; and (6) adjusting engineering standards (under non-stationary conditions). These will require the expansion of skills in hydro-informatics, data recovery, and visualization, while reinvigorating field observation, focusing on extreme hydrological events or hostile/remote locations.
We have unprecedented amounts of information and computing power at our disposal (Chen & Han 2016; McCabe et al. 2017). The Internet of Things offers scope for improved monitoring and modelling with adaptive management of water resources and hydrological hazards (e.g. Qiuming et al. 2012; Fang et al. 2015; Zhang et al. 2018). Such tools present exciting opportunities for fusing data streams from different sources and mining content for predictability. Yet, even in an era of ‘big data’, there are sensitive locations that remain data sparse. Many of these places are experiencing some of the most rapid hydrological change and/or are home to some of the most vulnerable people – priority environments are the global water towers (Immerzeel et al. 2010) and low-income communities of the global south (Douglas et al. 2008). Neither should we forget the deep uncertainties affecting our raw input data and hydrological models (Wilby et al. 2017; Ekström et al. 2018; Beven 2019a), nor the human dimensions to water system changes. With these elements in place, we will be better equipped to meet the unprecedented hydrological challenges of coming decades.
ACKNOWLEDGEMENTS
This commentary is dedicated to the memory of Geoff Petts. In our last correspondence, he invited me to share thoughts about directions for research in hydrology that address key global challenges. Hopefully, Geoff's care and foresight have been honoured. I am also indebted to Keith Beven, Declan Conway, Hayley Fowler, Harriet Orr, Murray Peel, and Howard Wheater for all their generous feedback on earlier drafts of this paper.