Abstract. Deep learning (DL) rainfall–runoff models outperform conceptual, process-based models in a range of applications. However, it remains unclear whether DL models can produce physically plausible projections of streamflow under climate change. We investigate this question through a sensitivity analysis of modeled responses to increases in temperature and potential evapotranspiration (PET), with other meteorological variables left unchanged. Previous research has shown that temperature-based PET methods overestimate evaporative water loss under warming compared with energy budget-based PET methods. We therefore assume that reliable streamflow responses to warming should exhibit less evaporative water loss when forced with smaller, energy-budget-based PET compared with temperature-based PET. We conduct this assessment using three conceptual, process-based rainfall–runoff models and three DL models, trained and tested across 212 watersheds in the Great Lakes basin. The DL models include a Long Short-Term Memory network (LSTM), a mass-conserving LSTM (MC-LSTM), and a novel variant of the MC-LSTM that also respects the relationship between PET and evaporative water loss (MC-LSTM-PET). After validating models against historical streamflow and actual evapotranspiration, we force all models with scenarios of warming, historical precipitation, and both temperature-based (Hamon) and energy-budget-based (Priestley–Taylor) PET, and compare their responses in long-term mean daily flow, low flows, high flows, and seasonal streamflow timing. We also explore similar responses using a national LSTM fit to 531 watersheds across the United States to assess how the inclusion of a larger and more diverse set of basins influences signals of hydrological response under warming. The main results of this study are as follows: The three Great Lakes DL models substantially outperform all process-based models in streamflow estimation. 
The MC-LSTM-PET also matches the best process-based models and outperforms the MC-LSTM in estimating actual evapotranspiration. All process-based models show a downward shift in long-term mean daily flows under warming, but median shifts are considerably larger under temperature-based PET (−17 % to −25 %) than energy-budget-based PET (−6 % to −9 %). The MC-LSTM-PET model exhibits similar differences in water loss across the different PET forcings. Conversely, the LSTM exhibits unrealistically large water losses under warming using Priestley–Taylor PET (−20 %), while the MC-LSTM is relatively insensitive to the PET method. DL models exhibit smaller changes in high flows and seasonal timing of flows as compared with the process-based models, while DL estimates of low flows are within the range estimated by the process-based models. Like the Great Lakes LSTM, the national LSTM also shows unrealistically large water losses under warming (−25 %), but it is more stable when many inputs are changed under warming and better aligns with process-based model responses for seasonal timing of flows. Ultimately, the results of this sensitivity analysis suggest that physical considerations regarding model architecture and input variables may be necessary to promote the physical realism of deep-learning-based hydrological projections under climate change.
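The contrast between the two PET formulations named above can be sketched in a few lines. This is an illustrative sketch only: the coefficients below (Hamon's 29.8, Priestley–Taylor α = 1.26, γ = 0.066 kPa °C⁻¹, λ = 2.45 MJ kg⁻¹) are common textbook values and not necessarily the exact calibrations used in the study.

```python
import math

def sat_vp(t_c):
    """Saturation vapor pressure (kPa) at air temperature t_c (deg C)."""
    return 0.611 * math.exp(17.27 * t_c / (t_c + 237.3))

def pet_hamon(t_c, daylight_hr):
    """Hamon PET (mm/day): temperature-based.
    The coefficient 29.8 follows one widely used form of Hamon's method."""
    return 29.8 * daylight_hr * sat_vp(t_c) / (t_c + 273.2)

def pet_priestley_taylor(t_c, net_rad_mj, alpha=1.26, gamma=0.066, lam=2.45):
    """Priestley-Taylor PET (mm/day): energy-budget-based.
    net_rad_mj: net radiation (MJ m-2 day-1); ground heat flux neglected.
    delta: slope of the saturation vapor pressure curve (kPa per deg C)."""
    delta = 4098.0 * sat_vp(t_c) / (t_c + 237.3) ** 2
    return alpha * (delta / (delta + gamma)) * net_rad_mj / lam
```

Holding net radiation fixed, a +4 °C warming (15 °C to 19 °C at 12 daylight hours) raises Hamon PET by roughly 27 % in this sketch but Priestley–Taylor PET by only about 8 %, which illustrates why temperature-based PET methods imply larger evaporative water losses under warming.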
Assessing the Physical Realism of Deep Learning Hydrologic Model Projections Under Climate Change
Abstract This study examines whether deep learning models can produce reliable future projections of streamflow under warming. We train a regional long short‐term memory network (LSTM) on daily streamflow in 15 watersheds in California and develop three process models (HYMOD, SAC‐SMA, and VIC) as benchmarks. We force all models with scenarios of warming and assess their hydrologic response, including shifts in the hydrograph and total runoff ratio. All process models show a shift to more winter runoff, reduced summer runoff, and a decline in the runoff ratio due to increased evapotranspiration. The LSTM predicts similar hydrograph shifts but in some watersheds predicts an unrealistic increase in the runoff ratio. We then test two alternative versions of the LSTM in which process model outputs are used as either additional training targets (i.e., a multi‐output LSTM) or input features. Results indicate that the multi‐output LSTM does not correct the unrealistic streamflow projections under warming. The hybrid LSTM using estimates of evapotranspiration from SAC‐SMA as an additional input feature produces more realistic streamflow projections, but this does not hold for VIC or HYMOD, suggesting that the hybrid method depends on the fidelity of the process model. Finally, we test climate change responses with an LSTM trained on over 500 watersheds across the United States and find more realistic streamflow projections under warming. Ultimately, this work suggests that hybrid modeling may support the use of LSTMs for hydrologic projections under climate change, but so may training LSTMs on a large, diverse set of watersheds.
- Award ID(s): 2040613
- PAR ID: 10470165
- Publisher / Repository: Water Resources Research
- Date Published:
- Journal Name: Water Resources Research
- Volume: 58
- Issue: 9
- ISSN: 0043-1397
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract. As a genre of physics-informed machine learning, differentiable process-based hydrologic models (abbreviated as δ or delta models) with regionalized deep-network-based parameterization pipelines were recently shown to provide daily streamflow prediction performance closely approaching that of state-of-the-art long short-term memory (LSTM) deep networks. Meanwhile, δ models provide a full suite of diagnostic physical variables and guaranteed mass conservation. Here, we ran experiments to test (1) their ability to extrapolate to regions far from streamflow gauges and (2) their ability to make credible predictions of long-term (decadal-scale) change trends. We evaluated the models based on daily hydrograph metrics (Nash–Sutcliffe model efficiency coefficient, etc.) and predicted decadal streamflow trends. For prediction in ungauged basins (PUB; randomly sampled ungauged basins representing spatial interpolation), δ models either approached or surpassed the performance of LSTM in daily hydrograph metrics, depending on the meteorological forcing data used. They presented a comparable trend performance to LSTM for annual mean flow and high flow but worse trends for low flow. For prediction in ungauged regions (PUR; regional holdout test representing spatial extrapolation in a highly data-sparse scenario), δ models surpassed LSTM in daily hydrograph metrics, and their advantages in mean and high flow trends became prominent. In addition, an untrained variable, evapotranspiration, retained good seasonality even for extrapolated cases. The δ models' deep-network-based parameterization pipeline produced parameter fields that maintain remarkably stable spatial patterns even in highly data-scarce scenarios, which explains their robustness. Combined with their interpretability and ability to assimilate multi-source observations, the δ models are strong candidates for regional and global-scale hydrologic simulations and climate change impact assessment.
Abstract Predictions of hydrologic variables across the entire water cycle have significant value for water resources management as well as downstream applications such as ecosystem and water quality modeling. Recently, purely data‐driven deep learning models like long short‐term memory (LSTM) showed seemingly insurmountable performance in modeling rainfall runoff and other geoscientific variables, yet they cannot predict untrained physical variables and remain challenging to interpret. Here, we show that differentiable, learnable, process‐based models (called δ models here) can approach the performance level of LSTM for the intensively observed variable (streamflow) with regionalized parameterization. We use a simple hydrologic model HBV as the backbone and use embedded neural networks, which can only be trained in a differentiable programming framework, to parameterize, enhance, or replace the process‐based model's modules. Without using an ensemble or post‐processor, δ models can obtain a median Nash‐Sutcliffe efficiency of 0.732 for 671 basins across the USA for the Daymet forcing data set, compared to 0.748 from a state‐of‐the‐art LSTM model with the same setup. For another forcing data set, the difference is even smaller: 0.715 versus 0.722. Meanwhile, the resulting learnable process‐based models can output a full set of untrained variables, for example, soil and groundwater storage, snowpack, evapotranspiration, and baseflow, and can later be constrained by their observations. Both simulated evapotranspiration and fraction of discharge from baseflow agreed decently with alternative estimates. The general framework can work with models with various process complexity and opens up the path for learning physics from big data.
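As a deliberately minimal illustration of the kind of process module such δ models differentiate through, here is an HBV-style soil-moisture update in plain Python. The parameter values (fc, beta, lp) are illustrative, not calibrated; in a δ model, parameters like fc and beta would be produced by an embedded neural network and the step would be written in a differentiable framework such as PyTorch.

```python
def hbv_soil_step(sm, precip, pet, fc=250.0, beta=2.0, lp=0.7):
    """One daily step of an HBV-style soil-moisture routine (a sketch).
    sm: soil moisture (mm); precip, pet: daily totals (mm).
    recharge: the fraction (sm/fc)**beta of precip bypasses the soil store.
    aet: actual ET scales linearly with soil moisture up to lp*fc."""
    recharge = precip * (sm / fc) ** beta
    sm = sm + precip - recharge
    aet = pet * min(1.0, sm / (lp * fc))
    sm = max(sm - aet, 0.0)
    return sm, recharge, aet
```

Because every operation here is smooth (aside from the min/max clamps, which autograd frameworks handle), gradients of simulated streamflow with respect to fc and beta can flow back into the parameterization network during training.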
Increases in evapotranspiration (ET) from global warming are decreasing streamflow in headwater basins worldwide. However, these streamflow losses do not occur uniformly due to complex topography. To better understand the heterogeneity of streamflow loss, we use the Budyko shape parameter (ω) as a diagnostic tool. We fit ω to 37 years of hydrologic simulation output in the Upper Colorado River Basin (UCRB), an important headwater basin in the US. We split the UCRB into two categories: peak watersheds with high elevation and steep slopes, and valley watersheds with lower elevation and gradual slopes. Our results demonstrate a relationship between streamflow loss and ω. The valley watersheds with greater streamflow loss have ω higher than 3.1, while the peak watersheds with less streamflow loss have an average ω of 1.3. This work highlights the use of ω as an indicator of streamflow loss and could be generalized to other headwater basin systems.
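The role of ω can be made concrete with Fu's parameterization of the Budyko curve, the standard form in which a shape parameter ω appears; this is a sketch and may differ in detail from the exact formulation used in the study above.

```python
def runoff_ratio_fu(aridity, omega):
    """Long-term runoff ratio Q/P from Fu's form of the Budyko curve.
    Fu's equation: E/P = 1 + PET/P - (1 + (PET/P)**omega)**(1/omega),
    so by water balance Q/P = (1 + (PET/P)**omega)**(1/omega) - PET/P.
    aridity: the dimensionless aridity index PET/P; omega > 1."""
    return (1.0 + aridity ** omega) ** (1.0 / omega) - aridity
```

At an aridity index PET/P = 1, ω = 1.3 gives Q/P ≈ 0.70 while ω = 3.1 gives Q/P ≈ 0.25, reproducing the pattern described above: higher-ω valley watersheds partition more water to evapotranspiration and so lose more streamflow.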
Abstract Predicting future streamflow change is essential for water resources management and understanding the impacts of projected climate and land use changes on water availability. The Budyko framework is a useful and computationally efficient tool to model streamflow at larger spatial scales. This study predicts future streamflow changes in 889 watersheds in the contiguous United States based on projected climate and land use changes from 2040 to 2069. The temporal variability of surface water balance controls, represented by the Budyko ω parameter, was modeled using multiple linear regression, random forest (RF), and gradient boosting. Results show that RF is the optimal model and can explain >85% of the variance in most watersheds. Relative cumulative moisture surplus, forest coverage, crop land, and urban land are the most important variables of the time‐varying ω in most watersheds. There are statistically significant increases in mean annual precipitation, potential evapotranspiration, and ω in 2040–2069, as compared to 1950–2005. This leads to a statistically significant decrease in the runoff ratio (Q/P). Streamflow is projected to decrease in the central, southwestern, and southeastern United States and increase in the northeast. These projections of water availability, which are based on future climate and land use change scenarios, can inform water resources management and adaptation strategies.

