Streamflow forecasting in snowmelt-dominated basins is essential for water resource planning, flood mitigation, and ecological sustainability. This study presents a comparative evaluation of statistical, machine learning (Random Forest), and deep learning models (Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Spatio-Temporal Graph Neural Network (STGNN)) using 30 years of data from 20 monitoring stations across the Upper Colorado River Basin (UCRB). We assess the impact on predictive performance of integrating meteorological variables, particularly snow water equivalent (SWE), and spatial dependencies. Among all models, the STGNN achieved the highest accuracy, with a Nash–Sutcliffe Efficiency (NSE) of 0.84 and a Kling–Gupta Efficiency (KGE) of 0.84 in the multivariate setting at the critical downstream node, Lees Ferry. Compared to the univariate setup, SWE-enhanced predictions reduced Root Mean Square Error (RMSE) by 12.8%. Seasonal and spatial analyses showed the greatest improvements at high-elevation and mid-network stations, where snowmelt dynamics dominate runoff. These findings demonstrate that spatio-temporal learning frameworks, especially STGNNs, provide a scalable and physically consistent approach to streamflow forecasting under variable climatic conditions.
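The NSE and KGE scores reported above follow standard definitions; a minimal sketch of both metrics (the KGE in its common 2009 form, with correlation, variability-ratio, and bias-ratio components) might look like this:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 minus the Euclidean distance of
    (r, alpha, beta) from the ideal point (1, 1, 1)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]   # linear correlation
    alpha = sim.std() / obs.std()     # variability ratio
    beta = sim.mean() / obs.mean()    # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

Both metrics equal 1.0 for a perfect simulation, and an NSE of 0.0 corresponds to simply predicting the long-term observed mean.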
Spatio-Temporal Graph Neural Networks for Streamflow Prediction in the Upper Colorado Basin
Streamflow prediction is vital for effective water resource management, enabling a better understanding of hydrological variability and its response to environmental factors. This study presents a spatio-temporal graph neural network (STGNN) model for streamflow prediction in the Upper Colorado River Basin (UCRB), integrating graph convolutional networks (GCNs) to model spatial connectivity and long short-term memory (LSTM) networks to capture temporal dynamics. Using 30 years of monthly streamflow data from 20 monitoring stations, the STGNN predicted streamflow over a 36-month horizon and was evaluated against traditional models, including random forest regression (RFR), LSTM, gated recurrent units (GRU), and seasonal auto-regressive integrated moving average (SARIMA). The STGNN outperformed these models across multiple metrics, achieving an R2 of 0.78, an RMSE of 0.81 mm/month, and a KGE of 0.79 at critical locations like Lees Ferry. A sequential analysis of input–output configurations identified the (36, 36) setup as optimal for balancing historical context and forecasting accuracy. Additionally, the STGNN showed strong generalizability when applied to other locations within the UCRB. These results underscore the importance of integrating spatial dependencies and temporal dynamics in hydrological forecasting, offering a scalable and adaptable framework to improve predictive accuracy and support adaptive water resource management in river basins.
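The (36, 36) input–output configuration described above can be illustrated with a simple windowing helper; this is a sketch (the function name and shapes are mine, not from the paper) showing how 30 years of monthly records yield training pairs of 36 months of history mapped to the following 36-month horizon:

```python
import numpy as np

def make_windows(series, n_in=36, n_out=36):
    """Slice a monthly series into (input, target) pairs: each sample
    pairs n_in months of history with the subsequent n_out months."""
    X, Y = [], []
    for start in range(len(series) - n_in - n_out + 1):
        X.append(series[start:start + n_in])
        Y.append(series[start + n_in:start + n_in + n_out])
    return np.array(X), np.array(Y)

# 30 years of monthly data = 360 time steps
series = np.arange(360.0)
X, Y = make_windows(series)
print(X.shape, Y.shape)  # (289, 36) (289, 36)
```

In the STGNN itself, each input window would additionally carry one series per monitoring station, with the GCN layers mixing information across the 20-node river-network graph before the LSTM processes the 36-step sequence.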
- PAR ID: 10673922
- Publisher / Repository: MDPI
- Date Published:
- Journal Name: Hydrology
- Volume: 12
- Issue: 3
- ISSN: 2306-5338
- Page Range / eLocation ID: 60
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Streamflow prediction plays a vital role in water resources planning, helping to characterize dramatic changes in climatic and hydrologic variables over different time scales. In this study, we used machine learning (ML)-based prediction models, including Random Forest Regression (RFR), Long Short-Term Memory (LSTM), Seasonal Auto-Regressive Integrated Moving Average (SARIMA), and Facebook Prophet (PROPHET), to predict natural streamflow 24 months ahead at the Lees Ferry site, located at the downstream end of the Upper Colorado River Basin (UCRB) of the US. First, we used only historical streamflow data to predict 24 months ahead. Second, we added meteorological components, namely temperature and precipitation, as additional features. We tested the models on a monthly test dataset spanning 6 years, where 24-month predictions were repeated 50 times to ensure the consistency of the results. Moreover, we performed a sensitivity analysis to identify our best-performing model. We then analyzed the effect of different span window sizes on the quality of predictions made by that model. Finally, we applied our best-performing model, RFR, to two more rivers in different states within the UCRB to test its generalizability. We evaluated the performance of the predictive models using multiple evaluation measures. The multivariate time-series models produced more accurate predictions, with RMSE less than 0.84 mm per month, R-squared more than 0.8, and MAPE less than 0.25. We therefore conclude that including the temperature and precipitation of the UCRB increases the accuracy of the predictions. Ultimately, we found that multivariate RFR performs best among the four models and generalizes to other rivers in the UCRB.
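The multivariate setup described here, lagged streamflow plus temperature and precipitation predicting flow 24 months ahead, can be sketched as a feature-construction step for a regressor such as RFR. The lag count, horizon, and function name below are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def lagged_features(flow, temp, precip, n_lags=12, horizon=24):
    """Stack lagged streamflow, temperature, and precipitation into one
    feature matrix; each target is the flow `horizon` months ahead of
    the end of the lag window."""
    n = len(flow) - n_lags - horizon + 1
    X = np.empty((n, 3 * n_lags))
    y = np.empty(n)
    for i in range(n):
        X[i, :n_lags] = flow[i:i + n_lags]            # flow history
        X[i, n_lags:2 * n_lags] = temp[i:i + n_lags]  # temperature history
        X[i, 2 * n_lags:] = precip[i:i + n_lags]      # precipitation history
        y[i] = flow[i + n_lags + horizon - 1]         # 24 months ahead
    return X, y
```

The resulting (X, y) pairs can be fed to any off-the-shelf regressor; the univariate baseline corresponds to dropping the temperature and precipitation columns.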
This paper advances machine learning (ML)-based streamflow prediction by strategically selecting rainfall events, introducing a new loss function, and addressing rainfall forecast uncertainties. Focusing on the Iowa River Basin, we applied the stochastic storm transposition (SST) method to create realistic rainfall events, which were input into a hydrological model to generate corresponding streamflow data for training and testing deterministic and probabilistic ML models. Long short-term memory (LSTM) networks were employed to predict streamflow up to 12 h ahead. An active learning approach was used to identify the most informative rainfall events, reducing the data generation effort. Additionally, we introduced a novel asymmetric peak loss function to improve peak streamflow prediction accuracy. By incorporating rainfall forecast uncertainties, our probabilistic LSTM model provided uncertainty quantification for streamflow predictions. Performance evaluation using multiple metrics confirmed the accuracy and reliability of our models. These contributions enhance flood forecasting and decision-making while significantly reducing computational time and costs.
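The abstract does not give the exact form of the asymmetric peak loss, but the general idea, penalizing under-prediction more than over-prediction so that flood peaks are not smoothed away, can be sketched as follows (the functional form and the `under_weight` parameter are assumptions for illustration):

```python
import numpy as np

def asymmetric_peak_loss(obs, sim, under_weight=3.0):
    """Illustrative asymmetric loss: squared errors, with under-predictions
    (sim < obs) weighted more heavily so peak flows are not underestimated.
    `under_weight` is a hypothetical tuning parameter, not from the paper."""
    err = np.asarray(sim, float) - np.asarray(obs, float)
    w = np.where(err < 0, under_weight, 1.0)  # heavier penalty below obs
    return np.mean(w * err ** 2)
```

With `under_weight > 1`, a model trained against this loss is pushed toward matching or slightly exceeding observed peaks rather than systematically undershooting them.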
Karst aquifers are important groundwater resources that supply drinking water for approximately 25% of the world's population. Their complex hydrogeological structures, dual-flow regimes, and highly heterogeneous flow pose significant challenges for accurate hydrodynamic modeling and sustainable management. Traditional modeling approaches often struggle to capture the intricate spatial dependencies and multi-scale temporal patterns inherent in karst systems, particularly the interactions between rapid conduit flow and slower matrix flow. This study proposes a novel multi-scale dynamic graph attention network integrated with a long short-term memory model (GAT-LSTM) to learn and integrate spatial and temporal dependencies in karst systems for forecasting spring discharge. The model introduces several innovative components: (1) graph-based neural networks with a dynamic edge-weighting mechanism learn and update spatial dependencies based on both geographic distances and learned hydrological relationships, (2) a multi-head attention mechanism captures different aspects of spatial relationships simultaneously, and (3) a hierarchical temporal architecture processes hydrological temporal patterns at both monthly and seasonal scales, with an adaptive fusion mechanism producing the final results. These features enable the proposed model to effectively account for the dual-flow dynamics in karst systems, where rapid conduit flow and slower matrix flow coexist. The newly proposed model is applied to Barton Springs of the Edwards Aquifer in Texas. The results demonstrate that it achieves more accurate and robust prediction performance across various time steps compared to traditional temporal and spatial deep learning approaches. Based on the multi-scale GAT-LSTM model, a comprehensive ablation analysis and a permutation feature importance analysis are conducted to assess the relative contribution of each input variable to the final prediction. These findings highlight the intricate nature of karst systems and demonstrate that effective spring discharge prediction requires comprehensive monitoring networks encompassing both primary recharge contributors and supplementary hydrological features that may serve as valuable indicators of system-wide conditions.
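Permutation feature importance, as used in the analysis above, is a standard model-agnostic technique: shuffle one feature column at a time and measure how much the model's error grows. A minimal numpy sketch (the `predict` callable and MSE scoring are illustrative choices, not the study's exact protocol):

```python
import numpy as np

def permutation_importance(predict, X, y, rng=None):
    """Increase in MSE when each feature column is shuffled, which breaks
    that feature's relationship with the target. Larger = more important."""
    rng = np.random.default_rng(rng)
    base = np.mean((predict(X) - y) ** 2)  # baseline error, nothing shuffled
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        scores.append(np.mean((predict(Xp) - y) ** 2) - base)
    return np.array(scores)
```

For example, with a toy model that uses only the first of two features, shuffling the first column degrades the error while shuffling the second changes nothing, so the importance scores rank the features correctly.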
Demeniconi, Carlotta; Davidson, Ian (Eds.) This paper proposes a physics-guided machine learning approach that combines machine learning models and physics-based models to improve the prediction of water flow and temperature in river networks. We first build a recurrent graph network model to capture the interactions among multiple segments in the river network. Then we transfer knowledge from physics-based models to guide the learning of the machine learning model. We also propose a new loss function that balances performance over different river segments. We demonstrate the effectiveness of the proposed method in predicting temperature and streamflow in a subset of the Delaware River Basin. In particular, the proposed method brought a 33%/14% accuracy improvement over the state-of-the-art physics-based model and 24%/14% over traditional machine learning models (e.g., LSTM) in temperature/streamflow prediction using very sparse (0.1%) training data. The proposed method also produced better performance when generalized to different seasons or river segments with different streamflow ranges.
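The abstract does not specify the form of the segment-balancing loss; one plausible illustrative reading is to average the error within each river segment first and then across segments, so that segments with many observations (or large flows) do not dominate training:

```python
import numpy as np

def segment_balanced_mse(obs, sim, segment_ids):
    """Illustrative balanced loss: compute the MSE separately per river
    segment, then average across segments so each segment counts equally.
    This is one possible reading, not the paper's exact formulation."""
    obs, sim, seg = (np.asarray(a, float) for a in (obs, sim, segment_ids))
    return float(np.mean([np.mean((obs[seg == s] - sim[seg == s]) ** 2)
                          for s in np.unique(seg)]))
```

Compared with a plain MSE over all observations pooled together, this weighting raises the influence of small or sparsely observed segments on the gradient.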