Abstract Near‐term, iterative ecological forecasts can be used to help understand and proactively manage ecosystems. To date, more forecasts have been developed for aquatic ecosystems than for other ecosystems worldwide, likely motivated by the pressing need to conserve these essential and threatened ecosystems and by the increasing availability of high‐frequency data. Forecasters have implemented many different modeling approaches to forecast freshwater variables, which have demonstrated promise at individual sites. However, a comprehensive analysis of the performance of varying forecast models across multiple sites is needed to understand broader controls on forecast performance. Forecasting challenges (i.e., community‐scale efforts to generate forecasts while also developing shared software, training materials, and best practices) present a useful platform for bridging this gap to evaluate how a range of modeling methods perform across axes of space, time, and ecological systems. Here, we analyzed forecasts from the aquatics theme of the National Ecological Observatory Network (NEON) Forecasting Challenge hosted by the Ecological Forecasting Initiative. Over 100,000 probabilistic forecasts of water temperature and dissolved oxygen concentration for 1–30 days ahead across seven NEON‐monitored lakes were submitted in 2023. We assessed how forecast performance varied among models with different structures, covariates, and sources of uncertainty relative to baseline null models. A similar proportion of forecast models were skillful for both variables (34%–40%), although more individual models outperformed the baseline models in forecasting water temperature (10 of 29 models) than dissolved oxygen (6 of 15 models). These top-performing models came from a range of classes and structures.
For water temperature, we found that forecast skill degraded as forecast horizons lengthened; that process‐based models and models that included air temperature as a covariate generally exhibited the highest forecast performance; and that the most skillful forecasts often accounted for more sources of uncertainty than the lower-performing models. The most skillful forecasts were for sites where observations diverged most from historical conditions (resulting in poor baseline model performance). Overall, the NEON Forecasting Challenge provides an exciting opportunity for model intercomparison to learn about the relative strengths of a diverse suite of models and to advance our understanding of freshwater ecosystem predictability.
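The abstract scores probabilistic forecasts against baseline null models. A common way to do this (the specific metric is an assumption here, not stated in the abstract) is the continuous ranked probability score (CRPS) of an ensemble forecast, with skill expressed relative to a climatology-style baseline. A minimal sketch:

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS for one ensemble forecast against one observation.

    Uses the identity CRPS = E|X - y| - 0.5 * E|X - X'|, where X, X'
    are independent draws from the forecast ensemble. Lower is better.
    """
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

def skill(crps_model, crps_baseline):
    """Relative skill: positive means the model beats the baseline."""
    return 1.0 - crps_model / crps_baseline

# Hypothetical example: a 30-member water-temperature forecast vs. a
# wide climatology-style null ensemble, scored on one observation.
rng = np.random.default_rng(0)
obs = 22.0
model = rng.normal(21.8, 0.5, size=30)   # sharp, nearly unbiased
null = rng.normal(20.0, 2.0, size=30)    # broad historical spread
print(skill(crps_ensemble(model, obs), crps_ensemble(null, obs)))
```

A "skillful" model in the abstract's sense is one whose score beats the null's, i.e., positive relative skill.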
This content will become publicly available on April 24, 2026
Evaluation of ETAS and STEP Forecasting Models for California Seismicity Using Point Process Residuals
ABSTRACT Variants of the Epidemic‐Type Aftershock Sequence (ETAS) and Short‐Term Earthquake Probabilities (STEP) models have been used for earthquake forecasting and are entered as forecast models in the purely prospective Collaboratory for the Study of Earthquake Predictability (CSEP) experiment. Previous analyses have suggested that the ETAS model offered the best forecast skill over the first several years of CSEP. Here, we evaluate the prospective forecasting ability of the ETAS and STEP one‐day forecast models for California from 2013 to 2017, using super‐thinned residuals and Voronoi residuals. We find that the two models perform very comparably, with STEP slightly outperforming ETAS according to most metrics.
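Super-thinned residuals work by thinning points where the modeled intensity exceeds a chosen rate k and superposing simulated points where it falls below k; if the model is correct, the residual process is homogeneous Poisson with rate k. A minimal temporal sketch (the paper's application is spatio-temporal; this one-dimensional version is an illustrative assumption):

```python
import numpy as np

def super_thin(times, lam, k, t_end, rng):
    """Super-thinned residuals for a temporal point process on [0, t_end].

    times : observed event times
    lam   : callable, modeled conditional intensity lambda(t) (vectorized)
    k     : target homogeneous rate of the residual process
    """
    times = np.asarray(times, dtype=float)
    # Thinning: keep each observed point with probability min(1, k/lambda).
    keep = rng.random(times.size) < np.minimum(1.0, k / lam(times))
    thinned = times[keep]
    # Superposition: simulate candidates at rate k, accept each with
    # probability max(0, 1 - lambda/k), giving extra points at rate k - lambda
    # wherever lambda(t) < k.
    n_cand = rng.poisson(k * t_end)
    cand = rng.uniform(0.0, t_end, size=n_cand)
    accept = rng.random(n_cand) < np.maximum(0.0, 1.0 - lam(cand) / k)
    return np.sort(np.concatenate([thinned, cand[accept]]))
```

Departures of the residuals from a rate-k Poisson process (tested, e.g., with a Kolmogorov–Smirnov statistic on inter-event times) then indicate model misfit.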
- Award ID(s): 2225216
- PAR ID: 10591933
- Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
- Date Published:
- Journal Name: Environmetrics
- Volume: 36
- Issue: 4
- ISSN: 1180-4009
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
The clustering of earthquake magnitudes is poorly understood compared to spatial and temporal clustering. Better understanding of correlations between earthquake magnitudes could provide insight into the mechanisms of earthquake rupture and fault interactions, and improve earthquake forecasting models. In this study we present a novel method of examining how seismic magnitude clustering occurs beyond the next event in the catalog and evolves with time and space between earthquake events. We first evaluate the clustering signature over time and space using double-difference located catalogs from Southern and Northern California. The strength of magnitude clustering appears to decay linearly with distance between events and logarithmically with time. The signature persists for longer distances (more than 50 km) and times (several days) than previously thought, indicating that magnitude clustering is not driven solely by repeated rupture of an identical fault patch or Omori aftershock processes. The decay patterns occur in all magnitude ranges of the catalog and are demonstrated across multiple methodologies of study. These patterns are also shown to be present in laboratory rock fracture catalogs but absent in ETAS synthetic catalogs. Incorporating magnitude clustering decay patterns into earthquake forecasting models such as ETAS could improve their accuracy.
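One simple way to quantify magnitude clustering (an illustrative statistic, not the paper's exact method) is to compare the mean absolute magnitude difference of spatiotemporally close event pairs against magnitude-shuffled versions of the same catalog; negative scores mean nearby events have more similar magnitudes than chance:

```python
import numpy as np

def clustering_signal(t, x, y, mag, dt_max, dr_max, rng, n_shuffle=100):
    """Z-like magnitude-clustering score for a catalog.

    Compares mean |dM| over event pairs within dt_max (time) and dr_max
    (distance) against n_shuffle magnitude-shuffled catalogs. Values
    well below 0 indicate magnitude clustering.
    """
    t, x, y, mag = map(np.asarray, (t, x, y, mag))
    i, j = np.triu_indices(t.size, k=1)
    near = (np.abs(t[i] - t[j]) < dt_max) & \
           (np.hypot(x[i] - x[j], y[i] - y[j]) < dr_max)
    i, j = i[near], j[near]
    obs = np.mean(np.abs(mag[i] - mag[j]))
    null = np.empty(n_shuffle)
    for s in range(n_shuffle):
        m = rng.permutation(mag)  # break any magnitude-location link
        null[s] = np.mean(np.abs(m[i] - m[j]))
    return (obs - null.mean()) / null.std()
```

Sweeping dt_max and dr_max over bins would then trace out the decay of the clustering signature with time and distance described in the abstract.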
Abstract Seismology is witnessing explosive growth in the diversity and scale of earthquake catalogs. A key motivation for this community effort is that more data should translate into better earthquake forecasts. Such improvements are yet to be seen. Here, we introduce the Recurrent Earthquake foreCAST (RECAST), a deep‐learning model based on recent developments in neural temporal point processes. The model enables access to a greater volume and diversity of earthquake observations, overcoming the theoretical and computational limitations of traditional approaches. We benchmark against a temporal Epidemic Type Aftershock Sequence model. Tests on synthetic data suggest that with a modest‐sized data set, RECAST accurately models earthquake‐like point processes directly from cataloged data. Tests on earthquake catalogs in Southern California indicate improved fit and forecast accuracy compared to our benchmark when the training set is sufficiently long (>10⁴ events). The basic components in RECAST add flexibility and scalability for earthquake forecasting without sacrificing performance.
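The benchmark mentioned here is the classic temporal ETAS model, whose conditional intensity is a background rate plus Omori-law contributions from past events, scaled by magnitude-dependent productivity. A minimal sketch (parameter values in the test are illustrative, not fitted):

```python
import numpy as np

def etas_intensity(t, times, mags, mu, K, alpha, c, p, m0):
    """Temporal ETAS conditional intensity at time t:

        lambda(t) = mu + sum_{t_i < t} K * exp(alpha * (m_i - m0))
                                        * (t - t_i + c)**(-p)

    times, mags : catalog event times and magnitudes
    mu          : background rate; m0 : magnitude of completeness
    K, alpha    : aftershock productivity; c, p : Omori-law decay
    """
    times = np.asarray(times, dtype=float)
    mags = np.asarray(mags, dtype=float)
    past = times < t
    dt = t - times[past]
    return mu + np.sum(K * np.exp(alpha * (mags[past] - m0))
                       * (dt + c) ** (-p))
```

A neural temporal point process like RECAST replaces this fixed parametric form with an intensity (or inter-event time density) parameterized by a recurrent network conditioned on the event history.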
Abstract The development of new earthquake forecasting models is often motivated by one of the following complementary goals: to gain new insights into the governing physics and to produce improved forecasts quantified by objective metrics. Often, one comes at the cost of the other. Here, we propose a question-driven ensemble (QDE) modeling approach to address both goals. We first describe flexible epidemic-type aftershock sequence (ETAS) models in which we relax the assumptions of parametrically defined aftershock productivity and background earthquake rates during model calibration. Instead, both productivity and background rates are calibrated with data such that their variability is optimally represented by the model. Then we consider 64 QDE models in pseudoprospective forecasting experiments for southern California and Italy. QDE models are constructed by combining model parameters of different ingredient models, in which the rules for how to combine parameters are defined by questions about the future seismicity. The QDE models can be interpreted as models that address different questions with different ingredient models. We find that certain models best address the same issues in both regions, and that QDE models can substantially outperform the standard ETAS and all ingredient models. The best performing QDE model is obtained through the combination of models allowing flexible background seismicity and flexible aftershock productivity, respectively, in which the former parameterizes the spatial distribution of background earthquakes and the partitioning of seismicity into background events and aftershocks, and the latter is used to parameterize the spatiotemporal occurrence of aftershocks.
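The combination rule described above can be sketched very simply: each "question" (e.g., where do background earthquakes occur? how productive are aftershock sequences?) is answered by taking the corresponding parameter group from the ingredient model best suited to it. A hypothetical sketch, assuming each model is represented as a dict of named ETAS-style parameter groups (the names and values are illustrative, not the paper's):

```python
# Two hypothetical ingredient models: one with a data-driven (flexible)
# background rate, one with flexible aftershock productivity.
flexible_background = {"background": {"mu_map": "data-driven"},
                       "productivity": {"K": 0.8, "alpha": 1.0}}
flexible_productivity = {"background": {"mu_map": "parametric"},
                         "productivity": {"K": 1.1, "alpha": 1.7}}

# Question-driven combination: answer the background question with the
# first model and the productivity question with the second.
qde = {"background": flexible_background["background"],
       "productivity": flexible_productivity["productivity"]}
```

With n ingredient models and q independent questions, this rule yields n**q candidate combinations, consistent with the 64 QDE models evaluated in the abstract.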
Abstract Reservoir operations for gas extraction, fluid disposal, carbon dioxide storage, or geothermal energy production are capable of inducing seismicity. Modeling tools exist for seismicity forecasting using operational data, but the computational costs and uncertainty quantification (UQ) pose challenges. We address this issue in the context of seismicity induced by gas production from the Groningen gas field using an integrated modeling framework, which combines reservoir modeling, geomechanical modeling, and stress-based earthquake forecasting. The framework is computationally efficient thanks to a 2D finite-element reservoir model, which assumes vertical flow equilibrium, and the use of semianalytical solutions to calculate poroelastic stress changes and predict seismicity rate. The earthquake nucleation model is based on rate-and-state friction and allows for an initial strength excess so that the faults are not assumed initially critically stressed. We estimate uncertainties in the predicted number of earthquakes and magnitudes. To reduce the computational costs, we assume that the stress model is true, but our UQ algorithm is general enough that the uncertainties in reservoir and stress models could be incorporated. We explore how the selection of either a Poisson or a Gaussian likelihood influences the forecast. We also use a synthetic catalog to estimate the improved forecasting performance that would have resulted from a better seismicity detection threshold. Finally, we use tapered and nontapered Gutenberg–Richter distributions to evaluate the most probable maximum magnitude over time and account for uncertainties in its estimation. Although we did not formally account for uncertainties in the stress model, we tested several alternative stress models, and found negligible impact on the predicted temporal evolution of seismicity and forecast uncertainties. 
Our study shows that the proposed approach yields realistic estimates of the uncertainties of temporal seismicity and is applicable for operational forecasting or induced seismicity monitoring. It can also be used in probabilistic traffic light systems.
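The tapered Gutenberg–Richter distribution mentioned above multiplies the usual power-law decay in seismic moment by an exponential taper at a corner magnitude, which keeps the expected maximum magnitude finite. A minimal sketch of its complementary CDF (the b-value and corner magnitude here are illustrative defaults, not the Groningen study's fitted values):

```python
import math

def tapered_gr_ccdf(m, m_min, b=1.0, m_corner=8.0):
    """P(magnitude > m) under a tapered Gutenberg-Richter (Kagan-style)
    distribution, expressed via seismic moment:

        S(M) = (M_t / M)**beta * exp((M_t - M) / M_c),  beta = 2b/3

    where M, M_t, M_c are the moments of m, m_min, and m_corner.
    """
    beta = 2.0 * b / 3.0
    moment = lambda mag: 10.0 ** (1.5 * mag + 9.05)  # Hanks-Kanamori
    M, Mt, Mc = moment(m), moment(m_min), moment(m_corner)
    return (Mt / M) ** beta * math.exp((Mt - M) / Mc)
```

Well below the corner magnitude this reduces to the plain Gutenberg–Richter relation (a factor-of-10 drop per unit magnitude for b = 1), while the exponential term suppresses the largest magnitudes.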
