Title: Perspectives on Clustering and Declustering of Earthquakes
Abstract:
Clustering is a fundamental feature of earthquakes that impacts basic and applied analyses of seismicity. Events included in the existing short-duration instrumental catalogs are concentrated strongly within a very small fraction of the space–time volume, which is highly amplified by activity associated with the largest recorded events. The earthquakes that are included in instrumental catalogs are unlikely to be fully representative of the long-term behavior of regional seismicity. We illustrate this and other aspects of space–time earthquake clustering, and propose a quantitative clustering measure based on the receiver operating characteristic diagram. The proposed approach allows eliminating effects of marginal space and time inhomogeneities related to the geometry of the fault network and regionwide changes in earthquake rates, and quantifying coupled space–time variations that include aftershocks, swarms, and other forms of clusters. The proposed measure is used to quantify and compare earthquake clustering in southern California, western United States, central and eastern United States, Alaska, Japan, and epidemic-type aftershock sequence model results. All examined cases show a high degree of coupled space–time clustering, with the marginal space clustering dominating the marginal time clustering. Declustering earthquake catalogs can help clarify long-term aspects of regional seismicity and increase the signal-to-noise ratio of effects that are subtler than the strong clustering signatures. We illustrate how the high coupled space–time clustering can be decreased or removed using a data-adaptive parsimonious nearest-neighbor declustering approach, and emphasize basic unresolved issues on the proper outcome and quality metrics of declustering. At present, declustering remains an exploratory tool, rather than a rigorous optimization problem, and selecting an appropriate declustering method should depend on the data and problem at hand.
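To make the ROC-based idea concrete, the sketch below ranks space–time cells by event count and measures how quickly the busiest cells capture the catalog. This is an illustration under assumed grid sizes and a synthetic example, not the authors' implementation; strong clustering pushes the area under the curve toward 1, while a well-populated uniform catalog stays near the diagonal.

```python
# Hypothetical sketch of an ROC-style clustering measure (not the
# paper's implementation; grid resolution is an assumption). Cells of
# a space-time grid are ranked by event count, and the curve traces
# the fraction of events captured versus the fraction of the
# space-time volume used.
import numpy as np

def roc_clustering_measure(lon, lat, t, n_space=10, n_time=10):
    """Return (volume_fraction, event_fraction, area) for an ROC diagram."""
    counts, _ = np.histogramdd(
        np.column_stack([lon, lat, t]),
        bins=(n_space, n_space, n_time),
    )
    counts = np.sort(counts.ravel())[::-1]          # busiest cells first
    event_frac = np.cumsum(counts) / counts.sum()   # fraction of events captured
    vol_frac = np.arange(1, counts.size + 1) / counts.size
    return vol_frac, event_frac, np.trapz(event_frac, vol_frac)

# A uniform (unclustered) synthetic catalog as a baseline for comparison.
rng = np.random.default_rng(0)
u = rng.uniform(size=(3, 5000))
_, _, area = roc_clustering_measure(*u)
print(f"area for a uniform catalog: {area:.2f}")
```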
Award ID(s):
1723033 1722561 2122168
PAR ID:
10338873
Author(s) / Creator(s):
Date Published:
Journal Name:
Seismological Research Letters
Volume:
93
Issue:
1
ISSN:
0895-0695
Page Range / eLocation ID:
386 to 401
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
1. Abstract We introduce an algorithm for declustering earthquake catalogs based on the nearest-neighbor analysis of seismicity. The algorithm discriminates between background and clustered events by random thinning that removes events according to a space-varying threshold. The threshold is estimated using randomized-reshuffled catalogs that are stationary, have independent space and time components, and preserve the space distribution of the original catalog. Analysis of a catalog produced by the Epidemic Type Aftershock Sequence model demonstrates that the algorithm correctly classifies over 80% of background and clustered events, correctly reconstructs the stationary and space-dependent background intensity, and shows high stability with respect to random realizations (over 75% of events have the same estimated type in over 90% of random realizations). The declustering algorithm is applied to the global Northern California Earthquake Data Center catalog with magnitudes m ≥ 4 during 2000–2015; a Southern California catalog with m ≥ 2.5, 3.5 during 1981–2017; an area around the 1992 Landers rupture zone with m ≥ 0.0 during 1981–2015; and the Parkfield segment of the San Andreas fault with m ≥ 1.0 during 1984–2014. The null hypotheses of stationarity and space-time independence are not rejected by several tests applied to the estimated background events of the global and Southern California catalogs with magnitude ranges Δm < 4. However, both hypotheses are rejected for catalogs with a larger range of magnitudes, Δm > 4. The deviations from the nulls are mainly due to local temporal fluctuations of seismicity and activity switching among subregions; they can be traced back to the original catalogs and represent genuine features of background seismicity.
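A minimal sketch of the nearest-neighbor proximity and thinning steps is given below. The parameters b and df and the single fixed threshold are illustrative assumptions; the paper estimates a space-varying threshold from randomized-reshuffled catalogs.

```python
# Minimal sketch of nearest-neighbor proximity and random thinning
# (a simplification of the algorithm; b, df, and the fixed threshold
# eta0 are assumptions for illustration).
import numpy as np

def nn_proximity(t, x, y, m, b=1.0, df=1.6):
    """Proximity of each event to its nearest earlier 'parent' event,
    eta = dt * r**df * 10**(-b*m_parent), for a time-sorted catalog."""
    n = len(t)
    eta = np.full(n, np.inf)                      # first event has no parent
    for j in range(1, n):
        dt = t[j] - t[:j]                         # interevent times
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])  # epicentral distances
        eta[j] = np.min(dt * r**df * 10.0 ** (-b * m[:j]))
    return eta

def thin_background(eta, eta0, rng):
    """Keep an event as background with probability min(1, eta/eta0).
    In the full algorithm eta0 varies in space; one threshold is used here."""
    p_keep = np.minimum(1.0, eta / eta0)
    return rng.uniform(size=eta.size) < p_keep
```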
2. Abstract Earthquakes are clustered in space and time, with individual sequences composed of events linked by stress transfer and triggering mechanisms. On a global scale, variations in the productivity of earthquake sequences (a normalized measure of the number of triggered events) have been observed and associated with regional variations in tectonic setting. Here, we focus on resolving systematic variations in the productivity of crustal earthquake sequences in California and Nevada, the two most seismically active states in the western United States. We apply a well-tested nearest-neighbor algorithm to automatically extract earthquake sequence statistics from a unified 40 yr compilation of regional earthquake catalogs that is complete to M ∼ 2.5. We then compare earthquake sequence productivity to geophysical parameters that may influence earthquake processes, including heat flow, temperature at seismogenic depth, complexity of Quaternary faulting, geodetic strain rates, depth to crystalline basement, and faulting style. We observe coherent spatial variations in sequence productivity, with higher values in the Walker Lane of eastern California and Nevada than along the San Andreas fault system in western California. The results illuminate significant correlations between productivity and heat flow, temperature, and faulting that contribute to the understanding and ability to forecast crustal earthquake sequences in the area.
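The productivity comparison can be sketched as follows, assuming each event already carries a cluster label (its mainshock index) from a nearest-neighbor analysis and that heat-flow values are available at mainshock locations; the names and data layout are hypothetical.

```python
# Illustrative sketch of a productivity comparison (not the paper's
# code; 'mainshock_id' labels and 'heat_flow' samples are assumed
# inputs from a prior nearest-neighbor clustering and an external grid).
import numpy as np
from scipy.stats import spearmanr

def sequence_productivity(mainshock_id):
    """Number of triggered events per mainshock, given a cluster label
    (mainshock index) assigned to every event in the catalog."""
    ids, counts = np.unique(mainshock_id, return_counts=True)
    return ids, counts - 1          # exclude the mainshock itself

# Hypothetical usage with precomputed labels and heat-flow samples:
# ids, prod = sequence_productivity(labels)
# rho, p = spearmanr(heat_flow[ids], prod)   # rank correlation
```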
  3. Abstract Quantifying the size of earthquakes is a foundational task in seismology, and over the years several magnitude scales have been developed. Of these, only scales based on seismic moment or potency can properly characterize changes in event size without saturation. Here, we develop empirical potency–magnitude scaling relations for earthquakes in the western United States, allowing us to translate instrumental magnitude estimates into uniform measures of earthquake size. We use synthetic waveforms to validate the observed scaling relations and to provide additional insight into the differences between instrumental and physics-based magnitude scales. Each earthquake in our catalog is assigned a clustering designation distinguishing mainshocks from triggered seismicity, along with a potency-based magnitude estimate that is comparable to moment magnitude and that can be easily converted into other magnitude scales as needed. The developed catalog and associated scaling relations have broad applications for fundamental and applied studies of earthquake processes and hazards. 
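A sketch of how such a scaling relation is applied is shown below. The quadratic form log10(P0) = a·m² + b·m + c follows earlier southern California potency work, but the coefficients here are placeholders, not the values fitted in the paper.

```python
# Sketch of applying an empirical potency-magnitude scaling relation.
# A, B, C are hypothetical coefficients (P0 in km^2 * cm), not the
# paper's fitted values.
import numpy as np

A, B, C = 0.06, 1.0, -4.9

def potency_from_magnitude(m):
    """Scalar seismic potency implied by an instrumental magnitude m."""
    m = np.asarray(m, dtype=float)
    return 10.0 ** (A * m**2 + B * m + C)

def potency_magnitude(p0):
    """Invert the quadratic relation to get a uniform potency-based magnitude."""
    disc = B**2 - 4.0 * A * (C - np.log10(p0))
    return (-B + np.sqrt(disc)) / (2.0 * A)

print(potency_magnitude(potency_from_magnitude(4.0)))  # round-trips to 4.0
```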
4. Exploring the spatiotemporal distribution of earthquake activity, especially earthquake migration along fault systems, can greatly improve our understanding of the basic mechanics of earthquakes and the assessment of earthquake risk. We establish a three-dimensional strike-slip fault model to derive the stress response and fault slip along the fault under regional stress conditions, which helps to create a long-term, complete synthetic earthquake catalog. We then train Long Short-Term Memory (LSTM) networks for pattern recognition on the synthetic catalog and compare model performance using the mean-square error (MSE). Applying the LSTM yields MSE values as low as 0.08%. Our best model can predict the time and magnitude of earthquakes with magnitudes greater than Mw 6.5 with a similar clustering period. These results show that applying LSTM networks to spatiotemporal series prediction offers a promising tool for studying earthquake mechanics and forecasting major earthquake events.
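A minimal sketch of such an LSTM setup is given below; the window length, input features, and architecture are assumptions, not the authors' configuration.

```python
# Hypothetical LSTM sketch for catalog sequence prediction (window
# length, features, and layer sizes are illustrative assumptions).
import numpy as np
import tensorflow as tf

WINDOW = 50      # past events per input sequence
FEATURES = 2     # e.g., (interevent time, magnitude) per event

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(2),   # predicted (time to next event, magnitude)
])
model.compile(optimizer="adam", loss="mse")

# Train on sliding windows built from the synthetic catalog; random
# arrays stand in for real windows here.
X = np.random.rand(256, WINDOW, FEATURES).astype("float32")
y = np.random.rand(256, 2).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```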
  5. Abstract In areas of induced seismicity, earthquakes can be triggered by stress changes due to fluid injection and static deformation from fault slip. Here we present a method to distinguish between injection‐driven and earthquake‐driven triggering of induced seismicity by combining a calibrated, fully coupled, poroelastic stress model of wastewater injection with interpretation of a machine learning algorithm trained on both earthquake catalog and modeled stress features. We investigate seismicity from Paradox Valley, Colorado as an ideal test case: a single, high‐pressure injector that has induced thousands of earthquakes since 1991. Using feature importance analysis, we find that injection‐driven earthquakes are approximately 225% of the total catalog but act as background events that can trigger subsequent aftershocks. Injection‐driven events also have distinct spatiotemporal clustering properties with a larger b‐value, closer proximity to the well, and earlier occurrence in the injection history. Generalization of our technique can help characterize triggering processes in other regions where induced seismicity occurs. 
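The feature-importance step can be sketched as follows. The feature names and training labels are placeholders, and the actual study pairs the classifier with a calibrated, fully coupled poroelastic stress model rather than the synthetic data used here.

```python
# Illustrative sketch of feature-importance analysis for classifying
# injection-driven vs. earthquake-driven events (feature names, labels,
# and data are hypothetical placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FEATURES = ["pore_pressure", "poroelastic_stress", "static_stress_change",
            "distance_to_well", "time_since_injection_start"]

# X: one row per earthquake with modeled stress + catalog features;
# y: 1 for injection-driven, 0 for earthquake-driven (training labels).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, len(FEATURES)))
y = (X[:, 0] + 0.5 * rng.normal(size=1000)) > 0

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(FEATURES, clf.feature_importances_),
                        key=lambda kv: -kv[1]):
    print(f"{name:28s} {imp:.3f}")
```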