ABSTRACT Variants of the Epidemic‐Type Aftershock Sequence (ETAS) and Short‐Term Earthquake Probabilities (STEP) models have been used for earthquake forecasting and are entered as forecast models in the purely prospective Collaboratory Study for Earthquake Predictability (CSEP) experiment. Previous analyses have suggested that the ETAS model offered the best forecast skill for the first several years of CSEP. Here, we evaluate the prospective forecasting ability of the ETAS and STEP one‐day forecast models for California from 2013 to 2017, using super‐thinned residuals and Voronoi residuals. We find very comparable performance of the two models, with slightly superior performance of the STEP model compared to ETAS according to most metrics.
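The ETAS model evaluated above forecasts seismicity through a conditional intensity: a background rate plus Omori-law triggering from every past event. A minimal temporal sketch, using generic textbook parameter values (`mu`, `K`, `alpha`, `c`, `p`, `m_c` are illustrative, not the values used in the CSEP experiment):

```python
import math

def etas_rate(t, events, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.2, m_c=3.0):
    """Temporal ETAS conditional intensity (events/day): background rate mu
    plus Omori-law aftershock triggering from each past event (t_i, m_i)
    with t_i < t. All parameter values here are illustrative."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_c)) / (t - t_i + c) ** p
    return rate

# A single M 6 event at t = 0 dominates the rate shortly afterward,
# then the rate decays back toward the background level.
events = [(0.0, 6.0)]
print(etas_rate(0.1, events))    # elevated rate shortly after the event
print(etas_rate(100.0, events))  # long after: close to background mu
```

Forecast models of this family are scored in CSEP by comparing such predicted rates against observed event counts in space–time–magnitude bins.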
-
Abstract The southern San Andreas fault is in its interseismic period, occasionally releasing some stored elastic strain during triggered slow slip events (SSEs) at <2.5 km depth. A distinct, shallowly exhumed gouge defines the fault and is present at SSE depths. To evaluate if this material can host SSEs, we characterize its mineralogy, microstructures, and frictional behavior with water‐saturated deformation experiments at near‐in situ conditions, and we compare laboratory healing rates to natural SSEs. Our results show that slip localizes along clay surfaces in both laboratory and natural settings. The gouge is weak (coefficient of friction of ∼0.29), exhibits low healing rates (<0.001/decade), and transitions from unstable to stable behavior at slip rates above ∼1 μm/s. Healing rate and friction drop data from laboratory instabilities are comparable to geodetically‐constrained values for SSEs. Collective observations indicate this gouge could host shallow SSEs and/or localize slip facilitating dynamic rupture propagation to the surface.
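Healing rates of the kind quoted above are commonly measured in slide–hold–slide tests, where the gain in peak friction grows roughly log-linearly with hold time. A hypothetical sketch of what a "<0.001/decade" rate implies (parameter values illustrative, not the paper's data):

```python
import math

def friction_gain(t_hold, beta=0.001, t_ref=1.0):
    """Log-linear frictional healing: gain in peak friction after a hold
    of t_hold seconds, at beta friction units per decade of hold time.
    beta = 0.001/decade matches the upper bound quoted above."""
    return beta * math.log10(t_hold / t_ref)

# Even a one-year hold adds less than 0.01 to the friction coefficient
# at this rate, consistent with a weak, poorly healing clay gouge.
one_year = 365.25 * 24 * 3600
print(friction_gain(one_year))
```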
-
Abstract Seismograms from two borehole seismometers near the 2019 Ridgecrest, California, aftershock sequence do not return to pre-mainshock noise levels for over ten days after the M 7.1 Ridgecrest mainshock. The observed distribution of root mean square amplitudes in these records can be explained with the Reasenberg and Jones (1989) aftershock occurrence model, which implies a continuous seismic “hum” of overlapping aftershocks of M > −2 occurring at an average rate of 10 events per second after ten days; this hum prevents observation of the background aseismic noise level at times between the body-wave arrivals from cataloged and other clearly observed events. Even after the borehole noise levels return at their quietest times to pre-mainshock conditions, the presence of overlapping low-magnitude earthquakes for 80 days is implied by waveform cross-correlation results obtained using the matrix profile method. These results suggest a hidden frontier of tiny earthquakes that potentially can be measured and characterized even in the absence of detection and location of individual events.
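The Reasenberg and Jones (1989) model gives the expected rate of aftershocks above a minimum magnitude as a modified Omori law. A rough sketch using generic California parameter values (`a`, `b`, `c`, `p` here are illustrative defaults, not the values fit in this study) reproduces the order of magnitude of the rate quoted above:

```python
def rj_rate(t_days, m_min, m_main=7.1, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Reasenberg-Jones aftershock rate (events/day) of magnitude >= m_min
    at t_days after a mainshock of magnitude m_main. Parameter values are
    illustrative generic-California defaults."""
    return 10.0 ** (a + b * (m_main - m_min)) / (t_days + c) ** p

# Expected rate of M >= -2 aftershocks ten days after an M 7.1 mainshock,
# converted to events per second: a few per second, enough for events to
# overlap into a continuous seismic "hum".
rate_per_sec = rj_rate(10.0, -2.0) / 86400.0
print(rate_per_sec)
```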
-
Abstract With the rise of data volume and computing power, seismological research requires more advanced skills in data processing, numerical methods, and parallel computing. We present the experience of conducting training workshops in various forms of delivery to support the adoption of large-scale high-performance computing (HPC) and cloud computing, advancing seismological research. The seismological foci were on earthquake source parameter estimation in catalogs, forward and adjoint wavefield simulations in 2D and 3D at local, regional, and global scales, earthquake dynamics, ambient noise seismology, and machine learning. This contribution describes the series of workshops delivered as part of research projects, the learning outcomes for participants, and lessons learned by the instructors. Our curriculum was grounded on open and reproducible science, large-scale scientific computing and data mining, and computing infrastructure (access and usage) for HPC and the cloud. We also describe the types of teaching materials that have proven beneficial to the instruction and the sustainability of the program. We propose guidelines to deliver future workshops on these topics.
-
ABSTRACT We measure maximum amplitudes in the time domain on recordings of the 2019 Ridgecrest earthquake sequence to convert ground-motion amplitudes to source spectra. To do this, we modify Richter’s local magnitude relation to measure frequency-dependent empirical amplitude-decay curves and station corrections for a series of narrowband time-domain filters. Peak displacement amplitude in each frequency band is used to construct the displacement spectrum. After correction for attenuation, we determine corner frequency and moment from the resulting source spectra. By this approach, we measure moment magnitudes reliably to as small as ML 1.0. We find stress drop increases with both depth and magnitude and discuss whether this could be an artifact of assumptions about the source, path, and site.
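Once moment and corner frequency are measured from the displacement spectrum, stress drop follows from a circular-crack source model. A minimal sketch using Brune-type relations (the constant `k`, the shear velocity `beta`, and the example values are assumptions, not this study's measurements):

```python
import math

def moment_magnitude(m0):
    """Moment magnitude from seismic moment m0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def brune_stress_drop(m0, fc, beta=3500.0, k=0.37):
    """Stress drop (Pa) from moment m0 (N*m) and corner frequency fc (Hz),
    assuming a circular crack with source radius r = k * beta / fc
    (k = 0.37 is the classic Brune S-wave constant)."""
    r = k * beta / fc
    return 7.0 * m0 / (16.0 * r ** 3)

m0 = 1e13  # N*m, roughly an Mw 2.6 event
print(moment_magnitude(m0))
print(brune_stress_drop(m0, fc=10.0) / 1e6)  # stress drop in MPa, ~2 MPa
```

Because the inferred source radius scales as 1/fc, any bias in corner frequency propagates into stress drop cubed, which is one reason apparent depth and magnitude trends must be checked against source, path, and site assumptions.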
-
Abstract The San Fernando Valley (SFV), a densely populated region in Southern California, has high earthquake hazard due to a complex network of active faults and the amplifying effects of the sedimentary basin. Since the devastating 1994 Mw 6.7 Northridge earthquake, numerous studies have examined its structure using various geological and geophysical datasets. However, current seismic velocity models still lack the resolution to accurately image the near-surface velocity structure and concealed or blind faults, which are critical for high-frequency wavefield simulations and earthquake hazard modeling. To address these challenges, we develop a 3D high-resolution shear-wave velocity model for the SFV using ambient noise data from a dense array of 140 seismic nodes and 10 Southern California Seismic Network stations. We also invert gravity data to map the basin geometry and integrate horizontal-to-vertical spectral ratios and aeromagnetic data to constrain interfaces and map major geological structures. With a lateral resolution of 250 m near the basin center, our model reveals previously unresolved geological features, including the detailed geometry of the basin and previously unmapped structure of faults at depth. The basin deepens from the Santa Monica Mountains in the south to approximately 4 km near its center and 7 km in the Sylmar sub-basin at the basin’s northern margin. Strong velocity contrasts are observed across major faults, at the basin edges, and in the basin’s upper 500 m, for which we measure velocities as low as 200 m/s. Our high-resolution model will enhance ground-motion simulations and earthquake hazard assessments for the SFV and has implications for other urban areas with high seismic risk.
-
ABSTRACT Stress drop is a fundamental parameter related to earthquake source physics, but is hard to measure accurately. To better understand how different factors influence stress-drop measurements, we compare two different methods using the Ridgecrest stress-drop validation data set: spectral decomposition (SD) and spectral ratio (SR), each with different processing options. We also examine the influence of spectral complexity on source parameter measurement. Applying the SD method, we find that frequency bandwidth and time-window length could influence spectral magnitude calibration, while depth-dependent attenuation is important to correctly map stress-drop variations. For the SR method, we find that the selected source model has limited influence on the measurements; however, the Boatwright model tends to produce smaller standard deviation and larger magnitude dependence than the Brune model. Variance reduction threshold, frequency bandwidth, and time-window length, if chosen within an appropriate parameter range, have limited influence on source parameter measurement. For both methods, wave type, attenuation correction, and spectral complexity strongly influence the result. The scale factor that quantifies the magnitude dependence of stress drop shows large variations with different processing options, and earthquakes with complex source spectra deviating from the Brune-type source models tend to have larger scale factors than earthquakes without complexity. Based on these detailed comparisons, we make a few specific suggestions for data processing workflows that could help future studies of source parameters and interpretations.
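The Brune and Boatwright source models compared above share the same low-frequency plateau and high-frequency f⁻² falloff; they differ only in the sharpness of the spectral corner. A minimal comparison (amplitudes normalized to 1; the Boatwright exponent n = 2 is the common choice and is an assumption here):

```python
def brune(f, fc, omega0=1.0):
    """Brune omega-squared spectrum: gradual corner at fc."""
    return omega0 / (1.0 + (f / fc) ** 2)

def boatwright(f, fc, omega0=1.0, n=2):
    """Boatwright spectrum: sharper corner at fc, same f^-2 falloff."""
    return omega0 / (1.0 + (f / fc) ** (2 * n)) ** 0.5

# At the corner frequency the Boatwright form retains more amplitude
# (1/sqrt(2) vs 1/2), so the two models can yield different fitted
# corner frequencies, and hence different stress drops, from the same data.
print(brune(10.0, 10.0), boatwright(10.0, 10.0))
```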
-
ABSTRACT The recorded seismic waveform is a convolution of event source term, path term, and station term. Removing high-frequency attenuation due to path effects is a challenging problem. The empirical Green’s function (EGF) method uses nearly collocated small earthquakes to correct the path and station terms for larger events recorded at the same station. However, this method is subject to variability due to many factors. We focus on three events that were well recorded by the seismic network and a rapid response distributed acoustic sensing (DAS) array. Using a suite of high-quality EGF events, we assess the influence of time window, spectral measurement options, and types of data on the spectral ratio and relative source time function (RSTF) results. An increased number of tapers (from 2 to 16) tends to increase the measured corner frequency and reduce the source complexity. An extended time window (e.g., 30 s) tends to produce larger variability of corner frequency. The multitaper algorithm that simultaneously optimizes both target and EGF spectra produces the most stable corner-frequency measurements. The stacked spectral ratio and RSTF from the DAS array are more stable than those from two nearby seismic stations, and are comparable to stacked results from the seismic network, suggesting that the DAS array has strong potential in source characterization.
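In the EGF approach, dividing the target event's spectrum by a collocated small event's spectrum cancels the shared path and station terms, leaving a ratio of two source spectra from which corner frequencies can be fit. A noise-free sketch with synthetic Brune-type spectra (all values, and the simple grid search, are illustrative, not the multitaper workflow used in the study):

```python
def spectral_ratio(f, m0_ratio, fc_target, fc_egf):
    """Ratio of two Brune spectra: flat at m0_ratio below fc_target,
    rising as f^2 between the two corners, flat again above fc_egf."""
    return m0_ratio * (1.0 + (f / fc_egf) ** 2) / (1.0 + (f / fc_target) ** 2)

# Synthetic "observed" ratio: target corner 2 Hz, EGF corner 20 Hz.
freqs = [0.5 * i for i in range(1, 101)]  # 0.5-50 Hz
obs = [spectral_ratio(f, 1000.0, 2.0, 20.0) for f in freqs]

# Grid search over candidate target corner frequencies (fc_egf assumed
# known) for the best least-squares fit to the observed ratio.
best_fc = min(
    (0.1 * k for k in range(5, 100)),
    key=lambda fc: sum((spectral_ratio(f, 1000.0, fc, 20.0) - o) ** 2
                       for f, o in zip(freqs, obs)),
)
print(best_fc)  # recovers the 2.0 Hz target corner
```

With real data the ratio is noisy and band-limited, which is why the choice of time window, taper count, and stacking across a dense array matters for the stability of the fit.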
-
Abstract Rock friction tests have made profound contributions to our understanding of earthquake processes. Most rock friction tests have focused on fault strength evolution during velocity steps or at specific slip rates and on the characteristics of stick‐slip events such as dynamic rupture propagation and the transition from stable sliding to instability, with little attention paid to the transient acceleration and deceleration periods. Here, we present Westerly Granite fault friction test results using a unique pneumatically powered apparatus with high acceleration of up to 50 g, focusing on the transient stages of fast fault acceleration and deceleration during both high‐speed sliding and stick‐slip events. Our data demonstrate dominantly velocity‐weakening behavior at the transient stages of fault acceleration and deceleration, with a 1/V dependence for peak friction and a deceleration lobe consistent with the flash‐heating model, but with the acceleration lobe consistently deviating from the 1/V dependence. Our analysis of velocity‐dependent friction between dynamic rupture events, stick‐slips, and high‐speed friction tests reveals the significance of high acceleration in influencing transient fault weakening during dynamic weakening. We further demonstrate that the deviation of the friction‐velocity curve from the 1/V trend during fault acceleration is associated with the contribution of the dynamic rupturing process during the initiation of fault slip.
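The 1/V dependence discussed above is the signature of flash-heating models, in which friction weakens once the slip rate exceeds a weakening velocity Vw. A minimal sketch of the steady-state flash-weakening curve (parameter values `mu0`, `muw`, `vw` are illustrative, not the Westerly Granite measurements):

```python
def flash_heating_friction(v, mu0=0.7, muw=0.2, vw=0.1):
    """Steady-state friction vs slip rate v (m/s): constant mu0 below the
    weakening velocity vw, then 1/V decay toward the fully weakened
    value muw, following the standard flash-heating form."""
    if v <= vw:
        return mu0
    return muw + (mu0 - muw) * vw / v

# Friction drops as 1/V above vw; in the experiments described above,
# the deceleration lobe follows such a curve while the acceleration
# lobe deviates from it.
for v in (0.01, 0.1, 1.0, 10.0):
    print(v, flash_heating_friction(v))
```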
-
Abstract Numerical simulations of Sequences of Earthquakes and Aseismic Slip (SEAS) have rapidly progressed to address fundamental problems in fault mechanics and provide self‐consistent, physics‐based frameworks to interpret and predict geophysical observations across spatial and temporal scales. To advance SEAS simulations with rigor and reproducibility, we pursue community efforts to verify numerical codes in an expanding suite of benchmarks. Here we present code comparison results from a new set of quasi‐dynamic benchmark problems BP6‐QD‐A/S/C that consider an aseismic slip transient induced by changes in pore fluid pressure consistent with fluid injection and diffusion in fault models with different treatments of fault friction. Ten modeling groups participated in problems BP6‐QD‐A and BP6‐QD‐S considering rate‐and‐state fault models using the aging (‐A) and slip (‐S) law formulations for frictional state evolution, respectively, allowing us to better understand how various computational factors across codes affect the simulated evolution of pore pressure and aseismic slip. Comparisons of problems using the aging versus slip law, and a constant friction coefficient (‐C), illustrate how aseismic slip models can differ in the timing and amount of slip achieved with different treatments of fault friction given the same perturbations in pore fluid pressure. We achieve excellent quantitative agreement across participating codes, with further agreement attained by ensuring sufficiently fine time‐stepping and consistent treatment of boundary conditions. Our benchmark efforts offer a community‐based example to reveal sensitivities of numerical modeling results, which is essential for advancing multi‐physics SEAS models to better understand and construct reliable predictive models of fault dynamics.
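The aging and slip laws referenced in benchmarks BP6‐QD‐A and BP6‐QD‐S differ in how the state variable θ evolves, but share the steady state θss = Dc/V; their different transients are one source of the timing differences noted above. A minimal forward-Euler sketch of the state evolution after a velocity step (parameter values illustrative, not the benchmark specification):

```python
import math

def evolve_state(law, v=1e-5, dc=1e-5, theta0=10.0, t_end=20.0, dt=1e-3):
    """Integrate d(theta)/dt after a step to slip rate v (m/s):
    aging law: d(theta)/dt = 1 - v*theta/dc
    slip law:  d(theta)/dt = -(v*theta/dc) * ln(v*theta/dc)
    Both laws converge to the steady state theta_ss = dc / v."""
    theta = theta0
    for _ in range(int(t_end / dt)):
        omega = v * theta / dc
        if law == "aging":
            dtheta = 1.0 - omega
        else:  # slip law
            dtheta = -omega * math.log(omega)
        theta += dtheta * dt
    return theta

print(evolve_state("aging"))  # approaches dc / v = 1.0 s
print(evolve_state("slip"))   # approaches dc / v = 1.0 s
```

Although both laws reach the same steady state, the slip law evolves state only with slip while the aging law heals in stationary contact, so transient slip histories, and hence benchmark time series, can differ between the two formulations.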
