

Search for: All records

Creators/Authors contains: "Tsai, Victor C."


  1. SUMMARY

    Ambient noise tomography is a well-established tomographic imaging technique, but the effect that spatially variable noise sources have on the measurements remains challenging to account for. Full waveform ambient noise inversion has recently emerged as a promising solution, but it is computationally demanding: even distant noise sources influence the interstation correlation functions, so the numerical domain must extend far beyond the tomographic region of interest and becomes prohibitively large. We investigate a new strategy that reduces the simulation domain while still accounting for distant contributions. To let nearby numerical sources represent distant true sources, we introduce correlated sources and generate a time-dependent effective source distribution at the boundary of a small region of interest that excites the correlation wavefield of the larger domain. In a series of 2-D numerical simulations, we demonstrate that the proposed methodology successfully represents a far-field source present simultaneously with nearby sources, and that it yields a robustly estimated noise source distribution. Furthermore, we show how beamforming results can be used as prior information on the azimuthal variation of the ambient noise sources to help determine the far-field noise distribution. These experiments provide insight into how to reduce the computational cost of full waveform ambient noise inversion, which is key to turning it into a viable tomographic technique. They may also help reduce source-induced bias in time-dependent monitoring applications.
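    As a concrete illustration of the basic measurement involved, the following minimal sketch (not the authors' code) cross-correlates synthetic noise records from two stations to recover an interstation travel time; the sampling rate, delay, and noise model are all illustrative assumptions. With a spatially uneven source distribution, this correlation peak becomes biased, which is what the correlated-source boundary treatment described above is designed to handle.

    ```python
    import numpy as np

    fs = 10.0                      # sampling rate [Hz], assumed
    n = 2 ** 14                    # samples per noise window
    rng = np.random.default_rng(0)

    # One distant noise source recorded at two stations; the 0.5 s shift
    # stands in for the interstation propagation time.
    source = rng.standard_normal(n)
    delay = int(0.5 * fs)
    u1 = source + 0.1 * rng.standard_normal(n)                  # station 1
    u2 = np.roll(source, delay) + 0.1 * rng.standard_normal(n)  # station 2

    # Frequency-domain cross-correlation: C12(w) = U1(w) * conj(U2(w)).
    ccf = np.fft.fftshift(np.fft.irfft(np.fft.rfft(u1) * np.conj(np.fft.rfft(u2))))
    lags = (np.arange(n) - n // 2) / fs
    print(f"correlation peak at lag {lags[np.argmax(ccf)]:.2f} s (expected -0.50 s)")
    ```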

     
  2. Abstract

    Numerical modeling of ice sheet motion, and hence projection of global sea level rise, requires information about the evolving subglacial environment, which remains largely unknown because it is so difficult to access. Here we advance such subglacial observations by reporting multi‐year observations of seismic tremor likely associated with glacier sliding at Helheim Glacier. This association is confirmed by correlation analysis between tremor power and multiple environmental forcings on different timescales. Variations in the observed tremor power indicate that different factors affect glacial sliding on different timescales: effective pressure may control sliding on long (seasonal/annual) timescales, while tidal forcing modulates the sliding rate and tremor power on short (hourly/daily) timescales. Polarization results suggest that the tremor source is an upstream subglacial ridge. These observations provide insight into which factors should be included in ice sheet modeling and show that their timescales of variability play an essential role.
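    As a sketch of the kind of multi-timescale correlation analysis described above (illustrative only, not the study's pipeline), the following separates a long-timescale trend from short-timescale tidal modulation in a synthetic tremor-power series; the tidal period is the M2 constituent, and all amplitudes are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    hours = np.arange(0.0, 24 * 30, 1.0)          # one month, hourly samples

    # Semidiurnal tidal proxy (M2 period ~12.42 h), arbitrary units.
    tide = np.cos(2 * np.pi * hours / 12.42)

    # Synthetic tremor power: slow trend (an effective-pressure-style control)
    # plus tidal modulation and noise.
    trend = 1.0 + 0.3 * np.sin(2 * np.pi * hours / (24 * 30))
    power = trend * (1.0 + 0.2 * tide) + 0.05 * rng.standard_normal(hours.size)

    # Remove the long-timescale trend with a ~2-day running mean, then test
    # the short-timescale (tidal) correlation.
    kernel = np.ones(49) / 49
    detrended = power - np.convolve(power, kernel, mode="same")
    r = np.corrcoef(detrended, tide)[0, 1]
    print(f"detrended tremor power vs. tidal proxy: r = {r:.2f}")
    ```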

     
  3. Abstract

    Accurate precipitation monitoring is crucial for understanding climate change and rainfall-driven hazards at the local scale. However, the current suite of monitoring approaches, including weather radar and rain gauges, suffers from limitations such as low spatial and temporal resolution and difficulty in accurately detecting potentially destructive events such as hailstorms. In this study, we develop an array-based method to monitor rainfall with seismic nodal stations, offering both high spatial and high temporal resolution. We analyze seismic records from 1825 densely spaced, high-frequency seismometers in Oklahoma and identify signals from nine precipitation events that occurred during the one-month station deployment in 2016. After removing anthropogenic noise and the Earth-structure response, the obtained precipitation spatial pattern mimics that from a nearby operational weather radar while offering higher spatial (~300 m) and temporal (<10 s) resolution. We further show the potential of this approach to monitor hail through joint analysis of seismic intensity and independent precipitation-rate measurements, and we advocate for coordinated seismological-meteorological field campaigns.
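    The core of such an array-based measurement can be sketched as follows (a toy example, not the study's processing): compute a high-frequency power estimate per station and bin it into a spatial map. Station geometry, the rain-to-power scaling, and grid size are assumptions; the real analysis also removes anthropogenic noise and Earth-structure response first.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_sta = 400
    x = rng.uniform(0, 10, n_sta)      # station easting [km], assumed layout
    y = rng.uniform(0, 10, n_sta)      # station northing [km]

    # Synthetic rain cell centered at (7, 3) km.
    rain_rate = 20.0 * np.exp(-((x - 7) ** 2 + (y - 3) ** 2) / 4.0)

    # Assume high-frequency seismic power grows with rain rate, plus noise;
    # in practice this would be band-passed power in ~10 s windows.
    seis_power = rain_rate ** 1.5 + rng.uniform(0, 1, n_sta)

    # Bin station measurements into a ~1 km map (the study reaches ~300 m
    # with 1825 nodes).
    sums, xe, ye = np.histogram2d(x, y, bins=10, weights=seis_power)
    counts, _, _ = np.histogram2d(x, y, bins=[xe, ye])
    rain_map = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)
    i, j = np.unravel_index(rain_map.argmax(), rain_map.shape)
    print(f"peak seismic intensity near x={xe[i]:.0f}-{xe[i+1]:.0f} km, y={ye[j]:.0f}-{ye[j+1]:.0f} km")
    ```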

     
  4. SUMMARY

    Seismic tomography is a cornerstone of geophysics and has led to a number of important discoveries about the interior of the Earth. However, it remains plagued by the large number of unknown parameters in most tomographic applications. This leaves the inverse problem underdetermined and requires significant non-geologically motivated smoothing to achieve unique answers. Although this solution is acceptable when using tomography as an explorative tool in discovery mode, it presents a significant problem for the use of tomography in distinguishing between acceptable geological models or in estimating geologically relevant parameters, since typically none of the geological models considered fit the tomographic results, even when uncertainties are accounted for. To address this challenge, when seismic tomography is to be used for geological model selection or parameter estimation, we advocate that the tomography be explicitly parametrized in terms of the geological models being tested instead of more mathematically convenient formulations like voxels, splines or spherical harmonics. This proposition comes with a number of technical difficulties, the most important being the move from a linear to a non-linear inverse problem, the need to choose a geological parametrization that fits each specific problem and is commensurate with the expected data quality and structure, and the need for a supporting framework to identify which model is preferred by the tomographic data. In this contribution, we introduce geological parametrization of tomography with a few simple synthetic examples applied to imaging sedimentary basins and subduction zones, and one real-world example of inferring basin and crustal properties across the continental United States. We explain the challenges in moving towards more realistic examples, and discuss the main technical difficulties and how they may be overcome. Although it may take a number of years for the scientific program suggested here to reach maturity, steps in this direction are necessary if seismic tomography is to develop from a tool for discovering plausible structures into one that supports distinct scientific inferences regarding the presence or absence of structures and their physical characteristics.
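    To make the proposed parametrization concrete, here is a deliberately simple sketch (a hypothetical forward model, not one from the paper): instead of solving for velocities in many voxels, the inversion solves directly for a single geological parameter, a basin depth, through a non-linear fit.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    v_sed = 2.0  # assumed sediment velocity [km/s]

    def travel_time(depth_km, offset_km):
        """Two-way reflection time from the basin floor at a set of
        source-receiver offsets; a stand-in for a real forward model."""
        return 2.0 * np.sqrt(depth_km ** 2 + (offset_km / 2.0) ** 2) / v_sed

    # Synthetic picks for a true basin depth of 3 km, with noise.
    rng = np.random.default_rng(3)
    offsets = np.linspace(1.0, 8.0, 15)
    t_obs = travel_time(3.0, offsets) + 0.02 * rng.standard_normal(offsets.size)

    # Non-linear inversion for the one geological parameter; with realistic
    # parametrizations, this non-linearity is among the main hurdles noted above.
    fit = least_squares(lambda m: travel_time(m[0], offsets) - t_obs, x0=[1.0])
    print(f"recovered basin depth: {fit.x[0]:.2f} km (true 3.00 km)")
    ```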

     
  5. Abstract

    The spatial patterns of earthquake ground motion amplitudes are commonly represented using a double‐couple model that corresponds to shear slip on a planar fault. While this framework has proven largely successful in explaining low‐frequency seismic recordings, at higher frequencies the wavefield becomes more azimuthally isotropic for reasons that are not yet well understood. Here, we use a dense array of nodal seismometers in Oklahoma to study the radiation patterns of earthquakes in the near‐source region where the effects of wavefield scattering are limited. At these close distances, the radiation pattern is predominantly double couple at low frequencies (<15 Hz). At higher frequencies, the recorded wavefield contains significant isotropic and residual components that cannot be explained as path or site effects, implying complexity in the rupture process or local fault zone structure. These findings demonstrate that earthquake source complexity can drive variability in the ground motions that control seismic hazard.
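    A minimal sketch of the kind of decomposition involved (synthetic amplitudes, not the study's data): fit azimuthal ground-motion amplitudes with an isotropic term plus a four-lobed double-couple pattern, here the map-view |sin 2θ| lobes of a vertical strike-slip source, and report the isotropic fraction. Repeating this by frequency band would show the low-frequency wavefield as double-couple dominated and the high-frequency wavefield as more isotropic.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    theta = np.linspace(0, 2 * np.pi, 72, endpoint=False)   # station azimuths

    # Synthetic amplitudes: part double-couple, part isotropic, plus scatter.
    amp = 0.4 + 0.6 * np.abs(np.sin(2 * theta)) + 0.05 * rng.standard_normal(theta.size)

    # Least-squares decomposition: constant (isotropic) + DC lobe pattern.
    G = np.column_stack([np.ones_like(theta), np.abs(np.sin(2 * theta))])
    iso, dc = np.linalg.lstsq(G, amp, rcond=None)[0]
    print(f"isotropic fraction: {iso / (iso + dc):.2f}")
    ```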

     
  6. Abstract

    The proliferation of dense arrays promises to improve our ability to image geological structures at the scales necessary for accurate assessment of seismic hazard. However, combining the resulting local high‐resolution tomography with existing regional models presents an ongoing challenge. We developed a framework based on the level‐set method that infers where local data provide meaningful constraints beyond those found in regional models, for example the Community Velocity Models (CVMs) of southern California. This technique defines a volume within which updates are made to a reference CVM, with the boundary of the volume being part of the inversion rather than explicitly defined. By penalizing the complexity of the boundary, a minimal update that sufficiently explains the data is achieved. To test this framework, we use data from the Community Seismic Network, a dense permanent urban deployment, and invert Love wave dispersion and amplification data from the 2019 Mw 6.4 and 7.1 Ridgecrest earthquakes for an update to CVM‐S4.26, using the Tikhonov Ensemble Sampling scheme, a highly efficient derivative‐free approximate Bayesian method. We find the data are best explained by a deepening of the Los Angeles Basin, with its deepest part south of downtown Los Angeles, along with a steeper northeastern basin wall. This result offers new progress toward the parsimonious incorporation of detailed local basin models within regional reference models using an objective framework, and highlights the importance of accurate basin models when accounting for the amplification of surface waves in the high‐rise building response band.
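    The role of the level set can be sketched in one dimension as follows (an illustrative toy, not CVM‐S4.26 or the study's code): a signed function phi defines the volume in which the reference model is updated, and in the actual inversion phi is itself an unknown whose boundary complexity is penalized.

    ```python
    import numpy as np

    z = np.linspace(0.0, 10.0, 201)     # depth [km]
    v_ref = 2.0 + 0.25 * z              # assumed reference velocity profile

    # Level-set function: positive inside the update region (here 2-6 km
    # depth); the boundary {phi = 0} is what the inversion adjusts and
    # penalizes to keep the update minimal.
    phi = 1.0 - np.abs(z - 4.0) / 2.0
    inside = phi > 0

    v_updated = v_ref + np.where(inside, -0.3, 0.0)   # slow-down inside [km/s]
    print(f"model updated between {z[inside].min():.2f} and {z[inside].max():.2f} km depth")
    ```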

     
  7. Abstract

    Fault complexity has been linked to high‐frequency earthquake radiation, although the underlying physical mechanisms are not well understood. Fault complexity is commonly modeled with rough single faults; however, real‐world faults are additionally complex, existing within networks of other faults. In this study, we introduce two new ways of defining fault complexity using mapped fault traces, characterizing fault networks in terms of their degree of alignment and density. We find that both misalignment and density correlate with enhanced high‐frequency seismic radiation across Southern California, with misalignment showing a stronger correlation. This robust correlation suggests that high‐frequency radiation may arise in part from fault‐fault interactions within networks of misaligned faults. Fault‐fault interactions may therefore have important consequences for earthquake rupture dynamics, energetics and earthquake hazards and should not be overlooked.
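    The two fault-network measures can be illustrated with a small calculation (the definitions here are plausible stand-ins; the study's exact formulations may differ): density as total mapped trace length per unit area, and misalignment as one minus the length-weighted resultant of doubled strike angles, doubling because strike is axial (0° and 180° describe the same line).

    ```python
    import numpy as np

    # Mapped trace segments in one analysis cell: length and strike; assumed values.
    lengths = np.array([5.0, 3.0, 8.0, 2.0, 4.0])        # [km]
    strikes = np.array([40.0, 55.0, 38.0, 130.0, 95.0])  # [deg]

    # Density: km of fault trace per km^2 of cell area.
    density = lengths.sum() / 100.0

    # Misalignment: circular spread of doubled strikes (0 = parallel network,
    # 1 = fully disordered network).
    ang = np.deg2rad(2.0 * strikes)
    R = np.hypot((lengths * np.cos(ang)).sum(), (lengths * np.sin(ang)).sum()) / lengths.sum()
    print(f"density = {density:.2f} km/km^2, misalignment = {1.0 - R:.2f}")
    ```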

     
  8. Abstract

    Earthquakes occur within complex fault zones containing numerous intersecting fault strands. This complexity poses a computational challenge for rupture models, which typically simplify fault structure to a small number of rough fault surfaces, with all other deformation assumed to be off‐fault viscoplastic deformation. In such models, high‐frequency ground motions originate solely from frictionally mediated, heterogeneous slip on a small number of potentially rough fault surfaces or from off‐fault viscoplastic deformation. Alternative explanations for high‐frequency ground motion generation that can account for a larger number of fault surfaces remain difficult to assess. Here, we evaluate the efficacy of a recently proposed stochastic impact model in which high‐frequency ground motion is caused by elastic impacts of structures within a complex fault zone. Impacts are envisioned to occur in response to fault motion in the presence of geometrical incompatibilities, which promotes transfer of slip onto different fault strands on timescales mediated by elasticity. We investigate the role of a complex fault zone for high‐frequency ground motion by comparing the underlying assumptions and resulting predictions of impact and rough fault frictional models. Relative to rough fault frictional models, impact models are characterized by deformation timescales and corner frequencies that are set by elasticity rather than viscoplasticity, relatively angular rather than smoothly varying fault roughness geometries, high‐frequency radiation patterns that are more isotropic, and higher P/S radiated energies. We outline ways to discriminate whether impact or rough fault frictional models are more likely to explain observations of high‐frequency ground motions.
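    The spectral contrast at the heart of this comparison can be sketched with toy source time functions (durations, rates, and amplitudes are all illustrative assumptions): a smooth slip pulse, whose corner frequency is set by its duration, versus a train of brief impacts, which carries far more high-frequency energy.

    ```python
    import numpy as np

    fs = 200.0                                   # sample rate [Hz], assumed
    t = np.arange(0.0, 20.0, 1.0 / fs)
    rng = np.random.default_rng(5)

    # Smooth frictional slip-rate pulse: ~1 s duration.
    slip = np.exp(-0.5 * ((t - 10.0) / 1.0) ** 2)

    # Impact sequence: many near-instantaneous impulses whose timescale is
    # set by elasticity, pushing energy to high frequency.
    impacts = np.zeros_like(t)
    impacts[rng.choice(t.size, 200, replace=False)] = rng.standard_normal(200)

    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    hf = freqs > 10.0
    for name, sig in [("slip pulse", slip), ("impact train", impacts)]:
        spec2 = np.abs(np.fft.rfft(sig)) ** 2
        print(f"{name}: fraction of energy above 10 Hz = {spec2[hf].sum() / spec2.sum():.3f}")
    ```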

     