
Award ID contains: 1751216

  1. The shear-wave velocity time-averaged over the upper 30 m (VS30) is widely used as a proxy for site effects, forms the basis of seismic site class, and underpins site-amplification factors in empirical ground-motion models. Many earthquake simulations therefore require VS30. This presents a challenge at regional scale, given the infeasibility of subsurface testing over vast areas. Although various models for predicting VS30 have thus been proposed, the most popular U.S. national, or “background,” model is a regression equation based on just one variable. Given the growth of community data sets, satellite remote sensing, and algorithmic learning, more advanced and accurate solutions may be possible. Toward that end, we develop national VS30 models and maps using field data from over 7000 sites and machine learning (ML), wherein up to 17 geospatial parameters are used to predict subsurface conditions (i.e., VS30). Of the two models developed, the one using geologic data performs marginally better, yet such data are not always available. Both models significantly outperform existing solutions in unbiased testing and are used to create new VS30 maps at ∼220 m resolution. These maps are updated in the vicinity of field measurements using regression kriging and cover the 50 U.S. states and Puerto Rico. Ultimately, as with any model, performance cannot be known where data are sparse. In this regard, alternative maps that use other models are proposed for steep slopes. More broadly, this study demonstrates the utility of ML for inferring below-ground conditions from geospatial data, a technique that could be applied to other data and objectives.
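The quantity at the heart of this abstract has a standard definition: VS30 is 30 m divided by the shear-wave travel time through the upper 30 m of the profile. A minimal sketch of that computation (the helper below is illustrative, not code from the study):

```python
# Minimal sketch of the standard VS30 definition (travel-time average over the
# top 30 m). Illustrative only -- not code from the study.
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the upper 30 m.

    thicknesses_m  -- layer thicknesses from the surface down, in meters
    velocities_mps -- shear-wave velocity of each layer, in m/s
    """
    depth = 0.0
    travel_time = 0.0
    for h, vs in zip(thicknesses_m, velocities_mps):
        h = min(h, 30.0 - depth)       # truncate the profile at 30 m
        if h <= 0.0:
            break
        travel_time += h / vs          # travel time through this layer
        depth += h
    if depth < 30.0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time

# Example: 10 m at 180 m/s over 25 m at 400 m/s (only 20 m of it counts)
print(vs30([10.0, 25.0], [180.0, 400.0]))  # ~284 m/s
```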
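The regression-kriging step the abstract mentions is likewise a standard construction: a trend model predicts VS30 from geospatial parameters, and kriged residuals correct the map near field measurements. A sketch under assumed inputs, using scikit-learn and pykrige; the random-forest trend and all variable names are assumptions for illustration, not the study's actual model:

```python
# Sketch of regression kriging: an ML trend model maps geospatial parameters
# to VS30, then kriged residuals at the field sites locally correct
# predictions near measurements. Assumed structure, not the study's code.
from sklearn.ensemble import RandomForestRegressor
from pykrige.ok import OrdinaryKriging

def regression_krige(X_sites, lon, lat, vs30_obs, X_map, lon_map, lat_map):
    # 1. Fit the trend: geospatial parameters -> VS30
    trend = RandomForestRegressor(n_estimators=300).fit(X_sites, vs30_obs)

    # 2. Krige the trend's residuals at the measurement locations
    residuals = vs30_obs - trend.predict(X_sites)
    ok = OrdinaryKriging(lon, lat, residuals, variogram_model="exponential")

    # 3. Map prediction = trend + spatially interpolated residual correction
    correction, _var = ok.execute("points", lon_map, lat_map)
    return trend.predict(X_map) + correction
```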
  2. While soil liquefaction is common in earthquakes, the case-history data required to train and test state-of-practice prediction models remain comparatively scarce, owing to the breadth and expense of the data that comprise a single case history. The 2001 Nisqually, Washington, earthquake, for example, occurred in a metropolitan region and induced damaging liquefaction in the urban cores of Seattle and Olympia, yet case-history data have not previously been published. Accordingly, this article compiles 24 cone-penetration-test (CPT) case histories from free-field locations. The many methods used to obtain and process the data are detailed herein, as is the structure of the digital data set. The case histories are then analyzed by 18 existing liquefaction response models to determine whether any performs best and to compare model performance in Nisqually against global observations. While differences are measured, both between models and against prior global case histories, these differences are often statistically insignificant given finite-sample uncertainty. This points to the general inappropriateness of championing models based on individual earthquakes or otherwise small data sets, and to the ongoing need for additional case-history data and more rigorous adherence to best practices in model training and testing.
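The finite-sample point can be made concrete: with only 24 case histories, the sampling uncertainty of any scoring metric is large. A hedged sketch using a bootstrap confidence interval on the difference in Brier score between two models; the metric choice and the probability/outcome arrays are assumptions for illustration, not the article's analysis:

```python
# Illustrative sketch: bootstrap the difference in a scoring metric between
# two liquefaction models over a small set of case histories. The Brier score
# and input arrays are assumptions, not the article's actual analysis.
import numpy as np

rng = np.random.default_rng(0)

def brier(p, y):
    """Mean squared error between predicted probabilities and 0/1 outcomes."""
    return np.mean((p - y) ** 2)

def bootstrap_score_diff(p_a, p_b, observed, n_boot=10_000):
    """95% CI on brier(model A) - brier(model B) under case resampling."""
    n = len(observed)
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)  # resample the case histories
        diffs[i] = brier(p_a[idx], observed[idx]) - brier(p_b[idx], observed[idx])
    return np.percentile(diffs, [2.5, 97.5])

# An interval spanning zero means the data set cannot distinguish the models.
```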
  3. In regions of infrequent moderate-to-large earthquakes, historic earthquake catalogs are often insufficient to provide inputs to seismic-hazard analyses (i.e., fault locations and magnitude–frequency relations) or to inform ground-motion predictions for certain seismic sources. In these regions, analysis of relic coseismic evidence, such as paleoliquefaction, is commonly used to infer information about the seismic hazard. However, while paleoliquefaction studies have been performed widely, all existing analysis techniques require a priori assumptions about the causative earthquake’s location (i.e., rupture magnitude and ground motions cannot otherwise be estimated). This may lead to inaccurate assumptions in some settings and, by corollary, erroneous results. Accordingly, this article proposes an inversion framework to probabilistically constrain seismic-source parameters from paleoliquefaction. Analyzing evidence at regional scale leads to (a) a geospatial likelihood surface that constrains the rupture location and (b) a probability distribution of the rupture magnitude, wherein source-location uncertainty is explicitly considered. Simulated paleoliquefaction studies are performed on earthquakes with known parameters. These examples demonstrate the framework’s potential, even in cases of limited field evidence, as well as important caveats and lessons for forward use. The proposed framework has the potential to provide new insights in enigmatic seismic zones worldwide.
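Structurally, such an inversion can be organized as a grid search over candidate source parameters, with the field evidence entering through a Bernoulli likelihood at each site and the location uncertainty handled by marginalization. The sketch below shows that skeleton only; the ground-motion and triggering functions are hypothetical placeholders, not the models of the paper:

```python
# Conceptual skeleton of a paleoliquefaction source inversion: score every
# candidate (epicenter, magnitude) by how well it explains which sites did
# and did not liquefy, then marginalize. The pga() and p_liquefaction()
# functions are hypothetical placeholders, not the paper's models.
import numpy as np

def pga(magnitude, dist_km):
    # Placeholder ground-motion model (assumption, illustrative only)
    return np.exp(0.8 * magnitude - 1.3 * np.log(dist_km + 10.0) - 2.0)

def p_liquefaction(pga_g):
    # Placeholder triggering fragility curve (assumption, illustrative only)
    return 1.0 / (1.0 + np.exp(-(np.log(pga_g) + 1.5) / 0.5))

def posterior(grid_xy, mags, sites_xy, liquefied):
    """grid_xy: (G,2) candidate epicenters; mags: (M,) candidate magnitudes;
    sites_xy: (S,2) evidence sites; liquefied: (S,) booleans."""
    logL = np.zeros((len(grid_xy), len(mags)))
    for g, xy in enumerate(grid_xy):
        d = np.linalg.norm(sites_xy - xy, axis=1)   # epicentral distances, km
        for m, mag in enumerate(mags):
            p = p_liquefaction(pga(mag, d))
            # Bernoulli likelihood over liquefied / non-liquefied sites
            logL[g, m] = np.sum(np.where(liquefied, np.log(p), np.log(1 - p)))
    post = np.exp(logL - logL.max())
    post /= post.sum()
    # Marginals: geospatial location surface and magnitude distribution
    return post.sum(axis=1), post.sum(axis=0)
```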