
Search for: All records

Creators/Authors contains: "Haase, Jennifer"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Free, publicly-accessible full text available June 1, 2024
  2. Abstract. Current climate models have difficulty representing realistic wave–mean flow interactions, partly because the contribution from waves with fine vertical scales is poorly known. There are few direct observations of these waves, and most models have difficulty resolving them. This observational challenge cannot be addressed by satellite or sparse ground-based methods. The Strateole-2 long-duration stratospheric superpressure balloons that float with the horizontal wind on constant-density surfaces provide a unique platform for wave observations across a broad range of spatial and temporal scales. For the first time, balloon-borne Global Navigation Satellite System (GNSS) radio occultation (RO) is used to provide high-vertical-resolution equatorial wave observations. By tracking navigation signal refractive delays from GPS satellites near the horizon, 40–50 temperature profiles were retrieved daily, from balloon flight altitude (∼20 km) down to 6–8 km altitude, forming an orthogonal pattern of observations over a broad area (±400–500 km) surrounding the flight track. The refractivity profiles show an excellent agreement of better than 0.2 % with co-located radiosonde, spaceborne COSMIC-2 RO, and reanalysis products. The 200–500 m vertical resolution and the spatial and temporal continuity of sampling make it possible to extract properties of Kelvin waves and gravity waves with vertical wavelengths as short as 2–3 km. The results illustrate the difference in the Kelvin wave period (20 vs. 16 d) in the Lagrangian versus ground-fixed reference frames and as much as a 20 % difference in amplitude compared to COSMIC-2, both of which impact estimates of momentum flux.
A small dataset from the additional Galileo, GLONASS, and BeiDou constellations demonstrates the feasibility of nearly doubling the sampling density in planned follow-on campaigns, in which data with full equatorial coverage will contribute to a better estimate of wave forcing on the quasi-biennial oscillation (QBO) and improved QBO representation in models.
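The 0.2 % refractivity agreement quoted in the abstract above refers to atmospheric refractivity as conventionally computed from pressure, temperature, and humidity via the standard Smith–Weintraub relation. The sketch below evaluates that relation with illustrative values; the input numbers are made up, not taken from the paper, and the empirical coefficients vary slightly between references.

```python
def refractivity(p_hpa, t_kelvin, e_hpa):
    """Smith-Weintraub refractivity (N-units) from total pressure (hPa),
    temperature (K), and water-vapor partial pressure (hPa).
    Coefficients are the commonly quoted values; references differ slightly."""
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin**2

# Illustrative dry upper-troposphere point: 300 hPa, 230 K, negligible moisture
n_dry = refractivity(300.0, 230.0, 0.0)

# Adding moisture always increases refractivity (the "wet" second term)
n_moist = refractivity(300.0, 230.0, 1.0)
```

At these altitudes the dry term dominates, which is why RO temperature retrievals are most robust in the cold, dry upper troposphere and stratosphere.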
  3. Rapidly growing societal needs in urban areas are increasing the demand for tall buildings with complex structural systems. Many of these buildings are located in areas characterized by high seismicity. Quantifying the seismic resilience of these buildings requires comprehensive fragility assessment that integrates iterative nonlinear dynamic analysis (NDA). Under these circumstances, traditional finite element (FE) analysis may become impractical due to its high computational cost. Soft-computing methods can be applied in the domain of NDA to reduce the computational cost of seismic fragility analysis. This study presents a framework that employs nonlinear autoregressive neural networks with exogenous input (NARX) in fragility analysis of multi-story buildings. The framework uses structural health monitoring data to calibrate a nonlinear FE model. The model is employed to generate the training dataset for NARX neural networks with ground acceleration and displacement time histories as the input and output of the network, respectively. The trained NARX networks are then used to perform incremental dynamic analysis (IDA) for a suite of ground motions. Fragility analysis is next conducted based on the results of the IDA obtained from the trained NARX network. The framework is illustrated on a twelve-story reinforced concrete building located at Oklahoma State University, Stillwater campus.
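The NARX idea in the abstract above is that the next output sample is predicted from lagged outputs and lagged exogenous inputs (here, displacement and ground acceleration). As a toy illustration, the sketch below uses a linear least-squares fit as a stand-in for the trained neural network, applied to a synthetic single-degree-of-freedom-style response; all signals, lags, and coefficients are invented for the example, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "ground acceleration" input and a toy stable linear response,
# standing in for the FE-generated training data described in the abstract.
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 1.6 * y[t - 1] - 0.64 * y[t - 2] + 0.05 * x[t - 1]

# NARX regressors: past outputs and past exogenous inputs (2 lags each)
lag = 2
A = np.array([[y[t - 1], y[t - 2], x[t - 1], x[t - 2]] for t in range(lag, n)])
b = y[lag:]

# Linear least-squares fit as a stand-in for the network's learned mapping
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# Closed-loop ("free-run") prediction: feed predictions back as past outputs,
# which is how a trained surrogate replaces the FE model during IDA
y_hat = np.zeros(n)
y_hat[:lag] = y[:lag]
for t in range(lag, n):
    y_hat[t] = np.dot(w, [y_hat[t - 1], y_hat[t - 2], x[t - 1], x[t - 2]])
```

Because the toy system is exactly linear in the chosen regressors, the surrogate reproduces the response; a real NARX network replaces the linear fit with a nonlinear map so it can track hysteretic, amplitude-dependent structural behavior.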
  4. In the Stratéole 2 program, set to launch in November 2018, instruments will ride balloons into the stratosphere and circle the world, observing properties of the air and winds in fine detail.
  5. Abstract

    We face a new era in the assessment of multiple natural hazards whose statistics are becoming alarmingly non‐stationary due to ubiquitous long‐term changes in climate. One particular case is tsunami hazard affected by climate‐change‐driven sea level rise (SLR). A traditional tsunami hazard assessment approach, where SLR is omitted or included as a constant sea‐level offset in a probabilistic calculation, may misrepresent the impacts of climate change. In this paper, a general method, called non‐stationary probabilistic tsunami hazard assessment (nPTHA), is developed to include the long‐term time‐varying changes in mean sea level. The nPTHA is based on a non‐stationary Poisson process model, which takes advantage of the independence of arrivals within non‐overlapping time intervals to specify a temporally varying hazard mean recurrence rate, affected by SLR. The nPTHA is applied to the South China Sea (SCS) for tsunamis generated by earthquakes in the Manila Subduction Zone. The method provides unique and comprehensive results for inundation hazard, combining tsunami and SLR at a specific location over a given exposure time. The results show that in the SCS, SLR has a significant impact when its amplitude is comparable to that of tsunamis with moderate probability of exceedance. The SLR and its associated uncertainty produce an impact on nPTHA results comparable to that caused by the uncertainty in the earthquake recurrence model. These findings are site‐specific and must be analyzed for different regions. The proposed methodology, however, is sufficiently general to include other non‐stationary phenomena and can be exploited for other hazards affected by SLR.

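At its core, the non-stationary Poisson model described above replaces the constant hazard rate of classical PTHA with a time-varying rate λ(t), so the probability of at least one exceedance in an exposure window [0, T] becomes P = 1 − exp(−∫₀ᵀ λ(t) dt). The sketch below evaluates that expression numerically; the rate values are made-up illustrative numbers, not results from the study.

```python
import math

def exceedance_probability(rate_fn, exposure_years, n_steps=1000):
    """Non-stationary Poisson exceedance probability over an exposure time:
    P = 1 - exp(-integral of rate_fn over [0, T]), via the trapezoidal rule."""
    dt = exposure_years / n_steps
    integral = 0.0
    for i in range(n_steps):
        t0, t1 = i * dt, (i + 1) * dt
        integral += 0.5 * (rate_fn(t0) + rate_fn(t1)) * dt
    return 1.0 - math.exp(-integral)

# Hypothetical rate that grows linearly in time as sea-level rise lowers the
# effective inundation threshold (illustrative numbers only)
base_rate = 0.01      # exceedances per year at present
growth = 0.0002       # added rate per year attributable to SLR

p50 = exceedance_probability(lambda t: base_rate + growth * t, 50.0)
p50_stationary = exceedance_probability(lambda t: base_rate, 50.0)
```

Comparing `p50` with `p50_stationary` shows the qualitative point of the abstract: treating the rate as constant understates the exceedance probability whenever SLR pushes the rate upward over the exposure time.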
  6. Abstract

    Models of bathymetry derived from satellite radar altimetry are essential for modeling many marine processes. They are affected by uncertainties which require quantification. We propose an uncertainty model that assumes errors are caused by the lack of high‐wavenumber content within the altimetry data. The model is then applied to a tsunami hazard assessment. We build a bathymetry uncertainty model for northern Chile. Statistical properties of the altimetry‐predicted bathymetry error are obtained using multibeam data. We find that a Von Karman correlation function and a Laplacian marginal distribution can be used to define an uncertainty model based on a random field. We also propose a method for generating synthetic bathymetry samples conditional to shipboard measurements. The method is further extended to account for interpolation uncertainties, when bathymetry data resolution is finer than 10 km. We illustrate the usefulness of the method by quantifying the bathymetry‐induced uncertainty of a tsunami hazard estimate. We demonstrate that tsunami leading wave predictions at middle/near field tide gauges and buoys are insensitive to bathymetry uncertainties in Chile. This result implies that tsunami early warning approaches can take full advantage of altimetry‐predicted bathymetry in numerical simulations. Finally, we evaluate the feasibility of modeling uncertainties in regions without multibeam data by assessing the bathymetry error statistics of 15 globally distributed regions. We find that a general Von Karman correlation and a Laplacian marginal distribution can serve as a first‐order approximation. The standard deviation of the uncertainty random field model varies regionally and is estimated from a proposed scaling law.

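A common way to draw samples from a random field with a prescribed correlation structure, such as the Von Karman model above, is spectral synthesis: shape white noise in the Fourier domain by the square root of the target power spectrum. The sketch below does this in 1-D with a Von Karman-form spectrum; for simplicity it produces a Gaussian marginal rather than the Laplacian marginal used in the paper, and all parameter values are hypothetical.

```python
import numpy as np

def von_karman_field(n, dx, corr_len, hurst, sigma, seed=0):
    """Synthesize a 1-D zero-mean Gaussian random field whose power spectrum
    follows a Von Karman form, P(k) ~ 1 / (1 + (k*a)^2)^(H + 0.5),
    via spectral synthesis with random phases."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=dx) * 2.0 * np.pi       # angular wavenumbers
    psd = 1.0 / (1.0 + (k * corr_len) ** 2) ** (hurst + 0.5)
    phase = np.exp(2j * np.pi * rng.random(k.size))  # random phases
    spec = np.sqrt(psd) * phase
    spec[0] = 0.0                                    # zero mean (no DC term)
    field = np.fft.irfft(spec, n=n)
    field *= sigma / field.std()                     # rescale to target std
    return field

# Hypothetical parameters: 20 km correlation length on a 1 km grid,
# 30 m standard deviation of bathymetry error (illustrative only)
sample = von_karman_field(n=1024, dx=1.0, corr_len=20.0, hurst=0.5, sigma=30.0)
```

Repeating this for many seeds gives an ensemble of plausible bathymetry perturbations, which is the ingredient needed to propagate bathymetry uncertainty through tsunami simulations as the abstract describes.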
  7. Abstract

    The 2018 Palu tsunami contributed significantly to the devastation caused by the associated magnitude 7.5 earthquake. This began a debate about how the moderate size earthquake triggered such a large tsunami within Palu Bay, with runups of more than 10 m. The possibility of a large component of vertical coseismic deformation and submarine landslides has been considered as a potential explanation. However, scarce instrumental data have made it difficult to resolve the potential contributions from either type of source. We use tsunami waveforms derived from social media videos in Palu Bay to model the possible sources of the tsunami. We invert InSAR data with different fault geometries and use the resulting seafloor displacements to simulate tsunamis. The coseismic sources alone cannot match both the video‐derived time histories and surveyed runups. Then we conduct a tsunami source inversion using the video‐derived time histories and a tide gauge record as inputs. We specify hypothetical landslide locations and solve for initial tsunami elevation. Our results, validated with surveyed runups, show that a limited number of landslides in southern Palu Bay are sufficient to explain the tsunami data. The Palu tsunami highlights the difficulty in accurately capturing with tide gauges the amplitude and timing of short-period waves that can have large impacts at the coast. The proximity of landslides to locations of high fault slip also suggests that tsunami hazard assessment in strike‐slip environments should include triggered landslides, especially for locations where the coastline morphology is strongly linked to fault geometry.

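The source inversion described above is, at heart, a linear least-squares problem: the observed waveform is modeled as a weighted sum of precomputed unit-source waveforms ("Green's functions"), and the weights are the initial tsunami elevations at the candidate landslide locations. The sketch below illustrates this with entirely synthetic Green's functions and observations; none of the numbers come from the Palu study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 3 candidate landslide sources, waveforms of 200 samples.
# Each column of G is the unit-amplitude waveform a candidate source would
# produce at the observation point (here: damped sinusoids as stand-ins).
n_samples, n_sources = 200, 3
t = np.linspace(0.0, 20.0, n_samples)
G = np.column_stack(
    [np.sin(2.0 * np.pi * t / (4.0 + i)) * np.exp(-0.1 * t)
     for i in range(n_sources)]
)

# Synthetic "video-derived" observation: sources 0 and 2 active, plus noise
m_true = np.array([1.5, 0.0, 0.8])
d = G @ m_true + 0.01 * rng.standard_normal(n_samples)

# Least-squares estimate of the initial elevation at each candidate source
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
```

A near-zero recovered weight, as for the second source here, is how an inversion of this kind indicates that a candidate location is not needed to explain the data, mirroring the paper's finding that a limited number of landslides suffices.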