Title: Multiscale Estimation of Event Arrival Times and Their Uncertainties in Hydroacoustic Records from Autonomous Oceanic Floats
ABSTRACT We describe an algorithm to pick event onsets in noisy records, characterize their error distributions, and derive confidence intervals on their timing. Our method is based on an Akaike information criterion that identifies the partition of a time series into a noise and a signal segment that maximizes the signal-to-noise ratio. The distinctive feature of our approach lies in the timing uncertainty analysis, and in its application in the time domain and in the wavelet timescale domain. Our novel data are records collected by freely floating Mobile Earthquake Recording in Marine Areas by Independent Divers (MERMAID) instruments, midcolumn hydrophones that report triggered segments of ocean-acoustic time series.
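The noise/signal partition in the abstract can be illustrated with the classic variance-based AIC picker: evaluate an AIC at every candidate split of the record into a leading noise segment and a trailing signal segment, and take the minimum as the onset. The sketch below is a minimal time-domain illustration under that formulation; it is not the authors' exact implementation, which also operates in the wavelet timescale domain and adds an uncertainty analysis.

```python
import numpy as np

def aic_pick(x):
    """Pick the sample that best splits x into a noise segment and a
    signal segment, using a variance-based AIC. Returns the index of
    the AIC minimum, taken as the estimated onset."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    # interior split points only, so both segments are non-empty
    for k in range(2, n - 2):
        var1 = np.var(x[:k])     # variance of the candidate noise segment
        var2 = np.var(x[k:])     # variance of the candidate signal segment
        if var1 > 0 and var2 > 0:
            aic[k] = k * np.log(var1) + (n - k) * np.log(var2)
    return int(np.argmin(aic))

def snr(x, k):
    """Signal-to-noise ratio implied by a split at index k."""
    return np.var(x[k:]) / np.var(x[:k])

# Synthetic record: unit-variance noise followed by a stronger arrival at sample 500
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 5, 500)])
onset = aic_pick(x)
```

For a variance contrast this strong, the AIC minimum lands close to the true change point; `snr(x, onset)` then gives the signal-to-noise ratio of the chosen partition.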
Award ID(s):
1917058
PAR ID:
10148312
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Bulletin of the Seismological Society of America
ISSN:
0037-1106
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    This paper addresses how to recommend optimal signal timing plans in real time during incidents, by incorporating domain knowledge encoded in signal timing plans tuned for possible incidents and by learning from historical data on both traffic and implemented signal timing. The effectiveness of traffic incident management is often limited by late response times and the excessive workload of traffic operators. This paper proposes a novel decision-making framework that learns from both data and domain knowledge to recommend, in real time, contingency signal plans that accommodate non-recurrent traffic, using the outputs of real-time traffic prediction at least 30 min in advance. Specifically, given the rarity of contingency signal plan engagements for incidents, the end-to-end recommendation task is decomposed into two hierarchical models: real-time traffic prediction and plan association. The connections between the two models are learned through metric learning, which reinforces partial-order preferences observed in historical signal engagement records. The effectiveness of this approach is demonstrated by testing the framework on the traffic network in Cranberry Township, Pennsylvania, U.S., in 2019. Results show that the recommendation system achieves a precision of 96.75% and a recall of 87.5% on the testing plan, and makes recommendations an average of 22.5 min ahead of Waze alerts. These results suggest that the framework can give traffic operators a significant time window to assess conditions and respond appropriately.
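The plan-association step above connects the two models through metric learning over partial-order preferences. One common way to realize such a preference-reinforcing objective is a triplet margin loss; the sketch below is an illustrative assumption about the formulation, not the paper's exact objective.

```python
import numpy as np

def triplet_loss(anchor, pos, neg, margin=1.0):
    """Metric-learning objective: pull the embedding of a predicted
    traffic state (anchor) toward the plan engaged historically (pos)
    and away from a plan that was not engaged (neg), by at least margin."""
    d_pos = np.linalg.norm(anchor - pos)
    d_neg = np.linalg.norm(anchor - neg)
    return max(0.0, d_pos - d_neg + margin)

# A satisfied preference (positive plan much closer) incurs zero loss
a, p, n = np.array([0.0, 0.0]), np.array([0.0, 0.0]), np.array([3.0, 0.0])
satisfied = triplet_loss(a, p, n)

# A violated preference (negative plan closer) incurs a positive loss
a2, p2, n2 = np.array([0.0, 0.0]), np.array([2.0, 0.0]), np.array([0.5, 0.0])
violated = triplet_loss(a2, p2, n2)
```

Minimizing this loss over historical engagement records orders the embedding space so that nearest-plan lookup reproduces the observed partial-order preferences.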
  2. Abstract Paleoclimate records can be considered low‐dimensional projections of the climate system that generated them. Understanding what these projections tell us about past climates, and changes in their dynamics, is a main goal of time series analysis on such records. Laplacian eigenmaps of recurrence matrices (LERM) is a novel technique using univariate paleoclimate time series data to indicate when notable shifts in dynamics have occurred. LERM leverages time delay embedding to construct a manifold that is mappable to the attractor of the climate system; this manifold can then be analyzed for significant dynamical transitions. Through numerical experiments with observed and synthetic data, LERM is applied to detect both gradual and abrupt regime transitions. Our paragon for gradual transitions is the Mid‐Pleistocene Transition (MPT). We show that LERM can robustly detect gradual MPT‐like transitions for sufficiently high signal‐to‐noise (S/N) ratios, though with a time lag related to the embedding process. Our paragon of abrupt transitions is the “8.2 ka” event; we find that LERM is generally robust at detecting 8.2 ka‐like transitions for sufficiently high S/N ratios, though edge effects become more influential. We conclude that LERM can usefully detect dynamical transitions in paleogeoscientific time series, with the caveat that false positive rates are high when dynamical transitions are not present, suggesting the importance of using multiple records to confirm the robustness of transitions. We share an open‐source Python package to facilitate the use of LERM in paleoclimatology and paleoceanography. 
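The LERM pipeline described above chains three standard steps: time-delay embedding of the univariate record, a recurrence matrix over the embedded states, and Laplacian eigenmaps of that matrix. A minimal dense-matrix sketch of those steps follows; the parameter choices (`dim`, `tau`, `eps`) and the plain implementation are illustrative assumptions, not the package's exact code.

```python
import numpy as np

def delay_embed(x, dim=3, tau=2):
    """Time-delay embedding of a univariate series into dim-dimensional
    state vectors separated by lag tau (Takens-style reconstruction)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def recurrence_matrix(X, eps):
    """Binary recurrence matrix: 1 where two embedded states lie within eps."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return (d < eps).astype(float)

def laplacian_eigenmap(R, k=2):
    """First k nontrivial eigenvectors of the graph Laplacian of R;
    shifts in these coordinates flag candidate dynamical transitions."""
    deg = R.sum(axis=1)
    L = np.diag(deg) - R
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1 : k + 1]   # skip the trivial constant eigenvector

X = delay_embed(np.arange(10.0), dim=3, tau=2)
R = recurrence_matrix(X, eps=4.0)
E = laplacian_eigenmap(R, k=2)
```

Tracking the eigenmap coordinates through time, rather than the raw series, is what lets the method flag regime shifts; the embedding step also explains the detection lag noted for MPT-like transitions.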
  3. Single-molecule and related experiments yield time series of an observable as it fluctuates due to thermal motion. In such data, it can be difficult to distinguish fluctuating signal from fluctuating noise. We present a method of separating signal from noise using nonlinear-correlation functions. The method is fully nonparametric: No a priori model for the system is required, no knowledge of whether the system is continuous or discrete is needed, the number of states is not fixed, and the system can be Markovian or not. The noise-corrected, nonlinear-correlation functions can be converted to the system’s Green’s function; the noise-corrected moments yield the system’s equilibrium-probability distribution. As a demonstration, we analyze synthetic data from a three-state system. The correlation method is compared to another fully nonparametric approach—time binning to remove noise, and histogramming to obtain the distribution. The correlation method has substantially better resolution in time and in state space. We develop formulas for the limits on data quality needed for signal recovery from time series and test them on datasets of varying size and signal-to-noise ratio. The formulas show that the signal-to-noise ratio needs to be on the order of or greater than one-half before convergence scales at a practical rate. With experimental benchmark data, the positions and populations of the states and their exchange rates are recovered with an accuracy similar to parametric methods. The methods demonstrated here are essential components in building a complete analysis of time series using only high-order correlation functions. 
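The key property that lets correlation functions separate fluctuating signal from fluctuating noise is that white measurement noise is uncorrelated between distinct times, so it contributes only at zero lag. The two-point estimator below illustrates that idea; it is a simple sketch, not the paper's full nonlinear-correlation machinery.

```python
import numpy as np

def lag_correlation(x, max_lag):
    """Estimate C(t) = <x(0) x(t)> - <x>^2 at lags t = 1..max_lag.
    For white measurement noise, lags t >= 1 are noise-free, because
    independent noise averages to zero except at t = 0."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.array([np.mean((x[:-t] - mu) * (x[t:] - mu))
                     for t in range(1, max_lag + 1)])

# Deterministic two-state signal alternating between +1 and -1 each step:
# correlations at odd lags are -1, at even lags +1
x = np.tile([1.0, -1.0], 500)
c = lag_correlation(x, 3)
```

Higher-order (nonlinear) analogues of this estimator, built from products at three or more times, are what allow recovery of the Green's function and equilibrium distribution without a parametric model.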
  4. Abstract Hundreds of millions of supermassive black hole binaries are expected to contribute to the gravitational-wave signal in the nanohertz frequency band. Their signal is often approximated either as an isotropic Gaussian stochastic background with a power-law spectrum or as an individual source corresponding to the brightest binary. In reality, the signal is best described as a combination of a stochastic background and a few of the brightest binaries modeled individually. We present a method that uses this approach to efficiently create realistic pulsar timing array data sets using synthetic catalogs of binaries based on the Illustris cosmological hydrodynamic simulation. We explore three different properties of such realistic backgrounds that could help distinguish them from those formed in the early universe: (i) their characteristic strain spectrum, (ii) their statistical isotropy, and (iii) the variance of their spatial correlations. We also investigate how the presence of confusion noise from a stochastic background affects detection prospects of individual binaries. We calculate signal-to-noise ratios of the brightest binaries in different realizations for a simulated pulsar timing array based on the NANOGrav 12.5 yr data set extended to a time span of 15 yr. We find that ∼6% of the realizations produce systems with signal-to-noise ratios larger than 5, suggesting that individual systems might soon be detected (the fraction increases to ∼41% at 20 yr). These can be taken as a pessimistic prediction for the upcoming NANOGrav 15 yr data set, since it does not include the effect of potentially improved timing solutions and newly added pulsars. 
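The characteristic strain spectrum in point (i) is conventionally modeled as a power law with spectral index -2/3, the value expected for a background of circular, gravitational-wave-driven binaries. The sketch below encodes that convention; the amplitude value is illustrative, not a measurement from the paper.

```python
import numpy as np

F_YR = 1.0 / (365.25 * 24 * 3600)   # reference frequency of 1/yr, in Hz

def characteristic_strain(f, a_yr=2e-15, alpha=-2.0 / 3.0):
    """Power-law characteristic strain h_c(f) = A (f / f_yr)^alpha.
    alpha = -2/3 is the standard index for a background of circular,
    GW-driven supermassive black hole binaries; a_yr is an assumed
    illustrative amplitude at f = 1/yr."""
    return a_yr * (f / F_YR) ** alpha

# Strain at the reference frequency equals the amplitude by construction
h_ref = characteristic_strain(F_YR)
# Halving the frequency raises the strain by a factor of 2^(2/3)
ratio = characteristic_strain(F_YR / 2) / characteristic_strain(F_YR)
```

Realistic backgrounds built from binary catalogs deviate from this smooth law, since a handful of bright binaries dominate individual frequency bins, which is precisely the anisotropy and variance the abstract exploits.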
    Abstract Decades of observations show that the world's oceans have been losing oxygen, with far‐reaching consequences for ecosystems and biogeochemical cycling. To reconstruct oxygenation beyond the limited scope of instrumental records, proxy records are needed, such as sedimentary δ15N. We combine two δ15N records from the Santa Barbara Basin (SBB), a 24‐year‐long, biweekly sediment trap time series, and a 114‐year, high‐resolution sediment core together spanning the years 1892–2017. These records allow for the examination of δ15N variability on seasonal to centennial timescales. Seasonal variability in SBB δ15N is consistent in timing with the poleward advection of a high δ15N signal from the Eastern Tropical North Pacific in the summer and fall. Strong El Niño events result in variable δ15N signatures, reflective of local rainfall, and neither the Pacific Decadal Oscillation nor North Pacific Gyre Oscillation impose strong controls on bulk sedimentary δ15N. Seasonal and interannual variability in sediment trap δ13Corg is consistent with local productivity as a driver; however, this signal is not retained in the sediment core. The time series from the sediment trap and core show that bulk sedimentary δ15N in SBB has now exceeded that measured for the past 2,000 years. We hypothesize that the change in δ15N reflects the increasing influence of denitrified waters from the Eastern Tropical North Pacific and ongoing deoxygenation of the Eastern Pacific. When juxtaposed with other regional δ15N records, our results further suggest that SBB is uniquely situated to record long‐term change in the Eastern Tropical North Pacific.