


Title: Sub-Nyquist Sampling with Optical Pulses for Photonic Blind Source Separation

We proposed and demonstrated an optical pulse sampling method for photonic blind source separation. The method can separate mixed signals of large bandwidth using a low sampling frequency, reducing the digital signal processing workload.
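The core idea of sub-Nyquist sampling is that a tone above half the sampling rate folds (aliases) to a predictable lower frequency, so a slow sampler can still capture its structure. A minimal numerical sketch (a generic aliasing toy, not the paper's optical system; the frequencies are illustrative assumptions):

```python
import numpy as np

# A 9 Hz tone sampled at 10 Hz (below the 18 Hz Nyquist requirement)
# aliases down to |9 - 10| = 1 Hz: the sub-Nyquist samples are
# indistinguishable from those of a 1 Hz tone.
fs = 10.0            # sampling rate, Hz (sub-Nyquist for a 9 Hz tone)
f_signal = 9.0       # tone frequency, Hz
t = np.arange(0, 4.0, 1.0 / fs)          # 4 s of samples
x = np.cos(2 * np.pi * f_signal * t)

alias = np.cos(2 * np.pi * 1.0 * t)      # the folded 1 Hz tone
max_err = np.max(np.abs(x - alias))      # agreement is exact up to rounding
```

Since cos(2π·9·k/10) = cos(2πk − 2πk/10) = cos(2π·1·k/10), the two sample sequences coincide exactly.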

 
Award ID(s):
2128616
PAR ID:
10437284
Author(s) / Creator(s):
Date Published:
Journal Name:
Frontiers in Optics + Laser Science
Page Range / eLocation ID:
FTu6C.5
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Abstract

    Objective-driven adaptive sampling is a widely used tool for the optimization of deterministic black-box functions. However, the optimization of stochastic simulation models as found in the engineering, biological, and social sciences is still an elusive task. In this work, we propose a scalable adaptive batch sampling scheme for the optimization of stochastic simulation models with input-dependent noise. The developed algorithm has two primary advantages: (i) by recommending sampling batches, the designer can benefit from parallel computing capabilities, and (ii) by replicating previously observed sampling locations, the method can be scaled to higher-dimensional and noisier functions. Replication improves numerical tractability because the computational cost of Bayesian optimization methods is known to grow cubically with the number of unique sampling locations. Deciding when to replicate and when to explore depends on which alternative most improves the posterior prediction accuracy at and around the spatial locations expected to contain the global optimum. The algorithm explores a new sampling location to reduce the interpolation uncertainty and replicates to improve the accuracy of the mean prediction at a single sampling location. Through the application of the proposed sampling scheme to two numerical test functions and one real engineering problem, we show that we can reliably and efficiently find the global optimum of stochastic simulation models with input-dependent noise.
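The value of replication described above can be sketched numerically (a toy stochastic simulator, not the paper's algorithm; the simulator, noise model, and replicate count are illustrative assumptions): averaging r replicates at one input shrinks the standard error of the mean response by a factor of sqrt(r), while the number of unique locations, which drives the cubic cost of GP-based Bayesian optimization, stays at one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic simulator with input-dependent noise: the noise standard
# deviation grows with the input x.
def simulator(x, rng):
    noise_sd = 0.5 + x
    return np.sin(x) + rng.normal(0.0, noise_sd)

# Replicate 400 times at a single location x0 instead of adding 400 new
# unique locations to the GP: the mean response estimate tightens, but the
# O(n^3) GP cost still sees only one unique point.
x0 = 1.0
reps = np.array([simulator(x0, rng) for _ in range(400)])
mean_est = reps.mean()
se = reps.std(ddof=1) / np.sqrt(len(reps))   # ~ (0.5 + x0) / sqrt(400)
```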

     
  2. We present a new semiparametric extension of the Fay-Herriot model, termed the agnostic Fay-Herriot model (AGFH), in which the sampling-level model is expressed in terms of an unknown general function. Because the choice of this function is extremely broad, the AGFH model can express any distribution in the sampling model. We propose a Bayesian modelling scheme for AGFH in which the unknown function is assigned a Gaussian Process prior. Using a Metropolis-within-Gibbs Markov Chain Monte Carlo scheme, we study the performance of the AGFH model, along with that of a hierarchical Bayesian extension of the Fay-Herriot model. Our analysis shows that the AGFH is an excellent modelling alternative when the sampling distribution is non-Normal, especially when the sampling distribution is bounded. It is also the best choice when the sampling variance is high. However, the hierarchical Bayesian framework and the traditional empirical Bayesian framework can be good modelling alternatives when the signal-to-noise ratio is high and there are computational constraints.

    AMS subject classification: 62D05; 62F15
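A minimal sketch of the Metropolis-within-Gibbs idea on a toy target (not the AGFH posterior; the target density, proposal scale, and variable names are illustrative assumptions): one coordinate has a tractable Gaussian conditional and gets an exact Gibbs draw, while the other is updated by random-walk Metropolis, mimicking the mixed scheme used when one conditional is not conjugate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: p(x, y) proportional to exp(-(x^2 + x*y + y^2)).
# x | y is Gaussian, N(-y/2, 1/2), so it admits an exact Gibbs draw;
# y has no convenient conditional here, so it gets a Metropolis update.
def log_target(x, y):
    return -(x * x + x * y + y * y)

n_iter, x, y = 20000, 0.0, 0.0
xs, ys = np.empty(n_iter), np.empty(n_iter)
for i in range(n_iter):
    # Gibbs step: exact draw from x | y
    x = rng.normal(-0.5 * y, np.sqrt(0.5))
    # Metropolis step: random-walk proposal for y, accept/reject
    y_prop = y + rng.normal(0.0, 1.0)
    if np.log(rng.uniform()) < log_target(x, y_prop) - log_target(x, y):
        y = y_prop
    xs[i], ys[i] = x, y
```

After burn-in, the draws should match the target's known moments (marginal variance 2/3, correlation −1/2).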

     
  3. We argue that one can associate a pseudo-time with sequences of configurations generated in the course of classical Monte Carlo simulations for a single-minimum bound state if the sampling is optimal. The sampling rate can then, under special circumstances, be calibrated against the relaxation rate and frequency of motion of an actual physical system. The latter possibility is linked to the optimal sampling regime being a universal crossover separating two distinct suboptimal sampling regimes analogous to the physical phenomena of diffusion and effusion, respectively. Bound states break symmetry; one may thus regard the pseudo-time as a quantity emerging together with the bound state. Conversely, when transport among distinct bound states takes place—thus restoring symmetry—a pseudo-time can no longer be defined. One can still quantify activation barriers if those barriers are smooth, but the simulation becomes impractically slow and pertains to overdamped transport only. Specially designed Monte Carlo moves that bypass activation barriers—so as to accelerate sampling of the thermodynamics—amount to effusive transport and lead to severe under-sampling of the transition-state configurations that separate distinct bound states, while destroying said universality. Implications of the present findings for simulations of glassy liquids are discussed.
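The diffusion/effusion analogy can be illustrated with a minimal Metropolis sampler in a single-minimum well (a generic toy, not the paper's analysis; the potential, temperature, and step sizes are illustrative assumptions): tiny proposal steps are almost always accepted but explore diffusively slowly, while very large steps are mostly rejected, an effusion-like regime; the optimal sampling regime lies at the crossover between the two.

```python
import numpy as np

rng = np.random.default_rng(2)

# Metropolis sampling of a particle in the well U(x) = x^2 / 2 at kT = 1.
def acceptance_rate(step, n=20000):
    x, accepted = 0.0, 0
    for _ in range(n):
        x_new = x + rng.normal(0.0, step)
        # log acceptance ratio: -(U(x_new) - U(x)) at kT = 1
        if np.log(rng.uniform()) < 0.5 * (x * x - x_new * x_new):
            x, accepted = x_new, accepted + 1
    return accepted / n

acc_small = acceptance_rate(0.1)   # diffusive limit: near-unit acceptance
acc_large = acceptance_rate(10.0)  # effusive limit: mostly rejected
```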

     
  4. Abstract. High-quality long-term observational records are essential to ensure appropriate and reliable trend detection of tropospheric ozone. However, the necessity of maintaining a high sampling frequency, in addition to continuity, is often under-appreciated. A common assumption is that, so long as long-term records (e.g., a span of a few decades) are available, (1) the estimated trends are accurate and precise, and (2) the impact of small-scale variability (e.g., weather) can be eliminated. In this study, we show that undercoverage bias (i.e., a type of sampling error resulting from statistical inference based on sparse or insufficient samples, such as a once-per-week sampling frequency) can persistently reduce the trend accuracy of free tropospheric ozone, even if multi-decadal time series are considered. We use over 40 years of nighttime ozone observations measured at Mauna Loa, Hawaii (representative of the lower free troposphere) to make this demonstration and quantify the bias in monthly means and trends under different sampling strategies. We also show that short-term meteorological variability remains a source of inflated long-term trend uncertainty. To improve trend precision and accuracy in the presence of sampling bias, two remedies are proposed: (1) attributing data variability to colocated meteorological influence can efficiently reduce estimation uncertainty and moderately reduce the impact of sparse sampling, and (2) an adaptive sampling strategy based on anomaly detection enables us to greatly reduce the sampling bias and produce more accurate trends using fewer samples than an intensive regular sampling strategy.
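The effect of sampling frequency on trend precision can be sketched with synthetic data (an illustrative toy, not the Mauna Loa record; the trend, noise level, and record length are assumptions): fitting a linear trend to a once-per-week subsample of a daily series inflates the slope's standard error by roughly sqrt(7) relative to daily sampling.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily "ozone" series: a true linear trend plus weather-like
# noise (independent here for simplicity; real weather is autocorrelated).
days = np.arange(365 * 30)          # 30 years of daily samples
true_slope = 1e-4                   # trend per day
series = true_slope * days + rng.normal(0.0, 5.0, days.size)

def fit_slope_se(t, y):
    """OLS slope and its standard error for y = a + b*t + noise."""
    t_c = t - t.mean()
    b = (t_c * y).sum() / (t_c ** 2).sum()
    resid = y - y.mean() - b * t_c
    s2 = (resid ** 2).sum() / (t.size - 2)
    return b, np.sqrt(s2 / (t_c ** 2).sum())

slope_daily, se_daily = fit_slope_se(days, series)
slope_weekly, se_weekly = fit_slope_se(days[::7], series[::7])  # 1/week
```

Both estimates are unbiased for this idealized noise, but the weekly fit's uncertainty is markedly larger, which is one face of the sparse-sampling problem the abstract quantifies.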

     
  5. Grueber, Catherine E (Ed.)
    Abstract

    Landscape genomics can harness environmental and genetic data to inform conservation decisions by providing essential insights into how landscapes shape biodiversity. The massive increase in genetic data afforded by the genomic era provides exceptional resolution for answering critical conservation genetics questions. The accessibility of genomic data for non‐model systems has also enabled a shift away from population‐based sampling to individual‐based sampling, which now provides accurate and robust estimates of genetic variation that can be used to examine the spatial structure of genomic diversity, population connectivity and the nature of environmental adaptation. Nevertheless, the adoption of individual‐based sampling in conservation genetics has been slowed due, in large part, to concerns over how to apply methods developed for population‐based sampling to individual‐based sampling schemes. Here, we discuss the benefits of individual‐based sampling for conservation and describe how landscape genomic methods, paired with individual‐based sampling, can answer fundamental conservation questions. We have curated key landscape genomic methods into a user‐friendly, open‐source workflow, which we provide as a new R package, A Landscape Genomics Analysis Toolkit in R (algatr). The algatr package includes novel added functionality for all of the included methods and extensive vignettes designed with the primary goal of making landscape genomic approaches more accessible and explicitly applicable to conservation biology.

     