

Search for: limits to the precision


  1. In scenarios ranging from system login to writing emails, documents, and forms, keyboard input carries sensitive data such as passwords, addresses, and IDs. Because natural typing commonly includes non-alphabetic keys, punctuation, and typos, users' input rarely consists of constrained, purely alphabetic keys or dictionary words. This work studies how to reveal such unconstrained keyboard input using audio interfaces. Unlike optical sensors such as cameras, audio interfaces are not designed to resolve compactly located keys. Our analysis shows that effectively distinguishing the keys can require localizing keystroke sounds with a precision approaching the microsecond range. This work (1) explores the limits of audio interfaces in distinguishing keystrokes, (2) proposes a μs-level customized signal-processing and analysis-based keystroke-tracking approach that accounts for the mechanical physics and imperfect measurement of keystroke sounds, (3) develops the first acoustic side-channel attack study on unconstrained keyboard inputs that are not purely alphabetic keys/words and do not necessarily follow known sequences in a given dictionary or training dataset, and (4) reveals the threat of non-line-of-sight keystroke sound tracking. Our results indicate that, without relying on vision sensors, attacks using limited-resolution audio interfaces can reveal unconstrained keyboard inputs with a fairly sharp and bendable "auditory eyesight."
    Free, publicly-accessible full text available August 9, 2024
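
    To make the timing requirement concrete, below is a minimal sketch of sub-sample time-difference-of-arrival (TDoA) estimation between two microphone channels via FFT cross-correlation with parabolic peak interpolation. The sample rate, synthetic signals, and refinement scheme are illustrative assumptions, not the paper's pipeline.

```python
# Minimal sketch: sub-sample TDoA estimation between two microphone channels
# via FFT cross-correlation with parabolic peak interpolation. Illustrative
# assumptions throughout (sample rate, synthetic click); not the paper's method.
import numpy as np

def tdoa(mic_a: np.ndarray, mic_b: np.ndarray, fs: float) -> float:
    """Delay of mic_b relative to mic_a, in seconds (positive = b lags a)."""
    n = 2 * max(len(mic_a), len(mic_b))   # zero-pad to avoid circular wrap-around
    corr = np.fft.irfft(np.fft.rfft(mic_a, n) * np.conj(np.fft.rfft(mic_b, n)), n)
    k = int(np.argmax(corr))
    # Parabolic interpolation around the peak gives sub-sample resolution,
    # i.e. well below the ~5.2 us sample period of a 192 kHz recording.
    y0, y1, y2 = corr[k - 1], corr[k], corr[(k + 1) % n]
    lag = k + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    if lag > n / 2:                       # map circular lag to signed lag
        lag -= n
    return -lag / fs

# Synthetic check: a click and a copy delayed by 3.5 samples (~18 us at 192 kHz).
fs = 192_000
t = np.arange(4096, dtype=float)
click = np.exp(-((t - 1000.0) / 20.0) ** 2)
delayed = np.interp(t - 3.5, t, click)
print(f"estimated delay: {tdoa(click, delayed, fs) * 1e6:+.1f} us")  # ~ +18.2 us
```
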
  2. The optical lever is a centuries-old and widely used detection technique employed in applications ranging from consumer products and industrial sensors to the precision force microscopes used in scientific research. Despite this long history, however, its quantum limits have yet to be explored. In general, any precision optical measurement is accompanied by an optical-force-induced disturbance to the measured object (termed back action), leading to a standard quantum limit (SQL). Here, we give a simple ray-optics description of how such back action can be evaded in optical lever detection. We perform a proof-of-principle experiment demonstrating the mechanism of back-action evasion in the classical regime, by developing a lens system that cancels the extra tilting of light reflected off a silicon nitride membrane mechanical resonator caused by laser-pointing-noise-induced optical torques. We achieve a readout noise floor two orders of magnitude below the SQL, corresponding to an effective optomechanical cooperativity of 100 without the need for an optical cavity. As state-of-the-art ultralow-dissipation optomechanical systems relevant to quantum sensing rapidly approach the level where quantum noise dominates, simple and widely applicable back-action-evading protocols will be crucial for pushing beyond quantum limits.
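
    For context, the SQL named here arises from the trade-off between measurement imprecision and back-action; in the standard textbook form (general, not specific to this experiment), the two noise spectral densities obey

    $$ S_x^{\mathrm{imp}}(\Omega)\, S_F^{\mathrm{BA}}(\Omega) \;\ge\; \frac{\hbar^2}{4}, $$

    so the total position noise $S_x^{\mathrm{imp}} + |\chi(\Omega)|^2 S_F^{\mathrm{BA}}$ is bounded below by $\hbar\,|\chi(\Omega)|$, where $\chi$ is the mechanical susceptibility. Evading back action removes the second term, allowing readout below this bound.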

     
  3. Abstract

    A generic search is presented for the associated production of a Z boson or a photon with an additional unspecified massive particle X, $\mathrm{pp} \rightarrow \mathrm{pp} + \mathrm{Z}/\upgamma + \mathrm{X}$, in proton-tagged events from proton–proton collisions at $\sqrt{s} = 13\,\mathrm{TeV}$, recorded in 2017 with the CMS detector and the CMS-TOTEM precision proton spectrometer. The missing mass spectrum is analysed in the 600–1600 GeV range and a fit is performed to search for possible deviations from the background expectation. No significant excess in data with respect to the background predictions has been observed. Model-independent upper limits on the visible production cross section of $\mathrm{pp} \rightarrow \mathrm{pp} + \mathrm{Z}/\upgamma + \mathrm{X}$ are set.

     
    Free, publicly-accessible full text available September 1, 2024
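
    For reference, the missing mass in proton-tagged events is built from the four-momenta of the incoming and scattered protons and the central Z/γ; schematically (a standard central-exclusive-production relation, stated as context rather than taken from the paper),

    $$ m_{\mathrm{miss}}^2 = \left(p_1 + p_2 - p_1' - p_2' - p_{\mathrm{Z}/\upgamma}\right)^2, $$

    with the scattered protons measured through their fractional momentum losses $\xi_{1,2}$ in the proton spectrometer, so that the full central system carries mass $\sqrt{\xi_1 \xi_2 s}$.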
  4. Abstract

    Analysis of phylogenetic trees has become an essential tool in epidemiology. Likelihood-based methods fit models to phylogenies to draw inferences about the phylodynamics and history of viral transmission. However, these methods are often computationally expensive, which limits the complexity and realism of phylodynamic models and makes them ill-suited for informing policy decisions in real time during rapidly developing outbreaks. Likelihood-free methods using deep learning are pushing the boundaries of inference beyond these constraints. In this paper, we extend, compare, and contrast a recently developed deep learning method for likelihood-free inference from trees. We trained multiple deep neural networks using phylogenies from simulated outbreaks that spread among five locations and found that they achieve close to the same levels of accuracy as Bayesian inference under the true simulation model. We compared the robustness to model misspecification of a trained neural network with that of a Bayesian method and found that both had comparable performance, converging on similar biases. We also implemented an uncertainty-quantification method called conformalized quantile regression, whose intervals we demonstrate have similar patterns of sensitivity to model misspecification as Bayesian highest posterior density (HPD) intervals and overlap greatly with HPDs, but have lower precision (are more conservative). Finally, we trained and tested a neural network against phylogeographic data from a recent study of the SARS-CoV-2 pandemic in Europe and obtained similar estimates of region-specific epidemiological parameters and of the location of the common ancestor in Europe. Along with being as accurate and robust as likelihood-based methods, our trained neural networks are on average over three orders of magnitude faster after training. Our results support the notion that neural networks can be trained with simulated data to accurately mimic the good and bad statistical properties of the likelihood functions of generative phylogenetic models.
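
    Since conformalized quantile regression (CQR; Romano et al. 2019) may be unfamiliar, here is a minimal sketch of the technique on toy data; the regressors and data are stand-ins, not the paper's phylodynamic setup.

```python
# Minimal sketch of conformalized quantile regression (CQR) on toy data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (2000, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.3, 2000)

X_tr, y_tr = X[:1000], y[:1000]            # fit the quantile regressors
X_cal, y_cal = X[1000:1500], y[1000:1500]  # calibrate the intervals
X_te = X[1500:]

alpha = 0.1  # target 90% coverage
lo = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2).fit(X_tr, y_tr)
hi = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2).fit(X_tr, y_tr)

# Conformity scores: how far calibration points fall outside the raw interval.
scores = np.maximum(lo.predict(X_cal) - y_cal, y_cal - hi.predict(X_cal))
n = len(scores)
q = np.quantile(scores, np.ceil((1 - alpha) * (n + 1)) / n)

# Calibrated intervals with finite-sample coverage >= 1 - alpha.
lower, upper = lo.predict(X_te) - q, hi.predict(X_te) + q
```
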

     
  5. Replicating human-like dexterity in robot hands represents one of the largest open problems in robotics. Reinforcement learning is a promising approach that has achieved impressive progress in the last few years; however, the class of problems it has typically addressed corresponds to a rather narrow definition of dexterity compared to human capabilities. To address this gap, we investigate piano playing, a skill that challenges even the human limits of dexterity, as a means to test high-dimensional control: it demands high spatial and temporal precision as well as complex finger coordination and planning. We introduce RoboPianist, a system that enables simulated anthropomorphic hands to learn an extensive repertoire of 150 piano pieces where traditional model-based optimization struggles. We additionally introduce an open-source environment, a benchmark of tasks, interpretable evaluation metrics, and open challenges for future study.
    Free, publicly-accessible full text available November 6, 2024
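
    For readers unfamiliar with the problem setup, the sketch below shows the shape of the agent-environment loop such a task implies, with a toy stand-in environment; it is not the RoboPianist API (see the project's open-source release for the real interface).

```python
# Generic agent-environment loop for a high-dimensional control task.
# `PianoEnv` is a toy stand-in defined here for illustration; it is NOT
# the RoboPianist interface.
import numpy as np

class PianoEnv:
    """Toy stand-in: observations would encode upcoming notes; actions are joint targets."""
    N_JOINTS = 44  # two anthropomorphic hands -> high-dimensional action space

    def reset(self) -> np.ndarray:
        self.t = 0
        return np.zeros(128)

    def step(self, action: np.ndarray):
        self.t += 1
        reward = -float(np.sum(action ** 2)) * 1e-3  # placeholder shaping term
        done = self.t >= 100
        return np.zeros(128), reward, done

env = PianoEnv()
obs, done, ret = env.reset(), False, 0.0
while not done:
    action = np.random.uniform(-1, 1, env.N_JOINTS)  # random policy placeholder
    obs, reward, done = env.step(action)
    ret += reward
print(f"episode return: {ret:.3f}")
```
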
  6. ABSTRACT

    FRB 20210912A is a fast radio burst (FRB), detected and localized to subarcsecond precision by the Australian Square Kilometre Array Pathfinder. No host galaxy has been identified for this burst despite the high precision of its localization and deep optical and infrared follow-up, to 5σ limits of R = 26.7 mag and Ks = 24.9 mag with the Very Large Telescope. The combination of precise radio localization and deep optical imaging has almost always resulted in the secure identification of a host galaxy, and this is the first case in which the line of sight is not obscured by the Galactic disc. The dispersion measure of this burst, DM_FRB = 1233.696 ± 0.006 pc cm⁻³, allows for a large source redshift of z > 1 according to the Macquart relation. It could thus be that the host galaxy is consistent with the known population of FRB hosts but is too distant to detect in our observations (z > 0.7 for a host like that of the first repeating FRB source, FRB 20121102A); that it is more nearby, with a significant excess in DM_host, and thus dimmer than any known FRB host; or, least likely, that the FRB is truly hostless. We consider each possibility, making use of the population of known FRB hosts to frame each scenario. The fact of the missing host has ramifications for the FRB field: even with high-precision localization and deep follow-up, some FRB hosts may be difficult to detect, with more distant hosts the least likely to be found. This has implications for FRB cosmology, in which high-redshift detections are valuable.
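
    For context, the Macquart relation invoked here rests on the standard decomposition of an FRB's dispersion measure (stated generically; the coefficients are not from this paper):

    $$ \mathrm{DM}_{\mathrm{FRB}} = \mathrm{DM}_{\mathrm{MW,disc}} + \mathrm{DM}_{\mathrm{MW,halo}} + \mathrm{DM}_{\mathrm{cosmic}}(z) + \frac{\mathrm{DM}_{\mathrm{host}}}{1+z}, $$

    where the cosmic term grows with redshift (roughly $\langle \mathrm{DM}_{\mathrm{cosmic}} \rangle \sim 1000\,z$ pc cm⁻³ at modest $z$), which is why a DM near 1234 pc cm⁻³ with small Galactic and halo contributions points to $z \gtrsim 1$ unless the host contribution is unusually large.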

     
  7. Abstract. Volatile organic compounds (VOCs) contribute to air pollution both directly, as hazardous gases, and through their reactions with common atmospheric oxidants to produce ozone, particulate matter, and other hazardous air pollutants. There are enormous ranges of structures and reaction rates among VOCs, and there is consequently a need to accurately characterize the spatial and temporal distribution of individual identified compounds. Current VOC measurements are often made with complex, expensive instrumentation that provides high chemical detail but is limited in its portability and requires high expense (e.g., mobile labs) for spatially resolved measurements. Alternatively, periodic collection of samples on cartridges is inexpensive but demands significant operator interaction that can limit possibilities for time-resolved measurements or distributed measurements across a spatial area. Thus, there is a need for simple, portable devices that can sample with limited operator presence to enable temporally and/or spatially resolved measurements. In this work, we describe new portable and programmable VOC samplers that enable simultaneous collection of samples across a spatially distributed network, validate their reproducibility, and demonstrate their utility. Validation experiments confirmed high precision between samplers as well as the ability of miniature ozone scrubbers to preserve reactive analytes collected on commercially available adsorbent gas sampling cartridges, supporting simultaneous field deployment across multiple locations. In indoor environments, 24 h integrated samples demonstrate observable day-to-day variability, as well as variability across very short spatial scales (meters). The utility of the samplers was further demonstrated by locating outdoor point sources of analytes through the development of a new mapping approach that employs a group of the portable samplers and back-projection techniques to assess a sampling area with higher resolution than stationary sampling. As with all gas sampling, the limits of detection depend on sampling times and the properties of sorbents and analytes. The limit of detection of the analytical system used in this work is on the order of nanograms, corresponding to mixing ratios of 1–10 pptv after 1 h of sampling at the programmable flow rate of 50–250 sccm enabled by the developed system. The portable VOC samplers described and validated here provide a simple, low-cost sampling solution for spatially and/or temporally variable measurements of any organic gases that are collectable on currently available sampling media.

     
    Free, publicly-accessible full text available October 13, 2024
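
    The quoted mixing-ratio limits can be sanity-checked with a short unit conversion; the molar mass below is an assumed mid-range value, not a number from the paper.

```python
# Back-of-the-envelope check: convert a ~1 ng analytical LOD into a mixing
# ratio for 1 h of sampling. Molar mass and flow are illustrative assumptions.
MOLAR_MASS = 150.0       # g/mol, assumed mid-range VOC
LOD_NG = 1.0             # analytical limit of detection, ng
FLOW_SCCM = 250.0        # sampling flow, standard cm^3/min
MINUTES = 60.0
MOLAR_VOLUME = 24_400.0  # cm^3/mol of air near room conditions

air_mol = FLOW_SCCM * MINUTES / MOLAR_VOLUME  # mol of air sampled
analyte_mol = LOD_NG * 1e-9 / MOLAR_MASS      # mol of analyte at the LOD
pptv = analyte_mol / air_mol * 1e12
print(f"{pptv:.0f} pptv")  # ~11 pptv -- consistent with the quoted 1-10 pptv range
```
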
  8. Abstract

    Understanding cellular responses to genetic perturbation is central to numerous biomedical applications, from identifying genetic interactions involved in cancer to developing methods for regenerative medicine. However, the combinatorial explosion in the number of possible multigene perturbations severely limits experimental interrogation. Here, we present the graph-enhanced gene activation and repression simulator (GEARS), a method that integrates deep learning with a knowledge graph of gene–gene relationships to predict transcriptional responses to both single and multigene perturbations using single-cell RNA-sequencing data from perturbational screens. GEARS is able to predict the outcomes of perturbing combinations consisting of genes that were never experimentally perturbed. GEARS exhibited 40% higher precision than existing approaches in predicting four distinct genetic interaction subtypes in a combinatorial perturbation screen and identified the strongest interactions twice as accurately as prior approaches. Overall, GEARS can predict phenotypically distinct effects of multigene perturbations and thus guide the design of perturbational experiments.

     
    Free, publicly-accessible full text available August 17, 2024
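
    The core idea (propagating gene embeddings over a gene-gene graph so that never-perturbed genes inherit information from their graph neighbors) can be caricatured in a few lines. This is a conceptual sketch only, not the GEARS architecture; the sizes, fixed embeddings, and additive composition rule are all assumptions.

```python
# Cartoon of the idea: smooth gene embeddings over a gene-gene knowledge graph,
# then predict a perturbation's expression change from the (propagated)
# embeddings of the perturbed genes. Conceptual sketch only -- not GEARS.
import numpy as np

rng = np.random.default_rng(0)
n_genes, dim = 500, 32
A = (rng.random((n_genes, n_genes)) < 0.01).astype(float)  # toy knowledge graph
A = np.maximum(A, A.T)
np.fill_diagonal(A, 1.0)
A_hat = A / A.sum(axis=1, keepdims=True)  # row-normalized adjacency

E = rng.normal(size=(n_genes, dim))  # gene embeddings (would be learned)
for _ in range(2):                   # two rounds of neighborhood averaging
    E = A_hat @ E

W = rng.normal(size=(dim, n_genes)) * 0.01  # stand-in decoder (would be trained)

def predict_shift(perturbed: list[int]) -> np.ndarray:
    """Predicted expression change for a single- or multi-gene perturbation."""
    z = E[perturbed].sum(axis=0)  # compose perturbations additively here
    return z @ W                  # per-gene expression shift

# Unseen combinations still get predictions, because each gene's embedding
# carries graph structure.
delta = predict_shift([10, 42])
print(delta.shape)  # (500,)
```
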
  9. Interferometric scattering microscopy can image the dynamics of nanometer-scale systems. The typical approach to analyzing interferometric images involves intensive processing, which discards data and limits the precision of measurements. We demonstrate an alternative approach: modeling the interferometric point spread function and fitting this model to data within a Bayesian framework. This approach yields best-fit parameters, including the particle’s three-dimensional position and polarizability, as well as uncertainties and correlations between these parameters. Building on recent work, we develop a model that is parameterized for rapid fitting. The model is designed to work with Hamiltonian Monte Carlo techniques that leverage automatic differentiation. We validate this approach by fitting the model to interferometric images of colloidal nanoparticles. We apply the method to track a diffusing particle in three dimensions, to directly infer the diffusion coefficient of a nanoparticle without calculating a mean-square displacement, and to quantify the ejection of DNA from an individual lambda phage virus, demonstrating that the approach can be used to infer both static and dynamic properties of nanoscale systems.
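
    The fitting strategy (a differentiable image model whose posterior is sampled with Hamiltonian Monte Carlo) can be sketched as follows, using numpyro as one concrete HMC-with-autodiff framework; whether the authors used it is not stated here. The toy Gaussian-enveloped fringe stands in for the actual iPSF parameterization, which this sketch does not reproduce.

```python
# Minimal sketch: fit a differentiable image model to an interferometric-style
# image with NUTS (HMC). The fringe model is a toy stand-in for the iPSF.
import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

npix = 32
grid = jnp.arange(npix, dtype=jnp.float32)
yy, xx = jnp.meshgrid(grid, grid, indexing="ij")

def ipsf(x0, y0, amp, width, phase):
    r2 = (xx - x0) ** 2 + (yy - y0) ** 2
    r = jnp.sqrt(r2 + 1e-6)  # smoothed for stable gradients at the center
    return 1.0 + amp * jnp.exp(-r2 / (2 * width**2)) * jnp.cos(0.5 * r + phase)

# Synthetic "measurement" with Gaussian noise.
truth = ipsf(15.3, 17.1, 0.08, 6.0, 0.7)
data = truth + 0.01 * random.normal(random.PRNGKey(0), truth.shape)

def model(image):
    x0 = numpyro.sample("x0", dist.Uniform(0, npix))
    y0 = numpyro.sample("y0", dist.Uniform(0, npix))
    amp = numpyro.sample("amp", dist.HalfNormal(0.2))
    width = numpyro.sample("width", dist.HalfNormal(10.0))
    phase = numpyro.sample("phase", dist.Uniform(-jnp.pi, jnp.pi))
    mu = ipsf(x0, y0, amp, width, phase)
    numpyro.sample("obs", dist.Normal(mu, 0.01), obs=image)

mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=500)
mcmc.run(random.PRNGKey(1), data)
mcmc.print_summary()  # posteriors (with uncertainties) for position, contrast, ...
```
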

     
  10. Abstract

    Leptoquarks ($\mathrm{LQ}$s) are hypothetical particles that appear in various extensions of the Standard Model (SM) and can explain observed differences between SM theory predictions and experimental results. The production of these particles has been widely studied at various experiments, most recently at the Large Hadron Collider (LHC), and stringent bounds have been placed on their masses and couplings, assuming the simplest beyond-SM (BSM) hypotheses. However, the limits are significantly weaker for $\mathrm{LQ}$ models with family-non-universal couplings containing enhanced couplings to third-generation fermions. We present a new study on the production of a $\mathrm{LQ}$ at the LHC with preferential couplings to third-generation fermions, considering proton–proton collisions at $\sqrt{s} = 13\,\mathrm{TeV}$ and $\sqrt{s} = 13.6\,\mathrm{TeV}$. Such a hypothesis is well motivated theoretically and can explain the recent anomalies in precision measurements of $\mathrm{B}$-meson decay rates, specifically the $R_{D^{(*)}}$ ratios. Under a simplified model in which the $\mathrm{LQ}$ masses and couplings are free parameters, we focus on cases where the $\mathrm{LQ}$ decays to a $\tau$ lepton and a $\mathrm{b}$ quark, and study how the results are affected by different assumptions about chiral currents and interference effects with other BSM processes with the same final states, such as diagrams with a heavy vector boson, $\mathrm{Z}^{\prime}$. The analysis is performed using machine learning techniques, resulting in an increased discovery reach at the LHC, allowing us to probe new-physics phase space which addresses the $\mathrm{B}$-meson anomalies, for $\mathrm{LQ}$ masses up to $5.00\,\mathrm{TeV}$ for the high-luminosity LHC scenario.
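
    For reference, the ratios in question are the standard lepton-flavour-universality observables

    $$ R_{D^{(*)}} = \frac{\mathcal{B}(\mathrm{B} \rightarrow \mathrm{D}^{(*)} \tau \nu_\tau)}{\mathcal{B}(\mathrm{B} \rightarrow \mathrm{D}^{(*)} \ell \nu_\ell)}, \qquad \ell = e, \mu, $$

    whose measured values have tended to sit above the SM prediction, hinting at enhanced couplings to third-generation leptons.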

     