Title: Approximate Differentiable Likelihoods for Astroparticle Physics Experiments
Traditionally, inference in liquid xenon direct detection dark matter experiments has relied on estimators of event energy or on density estimation of simulated data. Such methods have drawbacks compared to the computation of explicit likelihoods, such as an inability to conduct statistical inference in high-dimensional parameter spaces, or a failure to make use of all available information. In this work, we implement a continuous approximation of an event simulator model within a probabilistic programming framework, allowing for the application of high-performance gradient-based inference methods such as the No-U-Turn Sampler. We demonstrate an improvement in inference results, with percent-level decreases in measurement uncertainties. Finally, in the case where some observables can be measured using multiple independent channels, such a method also enables additional information to be incorporated seamlessly, allowing full use to be made of the available information.
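
As a rough illustration of the idea (not the authors' actual detector model or code), the sketch below builds a toy two-phase xenon response model in NumPyro, replacing the discrete photon- and electron-counting statistics with Gaussian approximations so that the likelihood stays differentiable and can be sampled with NUTS. The yields g1 and g2, the priors, and the observed S1/S2 values are placeholder assumptions.

    import jax.numpy as jnp
    from jax import random
    import numpyro
    import numpyro.distributions as dist
    from numpyro.infer import MCMC, NUTS

    def model(s1_obs, s2_obs):
        # Latent event energy (keV) with a broad prior.
        energy = numpyro.sample("energy", dist.Uniform(1.0, 100.0))
        # Toy light and charge yields (assumed values, not a real detector's).
        g1, g2 = 0.12, 30.0
        mu_s1, mu_s2 = g1 * energy, g2 * energy
        # Continuous (Gaussian) stand-in for the discrete counting process,
        # which keeps the likelihood differentiable with respect to `energy`.
        numpyro.sample("s1", dist.Normal(mu_s1, jnp.sqrt(mu_s1)), obs=s1_obs)
        numpyro.sample("s2", dist.Normal(mu_s2, jnp.sqrt(mu_s2)), obs=s2_obs)

    mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
    mcmc.run(random.PRNGKey(0), s1_obs=jnp.array(6.0), s2_obs=jnp.array(1500.0))
    mcmc.print_summary()
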
Award ID(s):
2046549
PAR ID:
10615391
Author(s) / Creator(s):
Publisher / Repository:
ACAT 2024
Date Published:
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Motivation: In recent years, the well-known Infinite Sites Assumption (ISA) has been a fundamental feature of computational methods devised for reconstructing tumor phylogenies and inferring cancer progressions. However, recent studies leveraging Single-Cell Sequencing (SCS) techniques have shown evidence of widespread recurrence and, especially, loss of mutations in several tumor samples. While established computational methods exist that infer phylogenies with mutation losses, there remain advancements to be made. Results: We present SASC (Simulated Annealing Single-Cell inference), a new and robust approach based on simulated annealing for the inference of cancer progression from SCS data sets. In particular, we extend the model of evolution in which mutations are only accumulated by also allowing a limited amount of mutation loss in the evolutionary history of the tumor: the Dollo-k model. We demonstrate that SASC achieves high levels of accuracy when tested on both simulated and real data sets, and in comparison with other available methods. Availability: The Simulated Annealing Single-Cell inference (SASC) tool is open source and available at https://github.com/sciccolella/sasc. Supplementary information: Supplementary data are available at Bioinformatics online.
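
    For readers unfamiliar with the search strategy, the sketch below is a generic simulated-annealing loop with a toy one-dimensional objective standing in for SASC's actual moves over candidate tumor trees and its Dollo-k scoring; the temperatures and cooling schedule are illustrative assumptions, not SASC's settings.

        import math
        import random

        def simulated_annealing(initial, energy, neighbor,
                                t_start=100.0, t_end=1e-3, cooling=0.99):
            """Generic simulated-annealing loop: worse candidates are accepted
            with probability exp(-delta / T), letting the search escape local optima."""
            current, current_e = initial, energy(initial)
            best, best_e = current, current_e
            t = t_start
            while t > t_end:
                candidate = neighbor(current)
                delta = energy(candidate) - current_e
                if delta < 0 or random.random() < math.exp(-delta / t):
                    current, current_e = candidate, current_e + delta
                    if current_e < best_e:
                        best, best_e = current, current_e
                t *= cooling
            return best, best_e

        # Toy usage: minimize a 1-D function; SASC instead perturbs candidate
        # tumor phylogenies under the Dollo-k constraint.
        solution, value = simulated_annealing(
            initial=10.0,
            energy=lambda x: (x - 3.0) ** 2,
            neighbor=lambda x: x + random.uniform(-1.0, 1.0),
        )
        print(solution, value)
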
  2. Event cameras capture the world at high time resolution and with minimal bandwidth requirements. However, event streams, which only encode changes in brightness, do not contain sufficient scene information to support a wide variety of downstream tasks. In this work, we design generalized event cameras that inherently preserve scene intensity in a bandwidth-efficient manner. We generalize event cameras in terms of when an event is generated and what information is transmitted. To implement our designs, we turn to single-photon sensors that provide digital access to individual photon detections; this modality gives us the flexibility to realize a rich space of generalized event cameras. Our single-photon event cameras are capable of high-speed, high-fidelity imaging at low readout rates. Consequently, these event cameras can support plug-and-play downstream inference, without capturing new event datasets or designing specialized event-vision models. As a practical implication, our designs, which involve lightweight and near-sensor-compatible computations, provide a way to use single-photon sensors without exorbitant bandwidth costs. 
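
    As context for how an event camera decides when to generate an event, the sketch below implements the conventional rule: emit an event whenever a pixel's log intensity changes by more than a fixed threshold since that pixel's last event. The frame-based input, threshold, and reset logic are illustrative assumptions; the generalized single-photon designs described here go beyond this rule.

        import numpy as np

        def brightness_change_events(frames, threshold=0.2, eps=1e-3):
            """Emit (t, y, x, polarity) events when a pixel's log intensity moves
            by at least `threshold` relative to its last event level."""
            ref = np.log(frames[0].astype(np.float64) + eps)
            events = []
            for t, frame in enumerate(frames[1:], start=1):
                log_i = np.log(frame.astype(np.float64) + eps)
                delta = log_i - ref
                ys, xs = np.nonzero(np.abs(delta) >= threshold)
                for y, x in zip(ys, xs):
                    events.append((t, y, x, int(np.sign(delta[y, x]))))
                    ref[y, x] = log_i[y, x]  # reset this pixel's reference level
            return events

        # Toy usage on random intensity frames.
        rng = np.random.default_rng(0)
        frames = rng.uniform(0.1, 1.0, size=(5, 8, 8))
        print(len(brightness_change_events(frames)))
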
  3. Rafferty, A.; Whitehall, J.; Cristobal, R.; Cavalli-Sforza, V. (Ed.)
    We propose VarFA, a variational inference factor analysis framework that extends existing factor analysis models for educational data mining to efficiently output uncertainty estimates for the model's estimated factors. Such uncertainty information is useful, for example, in an adaptive testing scenario, where additional tests can be administered if the model is not sufficiently certain about a student's skill level estimate. Traditional Bayesian inference methods that produce such uncertainty information are computationally expensive and do not scale to large data sets. VarFA utilizes variational inference, which makes it possible to efficiently perform Bayesian inference even on very large data sets. We use the sparse factor analysis model as a case study and demonstrate the efficacy of VarFA on both synthetic and real data sets. VarFA is also very general and can be applied to a wide array of factor analysis models.
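
    The sketch below shows how variational inference can attach uncertainty estimates to a factor analysis model, using NumPyro's SVI with a Gaussian (AutoNormal) guide; the binary-response model, the number of skills, and the priors are placeholder assumptions and do not reproduce VarFA's sparse factor analysis formulation.

        import jax.numpy as jnp
        from jax import random
        import numpyro
        import numpyro.distributions as dist
        from numpyro.infer import SVI, Trace_ELBO
        from numpyro.infer.autoguide import AutoNormal

        def factor_model(responses, n_students, n_questions, n_skills=2):
            # Student skills and question loadings; a sparse model would add
            # sparsity-inducing priors on the loadings.
            skills = numpyro.sample("skills", dist.Normal(0.0, 1.0)
                                    .expand([n_students, n_skills]).to_event(2))
            loadings = numpyro.sample("loadings", dist.Normal(0.0, 1.0)
                                      .expand([n_skills, n_questions]).to_event(2))
            logits = skills @ loadings
            numpyro.sample("responses", dist.Bernoulli(logits=logits).to_event(2),
                           obs=responses)

        # Toy data: 50 students answering 20 questions (1 = correct).
        rng = random.PRNGKey(0)
        responses = random.bernoulli(rng, 0.6, (50, 20)).astype(jnp.float32)

        guide = AutoNormal(factor_model)  # Gaussian variational posterior
        svi = SVI(factor_model, guide, numpyro.optim.Adam(0.01), Trace_ELBO())
        result = svi.run(rng, 2000, responses, 50, 20)
        # The scale parameters in result.params carry the uncertainty estimates.
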
  4. Randomized controlled trials (RCTs) are increasingly prevalent in education research, and are often regarded as a gold standard of causal inference. Two main virtues of randomized experiments are that they (1) do not suffer from confounding, thereby allowing for an unbiased estimate of an intervention's causal impact, and (2) allow for design-based inference, meaning that the physical act of randomization largely justifies the statistical assumptions made. However, RCT sample sizes are often small, leading to low precision; in many cases RCT estimates may be too imprecise to guide policy or inform science. Observational studies, by contrast, have strengths and weaknesses complementary to those of RCTs. Observational studies typically offer much larger sample sizes, but may suffer from confounding. In many contexts, experimental and observational data exist side by side, allowing the possibility of integrating "big observational data" with "small but high-quality experimental data" to get the best of both. Such approaches hold particular promise in the field of education, where RCT sample sizes are often small due to cost constraints, but automatic collection of observational data, such as in computerized educational technology applications or in state longitudinal data systems (SLDS) with administrative data on hundreds of thousands of students, has made rich, high-dimensional observational data widely available. We outline an approach that employs machine learning algorithms to learn from the observational data and uses the resulting models to improve precision in randomized experiments. Importantly, there is no requirement that the machine learning models be "correct" in any sense, and the final experimental results are guaranteed to be exactly unbiased. Thus, there is no danger of confounding biases in the observational data leaking into the experiment.
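
    One way such integration can work is a residualization-style estimator: fit any predictive model on the observational data, subtract its predictions from the experimental outcomes, and difference the residual means across the randomized arms. The sketch below illustrates this on synthetic data; the data-generating process, the gradient-boosting model, and the effect size are illustrative assumptions, not the authors' pipeline.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(0)

        # "Big" observational data: covariates and outcomes, no randomization.
        X_obs = rng.normal(size=(5000, 5))
        y_obs = X_obs[:, 0] + 0.5 * X_obs[:, 1] + rng.normal(size=5000)

        # Any predictive model will do; it need not be "correct" for the
        # experimental estimate below to remain unbiased.
        model = GradientBoostingRegressor().fit(X_obs, y_obs)

        # "Small" randomized experiment with a true treatment effect of 0.3.
        n = 200
        X_rct = rng.normal(size=(n, 5))
        treat = rng.integers(0, 2, size=n)
        y_rct = X_rct[:, 0] + 0.5 * X_rct[:, 1] + 0.3 * treat + rng.normal(size=n)

        # Residualize against the observational predictions, then difference
        # the residual means; randomization alone guarantees unbiasedness,
        # while a good prediction model shrinks the variance.
        resid = y_rct - model.predict(X_rct)
        effect = resid[treat == 1].mean() - resid[treat == 0].mean()
        naive = y_rct[treat == 1].mean() - y_rct[treat == 0].mean()
        print(f"residualized: {effect:.3f}  naive: {naive:.3f}")
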
  5. The matched-filter technique is an effective way to detect repeats, or near-repeats, of a seismic source, but it requires prior identification of an event from that source to use as a template. We propose a recursive matched-filter approach to systematically explore earthquake swarms, here applied to a swarm of volcanic long-period seismicity beneath Mount Sidley in Antarctica. We start with a single visually chosen template event with a high signal-to-noise ratio. We then extend our template database by selecting new templates from the newly detected set of events to use in a subsequent matched-filter search, allowing us to recursively expand the number of templates. We demonstrate that each iteration of the matched-filter search progressively extends the spatial coverage of our set of templates away from the original template event. In this way, our proposed method overcomes the matched-filter search’s strictest constraint: that an event must already be identified to detect other similar events. Our recursive matched-filtering approach is well suited for the systematic exploration of earthquake swarms in both volcanic and tectonic contexts.
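
    A minimal sketch of the recursion, assuming a plain NumPy normalized cross-correlation detector: each waveform detected above the correlation threshold is added to the template set and used in the next pass. The threshold, window length, and stopping rule are illustrative assumptions, not the settings used in the study, and no peak de-clustering is applied, so neighbouring samples of one event may all be reported.

        import numpy as np

        def matched_filter(trace, template, threshold=0.8):
            """Return sample indices where the normalized cross-correlation
            between `template` and `trace` exceeds `threshold`."""
            n = len(template)
            t = (template - template.mean()) / (template.std() * n)
            hits = []
            for i in range(len(trace) - n + 1):
                window = trace[i:i + n]
                cc = np.sum(t * (window - window.mean()) / (window.std() + 1e-12))
                if cc >= threshold:
                    hits.append(i)
            return hits

        def recursive_search(trace, seed_template, n_iter=3):
            """Grow the template set: each newly detected waveform becomes a
            template for the next matched-filter pass."""
            win = len(seed_template)
            templates, detections = [seed_template], set()
            for _ in range(n_iter):
                new = []
                for tmpl in templates:
                    for i in matched_filter(trace, tmpl):
                        if i not in detections:
                            detections.add(i)
                            new.append(trace[i:i + win])
                if not new:
                    break
                templates.extend(new)
            return sorted(detections)

        # Toy usage: a repeating synthetic pulse buried in noise.
        rng = np.random.default_rng(1)
        trace = rng.normal(0.0, 0.1, 2000)
        pulse = np.sin(np.linspace(0, 3 * np.pi, 50))
        for start in (200, 800, 1500):
            trace[start:start + 50] += pulse
        print(recursive_search(trace, trace[200:250]))
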