This study addresses the reconstruction of sparse signals, a challenge that arises frequently in overdispersed photon-limited imaging. While noise in such imaging settings is typically modeled with a Poisson distribution, the negative binomial distribution is better suited to overdispersed scenarios, where the noise variance exceeds the signal mean. Knowledge of the maximum and minimum signal intensity can also be exploited within the computational framework to improve reconstruction accuracy. In this paper, we use a gradient-based method for sparse signal recovery that models noise with a negative binomial distribution, enforces bound constraints to respect upper and lower signal intensity thresholds, and employs a sparsity-promoting regularization term. Our numerical experiments demonstrate that incorporating these features significantly improves the reconstruction of sparse signals from overdispersed measurements.
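The following display summarizes, in assumed notation (sensing matrix A, observed counts y, dispersion parameter r; none of these symbols are taken from the paper), the kind of objective this abstract describes. Up to additive constants, the negative binomial negative log-likelihood is

$$
F(x) \;=\; \sum_{i}\Big[(y_i + r)\log\big((Ax)_i + r\big) - y_i \log (Ax)_i\Big],
$$

and, taking an $\ell_1$ penalty as a representative sparsity-promoting term, the bound-constrained recovery problem reads

$$
\min_{\ell \le x \le u}\; F(x) + \tau\,\|x\|_1,
$$

where $\ell$ and $u$ encode the known minimum and maximum intensities and $\tau > 0$ trades data fidelity against sparsity. For a negative binomial variable with mean $\mu$ and dispersion $r$, the variance is $\mu + \mu^2/r > \mu$, which is the overdispersion the abstract refers to.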
Negative Binomial Optimization for Low-Count Overdispersed Sparse Signal Reconstruction
Low-photon count imaging has typically been modeled with Poisson statistics. This discrete probability model assumes that the mean and variance of the signal are equal. When a dataset exhibits greater variability than the Poisson model allows, the negative binomial distribution is a suitable overdispersed alternative. In this work, we present a framework for reconstructing sparse signals in these low-count, overdispersed settings. Specifically, we describe a gradient-based sequential quadratic optimization approach that minimizes the negative log-likelihood of the negative binomial distribution coupled with a sparsity-promoting regularization term. Numerical experiments on 1D and 2D sparse/compressible signals are presented.
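As an illustration of the overall pipeline, here is a minimal proximal-gradient sketch in Python. It is not the paper's sequential quadratic method, and all names (nb_nll_grad, reconstruct) and parameter defaults are assumptions made for this example.

```python
import numpy as np

def nb_nll_grad(x, A, y, r, eps=1e-10):
    """Negative binomial negative log-likelihood (up to constants) of counts
    y with mean A @ x and dispersion r, plus its gradient in x."""
    mu = A @ x + eps                          # model mean; eps guards log(0)
    nll = np.sum((y + r) * np.log(mu + r) - y * np.log(mu))
    grad = A.T @ ((y + r) / (mu + r) - y / mu)
    return nll, grad

def reconstruct(A, y, r=4.0, tau=0.5, lb=0.0, ub=1.0, step=1e-3, iters=2000):
    """Proximal-gradient sketch for: NB negative log-likelihood + tau*||x||_1,
    with x projected onto the box [lb, ub] (0 <= lb < ub assumed)."""
    x = np.full(A.shape[1], 0.5 * (lb + ub))
    for _ in range(iters):
        _, g = nb_nll_grad(x, A, y, r)
        x = x - step * g                                           # gradient step on smooth term
        x = np.sign(x) * np.maximum(np.abs(x) - step * tau, 0.0)   # l1 prox (soft-threshold)
        x = np.clip(x, lb, ub)                                     # enforce bound constraints
    return x

# Tiny demo on synthetic overdispersed data.
rng = np.random.default_rng(0)
n, m, r = 100, 60, 4.0
x_true = np.zeros(n)
x_true[rng.choice(n, size=5, replace=False)] = 1.0   # sparse nonnegative signal
A = rng.uniform(0.0, 1.0, size=(m, n))
mu = A @ x_true
y = rng.negative_binomial(r, r / (r + mu))           # NB draws with mean mu
x_hat = reconstruct(A, y, r=r)
```

With NumPy's (n, p) parameterization, choosing p = r/(r + mu) gives negative binomial draws whose mean is exactly mu, so the demo matches the noise model assumed by the solver.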
- PAR ID: 10505327
- Publisher / Repository: IEEE
- Date Published:
- Journal Name: 2023 European Signal Processing Conference (EUSIPCO)
- ISBN: 978-9-4645-9360-0
- Page Range / eLocation ID: 1948 to 1952
- Format(s): Medium: X
- Location: Helsinki, Finland
- Sponsoring Org: National Science Foundation
More Like this
This paper investigates the application of the ℓp quasi-norm, where 0 < p < 1, in contexts characterized by photon-limited signals, such as medical imaging and night vision. In these environments, low-photon count images have typically been modeled using Poisson statistics. In related algorithms, the ℓ1 norm is commonly employed as a regularization method to promote sparsity in the reconstruction. However, recent research suggests that using the ℓp quasi-norm may yield lower reconstruction error. In this paper, we investigate the use of negative binomial statistics, which generalize Poisson statistics, in conjunction with the ℓp quasi-norm for recovering sparse signals in low-photon count imaging settings.
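Written out in assumed notation (consistent with the formula given earlier on this page), the change this abstract studies is replacing the $\ell_1$ penalty with the $\ell_p$ quasi-norm:

$$
\min_{x \ge 0}\; \sum_i \Big[(y_i + r)\log\big((Ax)_i + r\big) - y_i \log (Ax)_i\Big] + \tau\,\|x\|_p^p, \qquad \|x\|_p^p = \sum_j |x_j|^p,\ \ 0 < p < 1.
$$

The quasi-norm penalizes small nonzero entries more aggressively than the $\ell_1$ norm, which is why it can promote sparsity more strongly, at the cost of a nonconvex problem.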
We extend network tomography to traffic flows that are not necessarily Poisson random processes. The Poisson assumption has governed the field since its inception by Y. Vardi in 1996. We allow the distribution of the packet count of each traffic flow in a given time interval to be a mixture of Poisson random variables. Both discrete and continuous mixtures are studied. For the latter case, we focus on mixed Poisson distributions with a Gamma mixing distribution; as is well known, this mixed Poisson distribution is the negative binomial distribution. Other mixing distributions, such as the Wald (inverse Gaussian) distribution, can be used. Mixture distributions are overdispersed, with variance larger than the mean, and are thus more suitable for Internet traffic than the Poisson model. We develop a second-order moment-matching approach for estimating the mean traffic rate for each source-destination pair using least squares and the minimum I-divergence iterative procedure. We demonstrate the performance of the proposed approach through several numerical examples. The results show that the average normalized mean squared error in rate estimation is of the same order as in classic Poisson-based network tomography. Furthermore, no degradation in performance was observed when traffic rates are Poisson but Poisson mixtures are assumed.
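The mixed-Poisson fact this abstract relies on can be stated compactly (a standard identity; the notation is ours): if $Y \mid \lambda \sim \mathrm{Poisson}(\lambda)$ and $\lambda \sim \mathrm{Gamma}(r, \theta)$ with shape $r$ and scale $\theta$, then

$$
P(Y = y) = \int_0^\infty \frac{\lambda^y e^{-\lambda}}{y!} \cdot \frac{\lambda^{r-1} e^{-\lambda/\theta}}{\Gamma(r)\,\theta^r}\, d\lambda = \frac{\Gamma(y + r)}{\Gamma(r)\, y!} \left(\frac{1}{1+\theta}\right)^{r} \left(\frac{\theta}{1+\theta}\right)^{y},
$$

which is a negative binomial distribution with mean $r\theta$ and variance $r\theta(1+\theta)$. Since the variance exceeds the mean by the factor $1 + \theta$, the mixture is overdispersed, as claimed.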
Precision medicine aims for personalized prognosis and therapeutics by utilizing recent genome-scale high-throughput profiling techniques, including next-generation sequencing (NGS). However, translating NGS data faces several challenges. First, NGS count data are often overdispersed, requiring appropriate modeling. Second, compared to the number of involved molecules and system complexity, the number of available samples for studying complex diseases, such as cancer, is often limited, especially considering disease heterogeneity. The key question is whether we may integrate available data from all different sources or domains to achieve reproducible disease prognosis based on NGS count data. In this paper, we develop a Bayesian Multi-Domain Learning (BMDL) model that derives domain-dependent latent representations of overdispersed count data based on hierarchical negative binomial factorization for accurate cancer subtyping even if the number of samples for a specific cancer type is small. Experimental results from both our simulated and NGS datasets from The Cancer Genome Atlas (TCGA) demonstrate the promising potential of BMDL for effective multi-domain learning without the negative transfer effects often seen in existing multi-task learning and transfer learning methods.
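As a schematic of negative binomial factorization (a generic form, not necessarily the exact BMDL hierarchy), a count matrix $y_{ij}$ over samples $i$ and features $j$ is modeled as

$$
y_{ij} \sim \mathrm{NB}\!\left(\sum_{k} \theta_{ik}\,\phi_{kj},\; r\right),
$$

with nonnegative latent factors $\theta$ (sample loadings) and $\phi$ (feature factors), typically given gamma priors. A multi-domain extension can share some factors across domains while keeping others domain-specific, which is one way to read the "domain-dependent latent representations" described above.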
Background: Outcome measures that are count variables with excessive zeros are common in health behaviors research. Examples include the number of standard drinks consumed or alcohol-related problems experienced over time. There is a lack of empirical data about the relative performance of prevailing statistical models for assessing the efficacy of interventions when outcomes are zero-inflated, particularly compared with recently developed marginalized count regression approaches for such data. Methods: The current simulation study examined five commonly used approaches for analyzing count outcomes, including two linear models (with outcomes on raw and log-transformed scales, respectively) and three prevailing count distribution-based models (i.e., Poisson, negative binomial, and zero-inflated Poisson (ZIP) models). We also considered the marginalized zero-inflated Poisson (MZIP) model, a novel alternative that estimates the overall effects on the population mean while adjusting for zero-inflation. Motivated by alcohol misuse prevention trials, extensive simulations were conducted to evaluate and compare the statistical power and Type I error rate of the statistical models and approaches across data conditions that varied in sample size ( to 500), zero rate (0.2 to 0.8), and intervention effect sizes. Results: Under zero-inflation, the Poisson model failed to control the Type I error rate, resulting in more false positive results than expected. When the intervention effects on the zero (vs. non-zero) and count parts were in the same direction, the MZIP model had the highest statistical power, followed by the linear model with outcomes on the raw scale, the negative binomial model, and the ZIP model. The performance of the linear model with a log-transformed outcome variable was unsatisfactory. Conclusions: The MZIP model demonstrated better statistical properties in detecting true intervention effects and controlling false positive results for zero-inflated count outcomes. The MZIP model may serve as an appealing analytical approach for evaluating overall intervention effects in studies with count outcomes marked by excessive zeros.
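For reference, the zero-inflated Poisson model compared here mixes a point mass at zero with a Poisson component (standard formulation; the symbols are assumed):

$$
P(Y = 0) = \psi + (1 - \psi)\,e^{-\mu}, \qquad P(Y = y) = (1 - \psi)\,\frac{\mu^y e^{-\mu}}{y!},\quad y \ge 1,
$$

so the marginal mean is $\nu = (1 - \psi)\,\mu$. A ZIP regression places covariates on $\mu$, the mean of the latent non-zero-inflated part, whereas the marginalized ZIP reparameterizes the model so that covariates act directly on $\log \nu$; this is why MZIP coefficients can be read as overall, population-mean intervention effects.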