We present a general Bernoulli-Gaussian scale-mixture approach for modeling priors that can represent a large class of random signals. For inference, we introduce belief propagation (BP) for multi-snapshot signal recovery based on the minimum mean square error estimation criterion. Our method relies on intra-snapshot messages that update the signal vector for each snapshot and inter-snapshot messages that share probabilistic information about the common sparsity structure across snapshots. Despite the very general model, our BP method efficiently computes accurate approximations of the marginal posterior PDFs. Preliminary numerical results illustrate the superior convergence rate and improved performance of the proposed method compared to approaches based on sparse Bayesian learning (SBL).
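As a rough illustration of the kind of prior involved (not the authors' exact model), the following sketch draws multi-snapshot signals from a Bernoulli-Gaussian scale-mixture prior with a support shared across snapshots; the inverse-gamma mixing density and all parameter values are illustrative assumptions:

```python
import numpy as np

def sample_bgsm_snapshots(n, num_snapshots, rho=0.2, rng=None):
    """Draw multi-snapshot signals from a Bernoulli-Gaussian scale-mixture
    prior with a sparsity pattern shared across snapshots.

    Each entry i is active with probability rho (the Bernoulli part); when
    active, its variance is drawn from a mixing density (here inverse-gamma,
    an illustrative choice giving Student-t marginals). The same support s
    and scales v are reused for every snapshot, modeling common sparsity.
    """
    rng = np.random.default_rng(rng)
    s = rng.random(n) < rho                              # common support
    v = 1.0 / rng.gamma(shape=2.0, scale=1.0, size=n)    # per-entry scales
    x = np.zeros((n, num_snapshots))
    x[s] = rng.normal(size=(s.sum(), num_snapshots)) * np.sqrt(v[s, None])
    return x, s
```

Only the rows in the common support are nonzero, while the per-snapshot values vary — exactly the structure the inter-snapshot messages exploit.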
This content will become publicly available on April 6, 2026
A Structured Neural Network Approach for Learning Improved Iterative Algorithms for SBL
Sparse Bayesian Learning (SBL) is a popular sparse signal recovery method, and various algorithms exist under the SBL paradigm. In this paper, we introduce a novel re-parameterization that allows the iterations of existing algorithms to be viewed as special cases of a unified, general mapping function. Furthermore, the re-parameterization enables an interesting beamforming interpretation that lends insight into all the considered algorithms. Utilizing the abstraction afforded by the general-mapping viewpoint, we introduce a novel neural network architecture for learning improved iterative update rules under the SBL framework. The modular design of the architecture makes the model independent of the size of the measurement matrix and provides a unique opportunity to test generalization across different measurement matrices. We show that the network, when trained on a particular parameterized dictionary, generalizes in ways hitherto not possible: to different measurement matrices, in both type and dimension, and to different numbers of snapshots. Our numerical results showcase the generalization capability of the network in terms of mean square error and probability of support recovery across sparsity levels, signal-to-noise ratios, numbers of snapshots, and multiple measurement matrices of different sizes.
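For context, the iterations abstracted by the general mapping have the form gamma ← f(gamma; A, Y). The sketch below shows only the textbook EM instance of such a mapping for multi-snapshot SBL — a reference point, not the paper's learned update rule:

```python
import numpy as np

def sbl_em_step(A, Y, gamma, sigma2):
    """One classic EM iteration of multi-snapshot SBL: gamma <- f(gamma; A, Y).

    A: (m, n) measurement matrix, Y: (m, L) snapshots, gamma: (n,) prior
    variances, sigma2: noise variance. Returns the updated gamma.
    """
    m, n = A.shape
    Gamma = np.diag(gamma)
    Sigma_y = sigma2 * np.eye(m) + A @ Gamma @ A.T       # data covariance
    K = Gamma @ A.T @ np.linalg.solve(Sigma_y, np.eye(m))
    Mu = K @ Y                                           # posterior means (n, L)
    # diag of posterior covariance: gamma - diag(Gamma A^T Sigma_y^-1 A Gamma)
    post_var = gamma - np.sum(K * (A @ Gamma).T, axis=1)
    return np.mean(Mu**2, axis=1) + post_var             # EM variance update
```

Viewing this (and variants like fixed-point MacKay updates) as one parameterized mapping is what allows a network to learn an improved rule in its place.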
- PAR ID: 10599165
- Publisher / Repository: IEEE
- Date Published:
- ISBN: 979-8-3503-6874-1
- Page Range / eLocation ID: 1 to 5
- Format(s): Medium: X
- Location: Hyderabad, India
- Sponsoring Org: National Science Foundation
More Like this
-
We present a Light-Weight Sequential Sparse Bayesian Learning (LWS-SBL) algorithm as an alternative to the orthogonal matching pursuit (OMP) algorithm for the general sparse signal recovery problem. The proposed approach formulates the recovery problem under the Type-II estimation framework and the stochastic maximum likelihood objective. We compare the computational complexity of the proposed algorithm with OMP and highlight the main differences. For the case of parametric dictionaries, a gridless version is developed by extending the proposed sequential SBL algorithm to locally optimize grid points near potential source locations, and it is empirically shown that the performance approaches the Cramér-Rao bound. Numerical results using the proposed approach demonstrate support recovery performance improvements in different scenarios at a small computational price compared to the OMP algorithm.
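For reference, the OMP baseline that LWS-SBL is compared against can be sketched in a few lines — greedily select the dictionary column most correlated with the residual, then re-fit the selected atoms by least squares:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y ~ A @ x.

    At each step, pick the column of A most correlated with the current
    residual, then refit all selected columns jointly by least squares.
    """
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))       # best-matching atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef              # orthogonalized residual
    x[support] = coef
    return x, sorted(support)
```

The sequential Type-II approach replaces the greedy correlation rule with a marginal-likelihood criterion, which is where the reported support-recovery gains come from.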
-
In this paper, we consider a general sparse recovery and blind demodulation model. Different from those in the literature, in our general model each dictionary atom undergoes a distinct modulation process; we refer to this as non-stationary modulation. We also assume that the modulation matrices live in a known subspace. Through the lifting technique, the sparse recovery and blind demodulation problem can be reformulated as a column-wise sparse matrix recovery problem, and we are able to recover both the sparse source signal and a cluster of modulation matrices via atomic norm and induced ℓ2,1-norm minimization. Moreover, we show that the sampling complexity for exact recovery is proportional to the number of degrees of freedom up to log factors in the noiseless case. We also bound the recovery error in terms of the norm of the noise when the observation is noisy. Numerical simulations are conducted to illustrate our results.
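The induced ℓ2,1 norm promotes column-wise sparsity through its proximal operator, which shrinks whole columns at once. A minimal sketch of that building block (standard group shrinkage, not the paper's full lifted solver):

```python
import numpy as np

def prox_l21_cols(X, t):
    """Proximal operator of t * ||X||_{2,1} with column-wise grouping.

    Each column is scaled toward zero and zeroed entirely when its l2 norm
    is at most t -- the basic step behind column-wise sparse matrix
    recovery with the induced l2,1 norm.
    """
    norms = np.linalg.norm(X, axis=0, keepdims=True)     # per-column l2 norms
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return X * scale
```

Unlike entrywise soft-thresholding, this keeps or kills columns as a group, matching the lifted model where each nonzero column encodes one active atom and its modulation coefficients.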
-
Abstract: We introduce a lifted ℓ1 (LL1) regularization framework for the recovery of sparse signals. The proposed LL1 regularization is a generalization of several popular regularization methods in the field and is motivated by recent advancements in re-weighted ℓ1 approaches for sparse recovery. Through a comprehensive analysis of the relationships between existing methods, we identify two distinct types of lifting functions that guarantee equivalence to the ℓ0 minimization problem, a key objective in sparse signal recovery. To solve the LL1 regularization problem, we propose an algorithm based on the alternating direction method of multipliers and provide a proof of convergence for the unconstrained formulation. Our experiments demonstrate the improved performance of LL1 regularization compared with state-of-the-art methods, confirming the effectiveness of the proposed framework. In conclusion, LL1 regularization presents a promising and flexible approach to sparse signal recovery and invites further research in this area.
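The plain ℓ1 problem that LL1 generalizes is itself solvable by the alternating direction method of multipliers. A minimal sketch of that base case (standard LASSO ADMM with splitting x = z, not the authors' LL1 solver):

```python
import numpy as np

def soft_threshold(v, t):
    """Entrywise shrinkage: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """ADMM for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1 via x = z splitting.

    x-update: ridge-like solve; z-update: soft-thresholding; u: scaled dual.
    """
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))   # cached solve matrix
    Atb = A.T @ b
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))
        z = soft_threshold(x + u, lam / rho)
        u = u + x - z
    return z
```

In the LL1 setting, the z-update's soft-thresholding would be replaced by the proximal step of the chosen lifting function, with the rest of the ADMM scaffolding unchanged.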
