Title: Light-Weight Sequential SBL Algorithm: An Alternative to OMP
We present a Light-Weight Sequential Sparse Bayesian Learning (LWS-SBL) algorithm as an alternative to the orthogonal matching pursuit (OMP) algorithm for the general sparse signal recovery problem. The proposed approach formulates the recovery problem under the Type-II estimation framework and the stochastic maximum likelihood objective. We compare the computational complexity of the proposed algorithm with OMP and highlight the main differences. For the case of parametric dictionaries, a gridless version is developed by extending the proposed sequential SBL algorithm to locally optimize grid points near potential source locations, and it is empirically shown that the performance approaches the Cramér-Rao bound. Numerical results using the proposed approach demonstrate support recovery performance improvements in different scenarios at a small computational price when compared to the OMP algorithm.
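For reference, the OMP baseline that the abstract compares against can be sketched as follows. This is a minimal NumPy sketch of the standard greedy algorithm, not the authors' implementation; the function name and the fixed-sparsity stopping rule are illustrative choices.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x with y ~ A x.

    A : (m, n) dictionary, ideally with unit-norm columns.
    y : (m,) measurement vector.
    k : target sparsity level (illustrative stopping rule).
    """
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    coef = np.zeros(0)
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit on the selected support (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x, support
```

Per iteration, the dominant cost is the correlation step `A.T @ residual` plus the least-squares refit on the current support, which is the complexity profile the abstract contrasts with the sequential SBL updates.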
Award ID(s):
2124929 2225617
PAR ID:
10417294
Author(s) / Creator(s):
Date Published:
Journal Name:
ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes Island, Greece, 2023
Page Range / eLocation ID:
1 to 5
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Sparse basis recovery is a classical and important statistical learning problem when the number of model dimensions p is much larger than the number of samples n. However, there has been little work that studies sparse basis recovery in the Federated Learning (FL) setting, where the client data’s differential privacy (DP) must also be simultaneously protected. In particular, the performance guarantees of existing DP-FL algorithms (such as DP-SGD) will degrade significantly when p >> n, and thus, they will fail to learn the true underlying sparse model accurately. In this work, we develop a new differentially private sparse basis recovery algorithm for the FL setting, called SPriFed-OMP. SPriFed-OMP converts OMP (Orthogonal Matching Pursuit) to the FL setting. Further, it combines SMPC (secure multi-party computation) and DP to ensure that only a small amount of noise needs to be added in order to achieve differential privacy. As a result, SPriFed-OMP can efficiently recover the true sparse basis for a linear model with only O(sqrt(p)) samples. We further present an enhanced version of our approach, SPriFed-OMP-GRAD, based on gradient privatization, that improves the performance of SPriFed-OMP. Our theoretical analysis and empirical results demonstrate that both SPriFed-OMP and SPriFed-OMP-GRAD terminate in a small number of steps, and they significantly outperform the previous state-of-the-art DP-FL solutions in terms of the accuracy-privacy trade-off.
  2. In this paper, we revisit the framework for maximum likelihood estimation (MLE) as applied to parametric models with an aim to estimate the parameter of interest in a gridless manner. The approach has inherent connections to the sparse Bayesian learning (SBL) formulation, and naturally leads to the problem of structured matrix recovery (SMR). We therefore pose the parameter estimation problem as an SMR problem, and recover the parameter of interest by appealing to the Carathéodory-Fejér result on the decomposition of positive semi-definite Toeplitz matrices. We propose an iterative algorithm to estimate the structured covariance matrix; each iteration solves a semi-definite program. We numerically compare the performance with other gridless schemes in the literature and demonstrate the superior performance of the proposed technique.
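The Carathéodory-Fejér result that this abstract appeals to can be stated in its standard Vandermonde-decomposition form (notation is ours, not taken from the paper): any positive semi-definite Toeplitz matrix $T \in \mathbb{C}^{n \times n}$ of rank $r < n$ admits a decomposition

```latex
T = \sum_{i=1}^{r} p_i \, \mathbf{a}(\theta_i)\,\mathbf{a}(\theta_i)^{\mathsf{H}},
\qquad
\mathbf{a}(\theta) = \begin{bmatrix} 1 & e^{j\theta} & \cdots & e^{j(n-1)\theta} \end{bmatrix}^{\mathsf{T}},
\qquad p_i > 0,
```

with distinct frequencies $\theta_i$, and this decomposition is unique. This is what makes the scheme gridless: once the structured covariance matrix is estimated, the $\theta_i$ recovered from its decomposition are continuous-valued parameter estimates rather than points on a predefined grid.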
  3. This paper introduces new and practically relevant non-Gaussian priors for the Sparse Bayesian Learning (SBL) framework applied to the Multiple Measurement Vector (MMV) problem. We extend the Gaussian Scale Mixture (GSM) framework to model prior distributions for row vectors, exploring the use of shared and different hyperparameters across different measurements. We propose Expectation Maximization (EM) based algorithms to estimate the parameters of the prior density along with the hyperparameters. To promote sparsity more effectively in a non-Gaussian setting, we show the importance of incorporating learning of the parameters of the mixing density. Such an approach effectively utilizes the common support notion in the MMV problem and promotes sparsity without explicitly imposing a sparsity-promoting prior, indicating the methods’ robustness to model mismatches. Numerical simulations are provided to compare the proposed approaches with the existing SBL algorithm for the MMV problem. 
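The Gaussian scale mixture construction referenced in this abstract takes the following standard form (notation is ours; the paper's row-vector extension and choice of mixing density are not reproduced here):

```latex
p(\mathbf{x}_i) = \int \mathcal{N}\!\left(\mathbf{x}_i;\, \mathbf{0},\, \gamma_i \mathbf{I}\right)\, p(\gamma_i;\, \boldsymbol{\eta})\, d\gamma_i,
```

where $\gamma_i$ is the scale hyperparameter for row $i$ and $\boldsymbol{\eta}$ parameterizes the mixing density. Different choices of $p(\gamma_i; \boldsymbol{\eta})$ yield different sparsity-promoting marginals, which is why learning the mixing-density parameters, as the abstract emphasizes, matters for promoting sparsity in the non-Gaussian setting.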
  4. Abstract We introduce a lifted $$\ell _1$$ (LL1) regularization framework for the recovery of sparse signals. The proposed LL1 regularization is a generalization of several popular regularization methods in the field and is motivated by recent advancements in re-weighted $$\ell _1$$ approaches for sparse recovery. Through a comprehensive analysis of the relationships between existing methods, we identify two distinct types of lifting functions that guarantee equivalence to the $$\ell _0$$ minimization problem, which is a key objective in sparse signal recovery. To solve the LL1 regularization problem, we propose an algorithm based on the alternating direction method of multipliers and provide proof of convergence for the unconstrained formulation. Our experiments demonstrate the improved performance of the LL1 regularization compared with state-of-the-art methods, confirming the effectiveness of our proposed framework. In conclusion, the LL1 regularization presents a promising and flexible approach to sparse signal recovery and invites further research in this area. 
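The re-weighted $\ell_1$ idea that motivates the LL1 framework can be illustrated with the classical iteratively re-weighted $\ell_1$ baseline, sketched below with a weighted ISTA inner solver. This is a minimal NumPy sketch of the well-known baseline only, not the LL1 algorithm or the authors' ADMM solver; the function names, stopping rules, and the smoothing constant `eps` are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding operator, prox of the (weighted) l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def reweighted_l1(A, y, lam=0.1, outer=5, inner=200, eps=1e-3):
    """Classical iteratively re-weighted l1 minimization (illustrative baseline).

    Outer loop: update weights w_i = 1 / (|x_i| + eps), so small coefficients
    are penalized more heavily on the next round.
    Inner loop: solve the weighted lasso min ||A x - y||^2 / 2 + lam * sum w|x|
    by ISTA (proximal gradient descent).
    """
    m, n = A.shape
    x = np.zeros(n)
    w = np.ones(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L the Lipschitz constant
    for _ in range(outer):
        for _ in range(inner):
            grad = A.T @ (A @ x - y)
            x = soft_threshold(x - step * grad, step * lam * w)
        w = 1.0 / (np.abs(x) + eps)
    return x
```

The re-weighting step approximates an $\ell_0$-like penalty by a sequence of weighted $\ell_1$ problems; the LL1 framework of the abstract generalizes this family via lifting functions with guaranteed $\ell_0$ equivalence.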
  5. We present a general Bernoulli Gaussian scale mixture based approach for modeling priors that can represent a large class of random signals. For inference, we introduce belief propagation (BP) to multi-snapshot signal recovery based on the minimum mean square error estimation criterion. Our method relies on intra-snapshot messages that update the signal vector for each snapshot and inter-snapshot messages that share probabilistic information related to the common sparsity structure across snapshots. Despite the very general model, our BP method can efficiently compute accurate approximations of marginal posterior PDFs. Preliminary numerical results illustrate the superior convergence rate and improved performance of the proposed method compared to approaches based on sparse Bayesian learning (SBL).