

Title: Estimation of Differential Graphs via Log-Sum Penalized D-Trace Loss
We consider the problem of estimating differences in two Gaussian graphical models (GGMs) that are known to have similar structure. A GGM's structure is encoded in its precision (inverse covariance) matrix. In many applications one is interested in estimating the difference between two precision matrices to characterize underlying changes in the conditional dependencies of two sets of data. Most existing methods for differential graph estimation are based on a lasso-penalized loss function. In this paper, we analyze a log-sum penalized D-trace loss function approach for differential graph learning. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency of estimation in high-dimensional settings is provided. We illustrate our approach using a numerical example in which the log-sum penalized D-trace loss significantly outperforms both the lasso-penalized D-trace loss and the smoothly clipped absolute deviation (SCAD) penalized D-trace loss.
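The objective described in the abstract can be sketched numerically. Below is a minimal illustration of a log-sum penalized D-trace objective evaluated at a candidate difference matrix, assuming a standard symmetrized D-trace loss form from the differential-graph literature and a log-sum penalty of the form lam * log(1 + |x|/eps); the exact scaling, parameterization, and symbol names are assumptions, not necessarily the paper's formulation:

```python
import numpy as np

def dtrace_objective(Delta, S1, S2, lam=0.1, eps=0.01):
    """Log-sum penalized D-trace objective for a candidate difference
    matrix Delta (approximating Omega2 - Omega1), given sample
    covariances S1, S2. Scaling and parameter names are assumptions.

    D-trace loss (a standard symmetrized form):
        L(Delta) = (1/4)[tr(S1 Delta S2 Delta) + tr(S2 Delta S1 Delta)]
                   - tr(Delta (S1 - S2))
    Log-sum penalty:
        P(Delta) = lam * sum_ij log(1 + |Delta_ij| / eps)
    """
    quad = 0.25 * (np.trace(S1 @ Delta @ S2 @ Delta)
                   + np.trace(S2 @ Delta @ S1 @ Delta))
    lin = np.trace(Delta @ (S1 - S2))
    pen = lam * np.sum(np.log1p(np.abs(Delta) / eps))
    return quad - lin + pen
```

At the population level the loss part is minimized at Delta = Omega2 - Omega1. Unlike the lasso's |Delta_ij|, the log-sum term flattens out for large entries, so large true differences incur less bias; this is the usual motivation for non-convex penalties such as log-sum and SCAD.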
Award ID(s):
2040536
PAR ID:
10462221
Journal Name:
2023 IEEE Statistical Signal Processing Workshop (SSP)
Page Range / eLocation ID:
240 to 244
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We consider the problem of estimating differences in two Gaussian graphical models (GGMs) which are known to have similar structure. The GGM structure is encoded in its precision (inverse covariance) matrix. In many applications one is interested in estimating the difference in two precision matrices to characterize underlying changes in conditional dependencies of two sets of data. Existing methods for differential graph estimation are based on single-attribute models where one associates a scalar random variable with each node. In multi-attribute graphical models, each node represents a random vector. In this paper, we analyze a group lasso penalized D-trace loss function approach for differential graph learning from multi-attribute data. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency in support recovery and estimation in high-dimensional settings is provided. We illustrate our approach using a numerical example where the multi-attribute approach is shown to outperform a single-attribute approach. 
  2. We consider the problem of estimating differences in two Gaussian graphical models (GGMs) which are known to have similar structure. The GGM structure is encoded in its precision (inverse covariance) matrix. In many applications one is interested in estimating the difference in two precision matrices to characterize underlying changes in conditional dependencies of two sets of data. Existing methods for differential graph estimation are based on single-attribute (SA) models where one associates a scalar random variable with each node. In multi-attribute (MA) graphical models, each node represents a random vector. In this paper, we analyze a group lasso penalized D-trace loss function approach for differential graph learning from multi-attribute data. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency in support recovery and estimation in high-dimensional settings is provided. Numerical results based on synthetic as well as real data are presented. 
  3. We consider the problem of estimating differences in two time series Gaussian graphical models (TSGGMs) which are known to have similar structure. The TSGGM structure is encoded in its inverse power spectral density (IPSD), just as the vector GGM structure is encoded in its precision (inverse covariance) matrix. Motivated by many applications, existing works estimate the difference between two precision matrices to characterize underlying changes in conditional dependencies of two sets of data consisting of independent and identically distributed observations. In this paper we consider estimation of the difference between two IPSDs to characterize underlying changes in conditional dependencies of two sets of time-dependent data. We analyze a group lasso penalized D-trace loss function approach in the frequency domain for differential graph learning, using Wirtinger calculus. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency of the IPSD difference estimator in high-dimensional settings is presented. We illustrate our approach using a numerical example.
  4. We consider the problem of estimating the structure of an undirected weighted sparse graph underlying a set of signals, exploiting both smoothness of the signals as well as their statistics. We augment the objective function of Kalofolias (2016) which is motivated by a signal smoothness viewpoint and imposes a Laplacian constraint, with a penalized log-likelihood objective function with a lasso constraint, motivated from a statistical viewpoint. Both of these objective functions are designed for estimation of sparse graphs. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the augmented objective function. Numerical results based on synthetic data show that the proposed approach improves upon Kalofolias (2016) in estimating the inverse covariance, and improves upon graphical lasso in estimating the graph topology. We also implement an adaptive version of the proposed algorithm following adaptive lasso of Zou (2006), and empirically show that it leads to further improvement in performance. 
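The adaptive lasso of Zou (2006), mentioned in the abstract above, amounts to reweighting the penalty by an initial consistent estimate. A minimal sketch of that reweighting idea follows; the function and parameter names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def adaptive_lasso_weights(beta_pilot, gamma=1.0, floor=1e-8):
    """Adaptive-lasso weights w_j = 1 / |beta_j|^gamma (Zou, 2006).
    beta_pilot: a consistent pilot estimate, e.g. from a first ordinary
    lasso pass. Coefficients near zero receive large weights, so the
    second weighted pass shrinks them aggressively toward zero, while
    large coefficients are penalized less, reducing lasso bias."""
    return 1.0 / np.maximum(np.abs(beta_pilot), floor) ** gamma
```

The second-stage objective then replaces the plain lasso penalty lam * sum |beta_j| with the weighted penalty lam * sum w_j |beta_j|.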
  5. We consider the problem of estimating the structure of an undirected weighted sparse graphical model of multivariate data under the assumption that the underlying distribution is multivariate totally positive of order 2, or equivalently, that all partial correlations are non-negative. Total positivity holds in several applications. The problem of Gaussian graphical model learning has been widely studied without the total positivity assumption, where the problem can be formulated as estimation of the sparse precision matrix that encodes conditional dependence between the random variables associated with the graph nodes. One approach that imposes total positivity is to assume that the precision matrix obeys Laplacian constraints, which include constraining the off-diagonal elements of the precision matrix to be non-positive. In this paper we investigate modifications to widely used penalized log-likelihood approaches that enforce total positivity but not the Laplacian structure. An alternating direction method of multipliers (ADMM) algorithm is presented for constrained optimization under total positivity with lasso as well as adaptive lasso penalties. Numerical results based on synthetic data show that the proposed constrained adaptive lasso approach significantly outperforms existing Laplacian-based approaches, both statistical and non-statistical (smoothness-based).