

Search for: All records

Award ID contains: 2040536


  1. We consider the problem of estimating the difference between two Gaussian graphical models (GGMs) that are known to have similar structure. A GGM's structure is encoded in its precision (inverse covariance) matrix. In many applications one is interested in estimating the difference between two precision matrices to characterize underlying changes in the conditional dependencies of two sets of data. Existing methods for differential graph estimation are based on single-attribute (SA) models, in which a scalar random variable is associated with each node. In multi-attribute (MA) graphical models, each node represents a random vector. In this paper, we analyze a group-lasso penalized D-trace loss function approach for differential graph learning from multi-attribute data. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency in support recovery and estimation in high-dimensional settings is provided. Numerical results based on synthetic as well as real data are presented. (A hedged sketch of the group soft-thresholding step such an ADMM iteration typically uses appears after this list.)
    Free, publicly-accessible full text available December 15, 2024
  2. We consider the problem of inferring the conditional independence graph (CIG) of a sparse, high-dimensional, stationary matrix-variate Gaussian time series. All past work on matrix graphical models assumes that i.i.d. observations of the matrix variate are available; here we allow dependent observations. We consider a sparse-group lasso based frequency-domain formulation of the problem with a Kronecker-decomposable power spectral density (PSD), and solve it via an alternating direction method of multipliers (ADMM) approach. The problem is bi-convex and is solved via flip-flop optimization. We provide sufficient conditions for local convergence in the Frobenius norm of the inverse PSD estimators to the true value. This result also yields a rate of convergence. We illustrate our approach using numerical examples. (A generic sketch of the flip-flop idea for Kronecker-decomposable covariance factors appears after this list.)
    Free, publicly-accessible full text available July 2, 2024
  3. We consider the problem of estimating the difference between two Gaussian graphical models (GGMs) that are known to have similar structure. A GGM's structure is encoded in its precision (inverse covariance) matrix. In many applications one is interested in estimating the difference between two precision matrices to characterize underlying changes in the conditional dependencies of two sets of data. Most existing methods for differential graph estimation are based on a lasso penalized loss function. In this paper, we analyze a log-sum penalized D-trace loss function approach for differential graph learning. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency in estimation in high-dimensional settings is provided. We illustrate our approach using a numerical example in which the log-sum penalized D-trace loss significantly outperforms the lasso-penalized D-trace loss as well as the smoothly clipped absolute deviation (SCAD) penalized D-trace loss. (A sketch of the log-sum penalty and its reweighted-lasso handling appears after this list.)
    Free, publicly-accessible full text available July 2, 2024
  4. We consider the problem of estimating the difference between two Gaussian graphical models (GGMs) that are known to have similar structure. A GGM's structure is encoded in its precision (inverse covariance) matrix. In many applications one is interested in estimating the difference between two precision matrices to characterize underlying changes in the conditional dependencies of two sets of data. Existing methods for differential graph estimation are based on single-attribute models, in which a scalar random variable is associated with each node. In multi-attribute graphical models, each node represents a random vector. In this paper, we analyze a group lasso penalized D-trace loss function approach for differential graph learning from multi-attribute data. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the objective function. Theoretical analysis establishing consistency in support recovery and estimation in high-dimensional settings is provided. We illustrate our approach using a numerical example in which the multi-attribute approach is shown to outperform a single-attribute approach. (A sketch of the D-trace loss itself appears after this list.)
    Free, publicly-accessible full text available June 4, 2024
  5. We consider the problem of inferring the conditional independence graph (CIG) of a high-dimensional stationary, multivariate long-range dependent (LRD) Gaussian time series. In a time series graph, each component of the vector series is represented by a distinct node, and associations between components are represented by edges between the corresponding nodes. In a recent work on graphical modeling of short-range dependent (SRD) Gaussian time series, the problem was cast as one of multi-attribute graph estimation for random vectors where a vector is associated with each node of the graph. At each node, the associated random vector consists of a time series component and its delayed copies. A theoretical analysis based on short-range dependence has been given in Tugnait (2022 ICASSP). In this paper we analyze this approach for LRD Gaussian time series and provide consistency results regarding the convergence, in the Frobenius norm, of the estimated inverse covariance matrix associated with the multi-attribute graph.
  6. We consider the problem of inferring the conditional independence graph (CIG) of a high-dimensional stationary multivariate Gaussian time series. A sparse-group lasso based frequency-domain formulation of the problem has been considered in the literature, where the objective is to estimate the sparse inverse power spectral density (PSD) of the data; the CIG is then inferred from the estimated inverse PSD. In this paper we investigate the use of a sparse-group log-sum penalty (LSP) instead of the sparse-group lasso penalty. An alternating direction method of multipliers (ADMM) approach for iterative optimization of the non-convex problem is presented. We provide sufficient conditions for local convergence in the Frobenius norm of the inverse PSD estimators to the true value. This result also yields a rate of convergence. We illustrate our approach using numerical examples utilizing both synthetic and real data. (An illustrative sparse-group log-sum penalty appears after this list.)
  7. We consider the problem of inferring the conditional independence graph (CIG) of a high-dimensional stationary multivariate Gaussian time series. In a time series graph, each component of the vector series is represented by a distinct node, and associations between components are represented by edges between the corresponding nodes. We formulate the problem as one of multi-attribute graph estimation for random vectors, where a vector is associated with each node of the graph. At each node, the associated random vector consists of a time series component and its delayed copies. We present an alternating direction method of multipliers (ADMM) solution to minimize a sparse-group lasso penalized negative pseudo log-likelihood objective function to estimate the precision matrix of the random vector associated with the entire multi-attribute graph. The time series CIG is then inferred from the estimated precision matrix. A theoretical analysis is provided. Numerical results illustrate the proposed approach, which outperforms existing frequency-domain approaches in correctly detecting the graph edges. (A sketch of the delayed-copies construction appears after this list.)
  8. We consider the problem of estimating the structure of an undirected weighted sparse graph underlying a set of signals, exploiting both the smoothness of the signals and their statistics. We augment the objective function of Kalofolias (2016), which is motivated by a signal-smoothness viewpoint and imposes a Laplacian constraint, with a lasso-penalized log-likelihood objective function motivated by a statistical viewpoint. Both of these objective functions are designed for estimation of sparse graphs. An alternating direction method of multipliers (ADMM) algorithm is presented to optimize the augmented objective function. Numerical results based on synthetic data show that the proposed approach improves upon Kalofolias (2016) in estimating the inverse covariance, and improves upon graphical lasso in estimating the graph topology. We also implement an adaptive version of the proposed algorithm following the adaptive lasso of Zou (2006), and empirically show that it leads to further improvement in performance. (A generic sketch of adaptive-lasso weighting appears after this list.)
  9. We consider the problem of estimating the structure of an undirected weighted sparse graphical model of multivariate data under the assumption that the underlying distribution is multivariate totally positive of order 2 (MTP2), or equivalently, that all partial correlations are non-negative. Total positivity holds in several applications. The problem of Gaussian graphical model learning has been widely studied without the total positivity assumption, where it can be formulated as estimation of the sparse precision matrix that encodes conditional dependence between the random variables associated with the graph nodes. One approach that imposes total positivity is to assume that the precision matrix obeys Laplacian constraints, which include constraining the off-diagonal elements of the precision matrix to be non-positive. In this paper we investigate modifications to widely used penalized log-likelihood approaches that enforce total positivity but not the Laplacian structure. An alternating direction method of multipliers (ADMM) algorithm is presented for constrained optimization under total positivity with lasso as well as adaptive lasso penalties. Numerical results based on synthetic data show that the proposed constrained adaptive lasso approach significantly outperforms existing Laplacian-based approaches, both statistical and non-statistical (smoothness-based). (A sketch of the sign constraint implied by total positivity appears after this list.)
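The following is a minimal, hedged sketch (Python/NumPy) of the group (block) soft-thresholding proximal step that an ADMM iteration for the group-lasso penalized D-trace loss of items 1 and 4 would typically apply. The function name, the m x m blocking convention, and the parameter lam are illustrative assumptions, not code from the papers.

```python
import numpy as np

def block_soft_threshold(A, lam, m):
    """Group (block) soft-thresholding: A is a (p*m) x (p*m) matrix partitioned
    into m x m blocks, one per node pair of a p-node multi-attribute graph.
    Each block is shrunk toward zero in Frobenius norm by lam. This is the kind
    of proximal step an ADMM iteration would apply for a group-lasso penalty;
    the blocking convention here is an illustrative assumption."""
    p = A.shape[0] // m
    out = np.zeros_like(A)
    for i in range(p):
        for j in range(p):
            blk = A[i * m:(i + 1) * m, j * m:(j + 1) * m]
            nrm = np.linalg.norm(blk, "fro")
            if nrm > lam:
                out[i * m:(i + 1) * m, j * m:(j + 1) * m] = (1.0 - lam / nrm) * blk
    return out
```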
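For the Kronecker-decomposable PSD model of item 2, the bi-convex "flip-flop" idea can be illustrated with the classical, unpenalized matrix-normal covariance estimator below. The paper itself works with sparse-group lasso penalized frequency-domain estimates and dependent samples, so this is only a sketch of the alternating structure.

```python
import numpy as np

def kronecker_flipflop(X, iters=20):
    """Plain flip-flop estimation of a Kronecker-decomposable covariance
    A (kron) B from matrix-valued samples X[k] of shape (p, q): alternately fix
    one factor and solve for the other. Illustrative only; not the paper's
    penalized, frequency-domain formulation."""
    n, p, q = X.shape
    A = np.eye(p)  # row-factor covariance
    B = np.eye(q)  # column-factor covariance
    for _ in range(iters):
        B_inv = np.linalg.inv(B)
        A = sum(Xk @ B_inv @ Xk.T for Xk in X) / (n * q)
        A_inv = np.linalg.inv(A)
        B = sum(Xk.T @ A_inv @ Xk for Xk in X) / (n * p)
    return A, B
```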
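The log-sum penalty of item 3, and the reweighted-lasso way such penalties are commonly handled, can be sketched as follows. The parameterization (lam, eps) and the reweighting rule are generic illustrations, not the exact majorization used inside the paper's ADMM.

```python
import numpy as np

def log_sum_penalty(Delta, lam, eps):
    """Log-sum penalty lam * sum_ij log(1 + |Delta_ij| / eps): a non-convex
    surrogate that shrinks large entries far less than the lasso does."""
    return lam * np.sum(np.log1p(np.abs(Delta) / eps))

def reweighted_l1_weights(Delta, lam, eps):
    """A common way to handle the log-sum penalty is iterative reweighting:
    entry (i, j) gets weight lam / (eps + |Delta_ij|), so each subproblem
    becomes a weighted-lasso penalized problem. Generic sketch only."""
    return lam / (eps + np.abs(Delta))
```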
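For items 1 and 4, a plain (unpenalized) form of the symmetrized D-trace loss for the precision-matrix difference can be written as below; the exact scaling and sign convention used in the papers may differ.

```python
import numpy as np

def dtrace_loss(Delta, S1, S2):
    """Symmetrized D-trace loss for the precision-matrix difference Delta given
    sample covariances S1 and S2. Its stationarity condition is
    0.5 * (S1 @ Delta @ S2 + S2 @ Delta @ S1) = S1 - S2, which the population
    difference of inverse covariances satisfies. Illustrative convention only."""
    quad = 0.25 * (np.trace(S1 @ Delta @ S2 @ Delta)
                   + np.trace(S2 @ Delta @ S1 @ Delta))
    return quad - np.trace(Delta @ (S1 - S2))
```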
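A sparse-group log-sum penalty of the kind item 6 refers to, applied to the off-diagonal blocks of an inverse PSD estimate, might look as follows; the split into elementwise and blockwise terms and the parameters (lam1, lam2, eps) are assumptions for illustration.

```python
import numpy as np

def sparse_group_log_sum(blocks, lam1, lam2, eps):
    """Sparse-group log-sum penalty over a list of off-diagonal blocks: an
    elementwise log-sum term (sparsity within each block) plus a blockwise
    log-sum term on Frobenius norms (sparsity across blocks). Illustrative
    parameterization, not the paper's exact form."""
    elem = lam1 * sum(np.sum(np.log1p(np.abs(B) / eps)) for B in blocks)
    group = lam2 * sum(np.log1p(np.linalg.norm(B, "fro") / eps) for B in blocks)
    return elem + group
```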
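The delayed-copies construction described in items 5 and 7, which turns a p-variate time series into multi-attribute node vectors, can be sketched as follows; the number of lags d and the stacking order are illustrative choices, not specifics from the papers.

```python
import numpy as np

def build_node_vectors(x, d):
    """Form multi-attribute 'node vectors' from a p-variate time series x of
    shape (T, p): node i at time t carries [x_t[i], x_{t-1}[i], ..., x_{t-d}[i]],
    so each node has d + 1 attributes and the stacked vector has p * (d + 1)
    entries."""
    T, p = x.shape
    rows = []
    for t in range(d, T):
        # for each node, stack its current value and d delayed copies
        v = np.concatenate([x[t - np.arange(d + 1), i] for i in range(p)])
        rows.append(v)
    return np.asarray(rows)  # shape (T - d, p * (d + 1))
```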
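The adaptive variant mentioned in item 8 follows the adaptive lasso of Zou (2006); a generic sketch of such data-dependent penalty weights is below, with the pilot estimator and the exponent left as assumptions.

```python
import numpy as np

def adaptive_lasso_weights(Theta_pilot, gamma=1.0, tiny=1e-8):
    """Adaptive-lasso style weights in the spirit of Zou (2006): each entry of
    the penalty is scaled by 1 / (|pilot estimate| + tiny) ** gamma, so edges
    that look strong in a pilot estimate are penalized less in the next fit.
    The pilot estimator and gamma used in the paper are not specified here."""
    return 1.0 / (np.abs(Theta_pilot) + tiny) ** gamma
```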
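Item 9's total positivity assumption implies non-positive off-diagonal entries of the precision matrix. One plausible ingredient of a constrained ADMM update is the simple clipping step sketched below, which the papers' actual algorithm may combine with soft-thresholding and a positive-definiteness correction.

```python
import numpy as np

def clip_offdiagonal_nonpositive(Theta):
    """Enforce the sign pattern implied by total positivity (MTP2): clip the
    off-diagonal entries of a symmetric precision-matrix iterate to be
    non-positive while leaving the diagonal untouched."""
    out = np.minimum(Theta, 0.0)            # clip entries above zero
    np.fill_diagonal(out, np.diag(Theta))   # restore the diagonal
    return out
```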