Search for: All records

Creators/Authors contains: "Wei, Shuangqing"


  1. In this paper, we investigate the factors that determine the Chernoff information in distinguishing a set of Gaussian graphs. We show that the Chernoff information between two Gaussian graphs is determined by the generalized eigenvalues of their covariance matrices. We find that unit generalized eigenvalues do not affect the Chernoff information, and that their corresponding dimensions carry no information for classification purposes. In addition, we provide a partial ordering, by Chernoff information, of a series of Gaussian trees connected by independent grafting operations. By exploiting the relationship between generalized eigenvalues and Chernoff information, we can perform optimal linear dimension reduction for classification with the least loss of classification information.
  2. We formulate Wyner's common information for random vectors x ∈ R^n with a joint Gaussian density. We show that finding the common information of Gaussian vectors is equivalent to maximizing the log-determinant of the additive Gaussian noise covariance matrix. We term this optimization problem the constrained minimum determinant factor analysis (CMDFA) problem. We establish the convexity of this problem and give necessary and sufficient conditions on the CMDFA solution. We study the algebraic properties of the CMDFA solution space, through which we analyze two sparse Gaussian graphical models, namely latent Gaussian stars and explicit Gaussian chains. Interestingly, we show that, depending on the pairwise covariance values in a Gaussian graphical structure, one may not always end up with the same parameters and structure for the CMDFA solution as those found via graphical learning algorithms. In addition, our results suggest that Gaussian chains have little room left for dimension reduction in terms of the number of latent variables required in their CMDFA solutions.
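The first abstract's central claim, that the Chernoff information between two zero-mean Gaussians is a function of the generalized eigenvalues of their covariance matrices, and that unit eigenvalues contribute nothing, can be checked numerically. The sketch below is an illustration under our own assumptions (the function name, the whitening-based computation of generalized eigenvalues, and the grid search over the Chernoff exponent s are our choices, not the paper's code):

```python
import numpy as np

def chernoff_information(S0, S1, num_s=501):
    """Chernoff information between N(0, S0) and N(0, S1), expressed via
    the generalized eigenvalues lam_i of the pencil (S1, S0)."""
    S0 = np.asarray(S0, dtype=float)
    S1 = np.asarray(S1, dtype=float)
    # Whiten by S0: the eigenvalues of L^{-1} S1 L^{-T} (L = chol(S0))
    # are the generalized eigenvalues of S1 v = lam * S0 v.
    L = np.linalg.cholesky(S0)
    Li = np.linalg.inv(L)
    lam = np.linalg.eigvalsh(Li @ S1 @ Li.T)
    # -log int p0^s p1^(1-s) = 0.5 * sum_i log(s lam_i^(1-s) + (1-s) lam_i^(-s));
    # a unit eigenvalue gives log(s + 1 - s) = 0, so it carries no information.
    s = np.linspace(1e-6, 1 - 1e-6, num_s)[:, None]
    f = 0.5 * np.log(s * lam ** (1 - s) + (1 - s) * lam ** (-s)).sum(axis=1)
    # Chernoff information maximizes this exponent over s in (0, 1).
    return f.max()
```

For example, the pair diag(1, 1) vs. diag(1, 4) has generalized eigenvalues {1, 4}, and the unit eigenvalue drops out, so the result coincides with the 1-D case with eigenvalue 4.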
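The second abstract's log-determinant characterization can be illustrated in the simplest bivariate case. The sketch below is our own toy construction, not the paper's algorithm: it assumes unit variances and correlation 0 < rho < 1, maximizes the log-determinant of a diagonal additive-noise covariance D subject to Sigma_x - D staying positive semidefinite, and compares against the known closed form (1/2) log((1+rho)/(1-rho)) for Wyner's common information of a bivariate Gaussian:

```python
import numpy as np

def wyner_ci_bivariate(rho, num=100001):
    """CMDFA-style computation for Sigma_x = [[1, rho], [rho, 1]]:
    maximize log det D over diagonal noise D = diag(d1, d2) such that
    Sigma_x - D is PSD, then return 0.5*(log det Sigma_x - max log det D)."""
    # Sigma_x - D is PSD iff d1 <= 1, d2 <= 1 and (1 - d1)(1 - d2) >= rho^2,
    # so for each feasible d1 the largest d2 is 1 - rho^2 / (1 - d1).
    d1 = np.linspace(1e-9, 1 - rho**2 - 1e-9, num)
    d2 = 1.0 - rho**2 / (1.0 - d1)
    best_logdet = np.max(np.log(d1) + np.log(d2))
    return 0.5 * (np.log(1 - rho**2) - best_logdet)
```

The grid maximum lands at d1 = d2 = 1 - rho, giving (1/2) log((1 - rho^2)/(1 - rho)^2) = (1/2) log((1 + rho)/(1 - rho)), consistent with the equivalence between common information and the log-determinant maximization stated above.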