Title: Linear Complexity Entropy Regions
A great many problems in network information theory, both fundamental and applied, involve determining a minimal set of inequalities linking the Shannon entropies of a certain collection of subsets of random variables. In principle, this minimal set could be determined from the entropy region, whose dimensions correspond to all subsets of the random variables, by projecting out the dimensions corresponding to the irrelevant subsets. As a general solution technique, however, this method is plagued both by the incompletely known nature of the entropy region and by the exponential complexity of its bounds. Even worse, for four or more random variables, the set of linear information inequalities necessary to completely describe the entropy region is known to be uncountably infinite. A natural question then arises: are there nontrivial collections of subsets for which the inequalities linking only those subsets are both completely known and described by a number of inequalities that is linear in the number of random variables? This paper answers this question in the affirmative. A decomposition expressing the collection of inequalities linking a larger collection of subsets in terms of those linking smaller collections of subsets is first proven. This decomposition is then used to exhibit systems of subsets for which it exhaustively determines the complete list of inequalities, a list whose size is linear in the number of random variables.
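The "inequalities linking the Shannon entropies of subsets" mentioned in the abstract include classical laws such as submodularity, $H(A) + H(B) \ge H(A \cup B) + H(A \cap B)$. A minimal sketch (not from the paper) checking this numerically for every pair of subsets of three binary random variables with an arbitrary joint distribution:

```python
# Illustration: Shannon entropies of subsets of random variables satisfy
# linear inequalities such as submodularity,
#   H(A) + H(B) >= H(A ∪ B) + H(A ∩ B).
# We verify this for all subset pairs of three binary random variables
# under a randomly drawn joint distribution.
import itertools
import math
import random

random.seed(0)

# Random joint distribution over (X1, X2, X3), each binary.
outcomes = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

def H(subset):
    """Shannon entropy (bits) of the marginal on the given variable indices."""
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in sorted(subset))
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

subsets = [frozenset(s) for r in range(4) for s in itertools.combinations(range(3), r)]
for A, B in itertools.product(subsets, repeat=2):
    assert H(A) + H(B) >= H(A | B) + H(A & B) - 1e-9
print("submodularity holds for all subset pairs")
```

The entropy region is exactly the closure of the set of such entropy vectors $(H(A))_{A}$ over all joint distributions; for four or more variables, submodularity-type (Shannon) inequalities no longer suffice to describe it.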
Award ID(s):
1812965
PAR ID:
10316108
Author(s) / Creator(s):
Date Published:
Journal Name:
2021 IEEE International Symposium on Information Theory (ISIT)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We establish a family of sharp entropy inequalities with Gaussian extremizers. These inequalities hold for certain dependent random variables, namely entropy-maximizing couplings subject to information constraints. Several well-known results, such as the Zamir–Feder and Brunn–Minkowski inequalities, follow as special cases. 
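The phrase "Gaussian extremizers" above refers to inequalities whose equality cases are Gaussian. A minimal numeric sketch (classical background fact, not the paper's construction): among all densities with a fixed variance, the Gaussian maximizes differential entropy, as the closed-form entropies at unit variance show.

```python
# Illustration of the classical fact behind "Gaussian extremizers":
# at fixed variance, the Gaussian maximizes differential entropy.
# Closed-form entropies (nats) at unit variance:
import math

var = 1.0

# Gaussian N(0, var):  h = 0.5 * ln(2*pi*e*var)
h_gauss = 0.5 * math.log(2 * math.pi * math.e * var)

# Uniform on [-a, a] with variance a^2/3 = var:  h = ln(2a)
a = math.sqrt(3 * var)
h_unif = math.log(2 * a)

# Laplace with scale b, variance 2*b^2 = var:  h = 1 + ln(2b)
b = math.sqrt(var / 2)
h_lap = 1 + math.log(2 * b)

print(f"Gaussian {h_gauss:.4f}  Laplace {h_lap:.4f}  Uniform {h_unif:.4f}")
assert h_gauss > h_lap > h_unif
```

The paper's contribution concerns sharper statements for dependent variables (entropy-maximizing couplings); the sketch above only illustrates the unconstrained single-variable case.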
  2. We establish a family of sharp entropy inequalities with Gaussian extremizers. These inequalities hold for certain dependent random variables, namely entropy-maximizing couplings subject to information constraints. Several well-known results, such as the Zamir–Feder and Brunn–Minkowski inequalities, follow as special cases.
  3. Summary We consider the problem of testing for the presence of linear relationships between large sets of random variables based on a postselection inference approach to canonical correlation analysis. The challenge is to adjust for the selection of subsets of variables having linear combinations with maximal sample correlation. To this end, we construct a stabilized one-step estimator of the Euclidean norm of the canonical correlations maximized over subsets of variables of prespecified cardinality. This estimator is shown to be consistent for its target parameter and asymptotically normal, provided the dimensions of the variables do not grow too quickly with sample size. We also develop a greedy search algorithm to accurately compute the estimator, leading to a computationally tractable omnibus test for the global null hypothesis that there are no linear relationships between any subsets of variables having the prespecified cardinality. We further develop a confidence interval that takes the variable selection into account. 
  4. We introduce a notion called entropic independence that is an entropic analog of spectral notions of high-dimensional expansion. Informally, entropic independence of a background distribution $\mu$ on $k$-sized subsets of a ground set of elements says that for any (possibly randomly chosen) set $S$, the relative entropy of a single element of $S$ drawn uniformly at random carries at most $O(1/k)$ fraction of the relative entropy of $S$. Entropic independence is the analog of the notion of spectral independence, if one replaces variance by entropy. We use entropic independence to derive tight mixing time bounds, overcoming the lossy nature of spectral analysis of Markov chains on exponential-sized state spaces. In our main technical result, we show a general way of deriving entropy contraction, a.k.a. modified log-Sobolev inequalities, for down-up random walks from spectral notions. We show that spectral independence of a distribution under arbitrary external fields automatically implies entropic independence. We furthermore extend our theory to the case where spectral independence does not hold under arbitrary external fields. To do this, we introduce a framework for obtaining tight mixing time bounds for Markov chains based on what we call restricted modified log-Sobolev inequalities, which guarantee entropy contraction not for all distributions, but for those in a sufficiently large neighborhood of the stationary distribution. To derive our results, we relate entropic independence to properties of polynomials: $\mu$ is entropically independent exactly when a transformed version of the generating polynomial of $\mu$ is upper bounded by its linear tangent; this property is implied by concavity of the said transformation, which was shown by prior work to be locally equivalent to spectral independence.
We apply our results to obtain (1) tight modified log-Sobolev inequalities and mixing times for multi-step down-up walks on fractionally log-concave distributions, (2) the tight mixing time of $O(n\log n)$ for Glauber dynamics on Ising models whose interaction matrix has eigenspectrum lying within an interval of length smaller than $1$, improving upon the prior quadratic dependence on $n$, and (3) nearly-linear time $\widetilde O_{\delta}(n)$ samplers for the hardcore and Ising models on $n$-node graphs that have $\delta$-relative gap to the tree-uniqueness threshold. In the last application, our bound on the running time does not depend on the maximum degree $\Delta$ of the graph, and is therefore optimal even for high-degree graphs, and in fact, is sublinear in the size of the graph for high-degree graphs.
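The $O(1/k)$ factor in the entropic independence definition above has a classical analogue that is easy to check (an illustration only, not the paper's definition): for product measures, the chain rule gives $D(\nu^{\otimes k} \,\|\, \mu^{\otimes k}) = k \cdot D(\nu \,\|\, \mu)$, so each coordinate carries exactly a $1/k$ fraction of the relative entropy.

```python
# Classical tensorization of relative entropy (KL divergence):
#   D(nu^k || mu^k) = k * D(nu || mu)
# for product measures, so a single coordinate carries exactly a 1/k
# fraction -- the flavor of the O(1/k) factor in entropic independence.
import itertools
import math

mu = [0.5, 0.5]
nu = [0.8, 0.2]

def kl(p, q):
    """Relative entropy D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

k = 3
# Product distributions over {0,1}^k.
mu_k = {x: math.prod(mu[i] for i in x) for x in itertools.product(range(2), repeat=k)}
nu_k = {x: math.prod(nu[i] for i in x) for x in itertools.product(range(2), repeat=k)}
d_joint = sum(p * math.log(p / mu_k[x]) for x, p in nu_k.items() if p > 0)

assert abs(d_joint - k * kl(nu, mu)) < 1e-12
print(f"D(nu^k||mu^k) = {d_joint:.6f} = {k} * D(nu||mu) = {k * kl(nu, mu):.6f}")
```

The substance of the paper is establishing a comparable per-element bound for genuinely dependent distributions on $k$-sized subsets, where no product structure is available.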
  5. Recent work of C. Fefferman and the first author [8] has demonstrated that a certain linear system of equations has a solution if and only if its data satisfy a certain finite collection of partial differential equations; here, the coefficients of the system are fixed semialgebraic functions. In this paper, we consider the analogous problem for systems of linear inequalities. Our main result is a negative one, demonstrated by counterexample: the existence of a solution F may not, in general, be determined via an analogous finite set of partial differential inequalities.