A great many problems in network information theory, both fundamental and applied, involve determining a minimal set of inequalities linking the Shannon entropies of a certain collection of subsets of random variables. In principle, this minimal set could be determined from the entropy region, whose coordinates are indexed by all subsets of the random variables, by projecting out the dimensions corresponding to the irrelevant subsets. As a general solution technique, however, this method is hampered both by the incompletely characterized nature of the entropy region and by the exponential complexity of its bounds. Worse still, for four or more random variables, the set of linear information inequalities necessary to completely describe the entropy region is known to be uncountably infinite. A natural question, then, is whether there are nontrivial collections of subsets for which the inequalities linking only those subsets are completely known and number only linearly in the number of random variables. This paper answers this question in the affirmative. A decomposition expressing the collection of inequalities linking a larger collection of subsets in terms of those of smaller collections is first proven. This decomposition is then used to exhibit systems of subsets for which it exhaustively determines the complete list of inequalities, a list whose size is linear in the number of variables.
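The entropy-region viewpoint can be illustrated numerically. The sketch below (illustrative Python, not from the paper; `subset_entropy` is a hypothetical helper) builds the empirical entropy vector of three binary random variables, one coordinate per nonempty subset, and checks one Shannon-type inequality, submodularity, on it.

```python
import itertools
from collections import Counter
from math import log2

def subset_entropy(samples, idx):
    """Empirical Shannon entropy H(X_S) of the variables at positions idx."""
    counts = Counter(tuple(s[i] for i in idx) for s in samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Joint samples of three binary variables (X1, X2, X3 = X1 XOR X2),
# taken uniformly over the support.
samples = [(a, b, a ^ b) for a in (0, 1) for b in (0, 1)]

# Entropy vector: one coordinate per nonempty subset of {0, 1, 2}.
h = {S: subset_entropy(samples, S)
     for r in range(1, 4) for S in itertools.combinations(range(3), r)}

# Submodularity H(A) + H(B) >= H(A ∪ B) + H(A ∩ B), with A={0,1}, B={1,2}.
assert h[(0, 1)] + h[(1, 2)] >= h[(0, 1, 2)] + h[(1,)] - 1e-9
```

Projecting this vector onto a chosen family of subsets (e.g. only the pairs) is exactly the operation whose image the paper characterizes.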
Sharp Maximum-Entropy Comparisons
We establish a family of sharp entropy inequalities with Gaussian extremizers. These inequalities hold for certain dependent random variables, namely entropy-maximizing couplings subject to information constraints. Several well-known results, such as the Zamir–Feder and Brunn–Minkowski inequalities, follow as special cases.
NSF-PAR ID: 10303776
Date Published:
Journal Name: 2021 IEEE International Symposium on Information Theory (ISIT)
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this


Quantum states with geometric duals are known to satisfy a stricter set of entropy inequalities than those obeyed by general quantum systems. The set of allowed entropies derived using the Ryu–Takayanagi (RT) formula defines the Holographic Entropy Cone (HEC). These inequalities are no longer satisfied once general quantum corrections are included by employing the Quantum Extremal Surface (QES) prescription. Nevertheless, the structure of the QES formula allows for a controlled study of how quantum contributions from bulk entropies interplay with HEC inequalities. In this paper, we initiate an exploration of this problem by relating bulk entropy constraints to boundary entropy inequalities. In particular, we show that requiring the bulk entropies to satisfy the HEC implies that the boundary entropies also satisfy the HEC. Further, we also show that requiring the bulk entropies to obey monogamy of mutual information (MMI) implies the boundary entropies also obey MMI.

We introduce a notion called entropic independence that is an entropic analog of spectral notions of high-dimensional expansion. Informally, entropic independence of a background distribution $\mu$ on $k$-sized subsets of a ground set of elements says that for any (possibly randomly chosen) set $S$, the relative entropy of a single element of $S$ drawn uniformly at random carries at most an $O(1/k)$ fraction of the relative entropy of $S$. Entropic independence is the analog of the notion of spectral independence, if one replaces variance by entropy. We use entropic independence to derive tight mixing time bounds, overcoming the lossy nature of spectral analysis of Markov chains on exponential-sized state spaces. In our main technical result, we show a general way of deriving entropy contraction, a.k.a. modified log-Sobolev inequalities, for down-up random walks from spectral notions. We show that spectral independence of a distribution under arbitrary external fields automatically implies entropic independence. We furthermore extend our theory to the case where spectral independence does not hold under arbitrary external fields. To do this, we introduce a framework for obtaining tight mixing time bounds for Markov chains based on what we call restricted modified log-Sobolev inequalities, which guarantee entropy contraction not for all distributions, but for those in a sufficiently large neighborhood of the stationary distribution. To derive our results, we relate entropic independence to properties of polynomials: $\mu$ is entropically independent exactly when a transformed version of the generating polynomial of $\mu$ is upper bounded by its linear tangent; this property is implied by concavity of the said transformation, which was shown by prior work to be locally equivalent to spectral independence.
We apply our results to obtain (1) tight modified log-Sobolev inequalities and mixing times for multi-step down-up walks on fractionally log-concave distributions, (2) the tight mixing time of $O(n\log n)$ for Glauber dynamics on Ising models whose interaction matrix has eigenspectrum lying within an interval of length smaller than $1$, improving upon the prior quadratic dependence on $n$, and (3) nearly-linear time $\widetilde O_{\delta}(n)$ samplers for the hardcore and Ising models on $n$-node graphs that have $\delta$-relative gap to the tree-uniqueness threshold. In the last application, our bound on the running time does not depend on the maximum degree $\Delta$ of the graph, and is therefore optimal even for high-degree graphs; in fact, it is sublinear in the size of the graph for high-degree graphs.
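As a concrete reminder of what the Glauber dynamics in application (2) does (an illustrative Python sketch, not code from the paper; `glauber_ising_step` is a hypothetical helper name), each step resamples one uniformly random spin from its conditional distribution given its neighbors:

```python
import math
import random

def glauber_ising_step(spins, adj, beta, rng):
    """One Glauber (heat-bath) update: pick a uniformly random site and
    resample its spin conditioned on the current neighboring spins."""
    v = rng.randrange(len(spins))
    field = sum(spins[u] for u in adj[v])              # local field at v
    p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))  # P(spin_v = +1 | rest)
    spins[v] = 1 if rng.random() < p_up else -1

# Tiny example: ferromagnetic Ising model on a 4-cycle, inverse temp 0.3.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
rng = random.Random(0)
spins = [rng.choice((-1, 1)) for _ in range(4)]
for _ in range(100):
    glauber_ising_step(spins, adj, 0.3, rng)
```

The abstract's result is about how many such steps are needed to mix, $O(n \log n)$ under the stated eigenspectrum condition.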

Trace inequalities are general techniques with many applications in quantum information theory, often replacing the classical functional calculus in noncommutative settings. The physics of quantum field theory and holography, however, motivates entropy inequalities in type III von Neumann algebras that lack a semifinite trace. The Haagerup and Kosaki $L_p$ spaces enable re-expressing trace inequalities in nontracial von Neumann algebras. In particular, we show this for the generalized Araki–Lieb–Thirring and Golden–Thompson inequalities from the work of Sutter et al. [Commun. Math. Phys. 352(1), 37 (2017)]. Then, using the Haagerup approximation method, we prove a general von Neumann algebra version of universal recovery map corrections to the data processing inequality for relative entropy. We also show subharmonicity of a logarithmic $p$-fidelity of recovery. Furthermore, we prove that the non-decrease of relative entropy is equivalent to the existence of an $L_1$-isometry implementing the channel on both input states.
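In the finite-dimensional (matrix) setting, the Golden–Thompson inequality mentioned above, $\operatorname{tr} e^{A+B} \le \operatorname{tr}(e^A e^B)$ for Hermitian $A$, $B$, can be checked numerically. The sketch below is illustrative only; `herm_expm` and `random_hermitian` are hypothetical helper names.

```python
import numpy as np

def herm_expm(H):
    """Matrix exponential of a Hermitian matrix via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(w)) @ V.conj().T   # V diag(e^w) V^dagger

def random_hermitian(n, rng):
    """Random n x n Hermitian matrix with Gaussian entries."""
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (M + M.conj().T) / 2

rng = np.random.default_rng(0)
A = random_hermitian(4, rng)
B = random_hermitian(4, rng)

lhs = np.trace(herm_expm(A + B)).real               # tr e^{A+B}
rhs = np.trace(herm_expm(A) @ herm_expm(B)).real    # tr(e^A e^B)
assert lhs <= rhs + 1e-9                            # Golden-Thompson
```

The point of the paper's Haagerup/Kosaki machinery is precisely to make sense of such inequalities when no trace (and hence no such direct computation) is available.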
