The information-theoretic limits of community detection have been studied extensively for network models with high levels of symmetry or homogeneity. The contribution of this paper is to study a broader class of network models that allow for variability in the sizes and behaviors of the different communities, and thus better reflect the behaviors observed in real-world networks. Our results show that the ability to detect communities can be described succinctly in terms of a matrix of effective signal-to-noise ratios that provides a geometrical representation of the relationships between the different communities. This characterization follows from a matrix version of the I-MMSE relationship and generalizes the concept of an effective scalar signal-to-noise ratio introduced in previous work. We provide explicit formulas for the asymptotic per-node mutual information and upper bounds on the minimum mean-squared error. The theoretical results are supported by numerical simulations.
Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels
This paper focuses on the mutual information and minimum mean-squared error (MMSE) as a function of a matrix-valued signal-to-noise ratio (SNR) for a linear Gaussian channel with arbitrary input distribution. As shown by Lamarca, the mutual information is a concave function of a positive semidefinite matrix, which we call the matrix SNR. This implies that the mapping from the matrix SNR to the MMSE matrix is monotone decreasing. Building upon these functional properties, we begin to construct a unifying framework that bridges classical information-theoretic inequalities, such as the entropy power inequality, and interpolation techniques used in statistical physics and random matrix theory. This framework provides new insight into the structure of phase transitions in coding theory and compressed sensing. In particular, it is shown that the parallel combination of linear channels with freely independent matrices can be characterized succinctly via free convolution.
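For concreteness, one commonly cited form of the matrix I-MMSE identity can be sketched as follows; the channel parametrization and notation below are assumptions for illustration, not taken verbatim from the paper:

```latex
% Channel: Y = S^{1/2} X + N, with N ~ N(0, I) independent of X,
% and S a positive semidefinite matrix SNR.
% Gradient of the mutual information with respect to the matrix SNR:
\nabla_S \, I(S) = \tfrac{1}{2}\,\mathrm{MMSE}(S),
\qquad
\mathrm{MMSE}(S)
  = \mathbb{E}\!\left[\bigl(X - \mathbb{E}[X \mid Y]\bigr)
                      \bigl(X - \mathbb{E}[X \mid Y]\bigr)^{\top}\right].
```

Under this sketch, concavity of $I(S)$ makes its gradient, and hence the MMSE matrix, monotone decreasing in the positive semidefinite order, consistent with the monotonicity property stated above.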
 Award ID(s):
 1718494
 Publication Date:
 NSFPAR ID:
 10064522
 Journal Name:
 IEEE International Symposium on Information Theory
 Sponsoring Org:
 National Science Foundation
More Like this


A key aspect of the neural coding problem is understanding how representations of afferent stimuli are built through the dynamics of learning and adaptation within neural networks. The infomax paradigm is built on the premise that such learning attempts to maximize the mutual information between input stimuli and neural activities. In this letter, we tackle the problem of such information-based neural coding with an eye toward two conceptual hurdles. Specifically, we examine and then show how this form of coding can be achieved with online input processing. Our framework thus obviates the biological incompatibility of optimization methods that rely on global network awareness and batch processing of sensory signals. Central to our result is the use of variational bounds as a surrogate objective function, an established technique that has not previously been shown to yield online policies. We obtain learning dynamics for both linear-continuous and discrete spiking neural encoding models under the umbrella of linear Gaussian decoders. This result is enabled by approximating certain information quantities in terms of neuronal activity via pairwise feedback mechanisms. Furthermore, we tackle the problem of how such learning dynamics can be realized with strict energetic constraints. We show that endowing networks with auxiliary variables […]
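As an illustrative aside (not the letter's actual algorithm), the core idea of a variational surrogate with a linear Gaussian decoder can be sketched as follows: with decoder $q(x \mid r) = \mathcal{N}(Ar, \sigma^2 I)$, the bound $I(x;r) \ge H(x) + \mathbb{E}[\log q(x \mid r)]$ reduces, up to constants, to minimizing the online reconstruction error. All dimensions, noise levels, learning rates, and update rules here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 8, 3                     # stimulus dimension, number of neurons (assumed)
noise, eta = 0.1, 0.01          # response noise level, learning rate (assumed)
W = 0.1 * rng.standard_normal((k, d))   # encoder weights (stimulus -> response)
A = 0.1 * rng.standard_normal((d, k))   # linear Gaussian decoder weights

def recon_mse(W, A, X, rng):
    """Average reconstruction error of the decoder on a batch of stimuli."""
    R = X @ W.T + noise * rng.standard_normal((len(X), k))
    return float(np.mean(np.sum((X - R @ A.T) ** 2, axis=1)))

X_eval = rng.standard_normal((1000, d))
mse_before = recon_mse(W, A, X_eval, np.random.default_rng(1))

for _ in range(5000):                            # fully online: one stimulus per step
    x = rng.standard_normal(d)
    r = W @ x + noise * rng.standard_normal(k)   # noisy neural response
    err = x - A @ r                              # decoder residual
    A += eta * np.outer(err, r)                  # decoder step: local product err x r
    W += eta * np.outer(A.T @ err, x)            # encoder step: chain rule through decoder

mse_after = recon_mse(W, A, X_eval, np.random.default_rng(1))
```

Because each update touches only one stimulus and uses quantities locally available at the synapse (residual times activity), the sketch respects the online constraint that motivates the letter, though it omits the spiking models and energetic constraints treated there.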

We study the problem of community detection when there is covariate information about the node labels and one observes multiple correlated networks. We provide an asymptotic upper bound on the per-node mutual information as well as a heuristic analysis of a multivariate performance measure called the MMSE matrix. These results show that the combined effects of seemingly very different types of information can be characterized explicitly in terms of formulas involving low-dimensional estimation problems in additive Gaussian noise. Our analysis is supported by numerical simulations.

We consider the problem of estimating a $p$-dimensional vector $\beta$ from $n$ observations $Y = X\beta + W$, where $\beta_j \overset{\mathrm{i.i.d.}}{\sim} \pi$ for a real-valued distribution $\pi$ with zero mean and unit variance, $X_{ij} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0,1)$, and $W_i \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0,\sigma^2)$. In the asymptotic regime where $n/p \rightarrow \delta$ and $p/\sigma^2 \rightarrow \mathsf{snr}$ for two fixed constants $\delta, \mathsf{snr} \in (0,\infty)$ as $p \rightarrow \infty$, the limiting (normalized) minimum mean-squared error (MMSE) has been characterized by a single-letter (additive Gaussian scalar) channel. In this paper, we show that if the MMSE function of the single-letter channel converges to a step function, then the limiting MMSE of estimating $\beta$ converges to a step function which jumps from 1 to 0 at a critical threshold. Moreover, we establish that the limiting mean-squared error of the (MSE-optimal) approximate message passing algorithm also converges to a step function with a larger threshold, providing evidence for the presence of a computational-statistical gap between the two thresholds.
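To make the single-letter channel concrete, here is a small Monte Carlo sketch (illustrative only; the binary prior is an assumption, not the paper's setting) of the scalar-channel MMSE for $\pi = \mathrm{Uniform}\{\pm 1\}$, for which the posterior mean is $\tanh(\sqrt{s}\,y)$:

```python
import numpy as np

def mmse_single_letter(s, n=200_000, seed=0):
    """Monte Carlo estimate of mmse(s) for the scalar channel
    Y = sqrt(s) * B + Z with B ~ Uniform{-1, +1} and Z ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    b = rng.choice([-1.0, 1.0], size=n)
    y = np.sqrt(s) * b + rng.standard_normal(n)
    b_hat = np.tanh(np.sqrt(s) * y)      # exact posterior mean for this prior
    return float(np.mean((b - b_hat) ** 2))

# mmse(s) decreases from 1 (no information) toward 0 (near-perfect recovery)
curve = [mmse_single_letter(s) for s in (0.1, 1.0, 10.0)]
```

For a fixed prior like this one the single-letter MMSE decays smoothly in $s$; the step-function behavior in the paper concerns sequences of single-letter channels whose MMSE functions approach a step in the limit.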

In this paper, we consider amplify-and-forward relay networks in mmWave systems and propose a hybrid precoder/combiner design approach. The phase-only RF precoding/combining matrices are first designed to support multi-stream transmission, where we compensate the phase for the eigenmodes of the channel. Then, the baseband precoders/combiners are designed to achieve the maximum mutual information. Based on the data processing inequality for mutual information, we first jointly design the baseband source and relay nodes to maximize the mutual information before the destination baseband receiver. The proposed low-complexity iterative algorithm for the source and relay nodes is based on the equivalence between mutual information maximization and weighted MMSE minimization. After we obtain the optimal precoder and combiner for the source and relay nodes, we implement the MMSE-SIC filter at the baseband receiver to keep the mutual information unchanged, thus obtaining the optimal mutual information for the whole relay system. Simulation results show that our algorithm achieves better performance with lower complexity compared with other algorithms in the literature.
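The phase-compensation step for the RF stage can be sketched as follows: a minimal single-hop illustration with assumed antenna counts, stream count, and equal power allocation; the paper's joint source-relay baseband design is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nr, ns = 8, 4, 2            # tx antennas, rx antennas, streams (assumed sizes)
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

# Eigenmodes of the channel via SVD; columns of V are the right singular vectors.
U, sv, Vh = np.linalg.svd(H)
V = Vh.conj().T

# Phase-only RF precoder: keep only the phases of the leading eigenmodes
# (unit-modulus entries, as required by analog phase-shifter networks).
F_rf = np.exp(1j * np.angle(V[:, :ns])) / np.sqrt(nt)

# Mutual information of the effective channel H @ F_rf with equal power per stream
sigma2 = 0.1
Heff = H @ F_rf
mi = np.log2(np.linalg.det(np.eye(nr) + (Heff @ Heff.conj().T) / (ns * sigma2)).real)
```

The baseband stage would then operate on the reduced effective channel `Heff`, which is where the iterative weighted-MMSE design described in the abstract would come in.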