Title: Statistical physics of large-scale neural activity with loops
As experiments advance to record from tens of thousands of neurons, statistical physics provides a framework for understanding how collective activity emerges from networks of fine-scale correlations. While modeling these populations is tractable in loop-free networks, neural circuitry inherently contains feedback loops of connectivity. Here, for a class of networks with loops, we present an exact solution to the maximum entropy problem that scales to very large systems. This solution provides direct access to information-theoretic measures like the entropy of the model and the information contained in correlations, which are usually inaccessible at large scales. In turn, this allows us to search for the optimal network of correlations that contains the maximum information about population activity. Applying these methods to 45 recordings of approximately 10,000 neurons in the mouse visual system, we demonstrate that our framework captures more information—providing a better description of the population—than existing methods without loops. For a given population, our models perform even better during visual stimulation than during spontaneous activity; however, the inferred interactions overlap significantly, suggesting an underlying neural circuitry that remains consistent across stimuli. Generally, we construct an optimized framework for studying the statistical physics of large neural populations, with future applications extending to other biological networks.
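As an illustrative sketch only (not the paper's exact solution for loopy networks), the snippet below builds a pairwise maximum-entropy (Ising) model on a toy three-neuron loop by brute-force enumeration and computes the quantities the abstract highlights: the entropy of the model and the information carried by the correlations (the entropy drop relative to an independent model with matched firing rates). All parameter values are arbitrary toy choices; enumeration over 2^N patterns is exactly what becomes infeasible at N ≈ 10,000, which is the regime the paper's exact solution targets.

```python
import itertools
import numpy as np

def ising_distribution(h, J):
    """Brute-force pairwise maximum-entropy (Ising) distribution over binary
    patterns s in {0,1}^N: P(s) ~ exp(h.s + sum_{i<j} J_ij s_i s_j).
    Enumeration costs 2^N states, which is why exact entropies are normally
    out of reach for thousands of neurons."""
    N = len(h)
    states = np.array(list(itertools.product([0, 1], repeat=N)))
    energy = states @ h + np.einsum("ki,ij,kj->k", states, np.triu(J, 1), states)
    p = np.exp(energy)
    return states, p / p.sum()

def entropy_bits(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy 3-neuron network whose couplings form a loop (a triangle).
h = np.array([-1.0, -1.0, -1.0])
J = 0.5 * (np.ones((3, 3)) - np.eye(3))

states, p = ising_distribution(h, J)
S_model = entropy_bits(p)                 # entropy of the model
marg = p @ states                         # mean activity <s_i> of each neuron
# Entropy of the independent model with the same firing rates.
S_indep = float(sum(-m * np.log2(m) - (1 - m) * np.log2(1 - m) for m in marg))
I_corr = S_indep - S_model                # information in the correlations
```

Because the independent model is the maximum-entropy distribution given the firing rates alone, `I_corr` is guaranteed nonnegative; it is the "information contained in correlations" the abstract refers to.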
Award ID(s):
1734030
PAR ID:
10672193
Author(s) / Creator(s):
Publisher / Repository:
pnas.org
Date Published:
Journal Name:
Proceedings of the National Academy of Sciences
Volume:
122
Issue:
41
ISSN:
0027-8424
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Maximum-entropy methods provide a principled path connecting measurements of neural activity directly to statistical physics models, and this approach has been successful for populations of N ∼ 100 neurons. As N increases in new experiments, we enter an undersampled regime where we have to choose which observables should be constrained in the maximum-entropy construction. The best choice is the one that provides the greatest reduction in entropy, defining a “minimax entropy” principle. This principle becomes tractable if we restrict attention to correlations among pairs of neurons that link together into a tree; we can find the best tree efficiently, and the underlying statistical physics models are exactly solved. We use this approach to analyze experiments on N ∼ 1500 neurons in the mouse hippocampus, and we find that the resulting model captures key features of collective activity in the network.
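The "best tree" construction this abstract describes is, in its simplest form, the classic Chow–Liu procedure: among all trees, the one maximizing the summed pairwise mutual information gives the greatest entropy reduction. A minimal sketch with numpy on a synthetic three-neuron raster (toy data, not the hippocampal recordings or the authors' code):

```python
import numpy as np

def pairwise_mi_bits(spikes):
    """Mutual information (bits) between every pair of binary neurons,
    estimated from a (time x neurons) 0/1 raster."""
    T, N = spikes.shape
    MI = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1, N):
            mi = 0.0
            for a in (0, 1):
                for b in (0, 1):
                    pab = np.mean((spikes[:, i] == a) & (spikes[:, j] == b))
                    pa = np.mean(spikes[:, i] == a)
                    pb = np.mean(spikes[:, j] == b)
                    if pab > 0:
                        mi += pab * np.log2(pab / (pa * pb))
            MI[i, j] = MI[j, i] = mi
    return MI

def best_tree(MI):
    """Kruskal's algorithm for the maximum-weight spanning tree of the
    mutual-information graph (the Chow-Liu tree): the tree of pairwise
    constraints giving the largest entropy reduction."""
    N = MI.shape[0]
    parent = list(range(N))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    edges = sorted(((MI[i, j], i, j) for i in range(N) for j in range(i + 1, N)),
                   reverse=True)
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                 # adding this edge creates no cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree

rng = np.random.default_rng(0)
# Toy raster: neuron 1 copies neuron 0 with 90% reliability; neuron 2 is independent.
s0 = rng.integers(0, 2, 500)
s1 = np.where(rng.random(500) < 0.9, s0, 1 - s0)
s2 = rng.integers(0, 2, 500)
spikes = np.stack([s0, s1, s2], axis=1)
tree = best_tree(pairwise_mi_bits(spikes))
```

As expected, the strongly dependent pair (0, 1) is always included in the selected tree, since its mutual information dominates the other pairs.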
  2.
    Neural codes for sensory inputs have been hypothesized to reside in a broader space defined by ongoing patterns of spontaneous activity. To understand the structure of this spontaneous activity in the olfactory system, we performed high-density recordings of neural populations in the main olfactory bulb of awake mice. We observed changes in pairwise correlations of spontaneous activity between mitral and tufted (M/T) cells when animals were running, which resulted in an increase in the entropy of the population. Surprisingly, pairwise maximum entropy models that described the population activity using only assumptions about the firing rates and correlations of neurons were better at predicting the global structure of activity when animals were stationary as compared to when they were running, implying that higher order (3rd, 4th order) interactions governed population activity during locomotion. Taken together, we found that locomotion alters the functional interactions that shape spontaneous population activity at the earliest stages of olfactory processing, one synapse away from the sensory receptors in the nasal epithelium. These data suggest that the coding space available for sensory representations responds adaptively to the animal’s behavioral state. NEW & NOTEWORTHY The organization and structure of spontaneous population activity in the olfactory system places constraints on how odor information is represented. Using high-density electrophysiological recordings of mitral and tufted cells, we found that running increases the dimensionality of spontaneous activity, implicating higher order interactions among neurons during locomotion. Behavior, thus, flexibly alters neuronal activity at the earliest stages of sensory processing.
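To make the link between correlations and population entropy concrete, here is a toy numpy sketch (synthetic rasters, not the M/T-cell recordings): a weakly correlated binary population has higher pattern entropy than a strongly correlated one, mirroring the reported entropy increase when correlations change.

```python
import numpy as np

def pairwise_corr(spikes):
    """Pearson correlation matrix of a (time x neurons) binary raster."""
    return np.corrcoef(spikes, rowvar=False)

def pattern_entropy_bits(spikes):
    """Plug-in entropy of the distribution of binary population words
    (the quantity reported to increase during running)."""
    words, counts = np.unique(spikes, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
# Weakly correlated population: five nearly independent neurons.
weak = rng.integers(0, 2, (2000, 5))
# Strongly correlated population: five neurons tracking a common driver.
driver = rng.integers(0, 2, (2000, 1))
strong = np.where(rng.random((2000, 5)) < 0.8, driver, 1 - driver)

S_weak = pattern_entropy_bits(weak)
S_strong = pattern_entropy_bits(strong)
```

Note the plug-in estimator is biased downward for undersampled pattern distributions; with 5 neurons and 2000 samples the bias is negligible, but it is a real concern at the population sizes in these recordings.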
  3.
    How is information distributed across large neuronal populations within a given brain area? Information may be distributed roughly evenly across neuronal populations, so that total information scales linearly with the number of recorded neurons. Alternatively, the neural code might be highly redundant, meaning that total information saturates. Here we investigate how sensory information about the direction of a moving visual stimulus is distributed across hundreds of simultaneously recorded neurons in mouse primary visual cortex. We show that information scales sublinearly due to correlated noise in these populations. We compartmentalized noise correlations into information-limiting and nonlimiting components, then extrapolated to predict how information grows with even larger neural populations. We predict that tens of thousands of neurons encode 95% of the information about visual stimulus direction, much less than the number of neurons in primary visual cortex. These findings suggest that the brain uses a widely distributed, but nonetheless redundant code that supports recovering most sensory information from smaller subpopulations.
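The saturation mechanism can be sketched with the standard linear Fisher information I = f'ᵀ Σ⁻¹ f' and a rank-one "differential" noise component along the signal direction f'. This is a generic toy model (arbitrary tuning derivatives, identity private noise, strength `eps`), not the authors' fitted decomposition: as N grows, information rises sublinearly and is bounded above by 1/eps.

```python
import numpy as np

def linear_fisher(fprime, cov):
    """Linear Fisher information about the stimulus: I = f'^T Sigma^{-1} f'."""
    return float(fprime @ np.linalg.solve(cov, fprime))

rng = np.random.default_rng(2)
eps = 0.01   # strength of information-limiting (differential) correlations
info = []
for N in (50, 200, 800):
    fprime = rng.normal(0.0, 1.0, N)               # tuning-curve derivatives
    private = np.eye(N)                            # nonlimiting private noise
    cov = private + eps * np.outer(fprime, fprime) # + differential correlations
    info.append(linear_fisher(fprime, cov))
```

By the Sherman-Morrison identity, I = |f'|² / (1 + eps·|f'|²), so information grows with N but can never exceed 1/eps = 100 here, which is the sense in which differential correlations are "information-limiting."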
    Although much is known about how single neurons in the hippocampus represent an animal's position, how circuit interactions contribute to spatial coding is less well understood. Using a novel statistical estimator and theoretical modeling, both developed in the framework of maximum entropy models, we reveal highly structured CA1 cell-cell interactions in male rats during open field exploration. The statistics of these interactions depend on whether the animal is in a familiar or novel environment. In both conditions the circuit interactions optimize the encoding of spatial information, but for regimes that differ in the informativeness of their spatial inputs. This structure facilitates linear decodability, making the information easy to read out by downstream circuits. Overall, our findings suggest that the efficient coding hypothesis is not only applicable to individual neuron properties in the sensory periphery, but also to neural interactions in the central brain. SIGNIFICANCE STATEMENT Local circuit interactions play a key role in neural computation and are dynamically shaped by experience. However, measuring and assessing their effects during behavior remains a challenge. Here, we combine techniques from statistical physics and machine learning to develop new tools for determining the effects of local network interactions on neural population activity. This approach reveals highly structured local interactions between hippocampal neurons, which make the neural code more precise and easier to read out by downstream circuits, across different levels of experience. More generally, the novel combination of theory and data analysis in the framework of maximum entropy models enables traditional neural coding questions to be asked in naturalistic settings.
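As a toy illustration of the "linear decodability" being tested, a plain least-squares readout recovers position from population activity. Everything here is synthetic and assumed (Gaussian place-cell-like tuning, Poisson spiking, a 1-D track); it is not the authors' estimator, only the generic readout that downstream circuits are hypothesized to approximate.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 1000, 20
pos = rng.uniform(0.0, 1.0, T)                   # animal position on a toy track
centers = np.linspace(0.0, 1.0, N)               # place-field centers
rates = np.exp(-(pos[:, None] - centers) ** 2 / 0.02)  # Gaussian tuning curves
spikes = rng.poisson(rates * 5)                  # Poisson spike counts

# Linear readout: least-squares weights mapping population activity to position.
X = np.column_stack([spikes, np.ones(T)])        # add an intercept column
w, *_ = np.linalg.lstsq(X, pos, rcond=None)
pred = X @ w
err = float(np.sqrt(np.mean((pred - pos) ** 2)))
```

With overlapping place fields, the linear readout reconstructs position with error well below the spread of positions, which is the operational meaning of a linearly decodable code.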
    The inference of neuronal connectome from large-scale neuronal activity recordings, such as two-photon calcium imaging, represents an active area of research in computational neuroscience. In this work, we developed FARCI (Fast and Robust Connectome Inference), a MATLAB package for neuronal connectome inference from high-dimensional two-photon calcium fluorescence data. We employed partial correlations as a measure of the functional association strength between pairs of neurons to reconstruct a neuronal connectome. We demonstrated using in silico datasets from the Neural Connectomics Challenge (NCC) and those generated using the state-of-the-art simulator of Neural Anatomy and Optimal Microscopy (NAOMi) that FARCI provides an accurate connectome and its performance is robust to network sizes, missing neurons, and noise levels. Moreover, FARCI is computationally efficient and highly scalable to large networks. In comparison with the best-performing connectome inference algorithm in the NCC, Generalized Transfer Entropy (GTE), and the Fluorescence Single Neuron and Network Analysis Package (FluoroSNNAP), FARCI produces more accurate networks over different network sizes, while providing significantly better computational speed and scaling.
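The partial-correlation measure this abstract describes can be sketched generically via the precision (inverse covariance) matrix; this is a plain numpy illustration, not the FARCI MATLAB code. For a chain 0 → 1 → 2, the marginal correlation between neurons 0 and 2 is large, but the partial correlation, which conditions on neuron 1, vanishes — this is why partial correlations help separate direct from indirect functional links.

```python
import numpy as np

def partial_correlations(X):
    """Partial correlation between every pair of signals in a
    (time x neurons) array, controlling for all other neurons:
    pc_ij = -P_ij / sqrt(P_ii * P_jj), with P the precision matrix."""
    P = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(P))
    pc = -P / np.outer(d, d)
    np.fill_diagonal(pc, 1.0)
    return pc

rng = np.random.default_rng(4)
# Chain 0 -> 1 -> 2: neurons 0 and 2 are correlated only through neuron 1.
x0 = rng.normal(size=5000)
x1 = x0 + 0.5 * rng.normal(size=5000)
x2 = x1 + 0.5 * rng.normal(size=5000)
X = np.stack([x0, x1, x2], axis=1)

pc = partial_correlations(X)         # direct associations only
corr = np.corrcoef(X, rowvar=False)  # marginal correlations, incl. indirect
```

Here `corr[0, 2]` is large while `pc[0, 2]` is near zero, so thresholding partial correlations recovers the chain rather than a spurious triangle.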