Title: A unified theory for the origin of grid cells through the lens of pattern formation.
Grid cells in the brain fire in strikingly regular hexagonal patterns across space. There are currently two seemingly unrelated frameworks for understanding these patterns. Mechanistic models account for hexagonal firing fields as the result of pattern-forming dynamics in a recurrent neural network with hand-tuned center-surround connectivity. Normative models specify a neural architecture, a learning rule, and a navigational task, and observe that grid-like firing fields emerge due to the constraints of solving this task. Here we provide an analytic theory that unifies the two perspectives by casting the learning dynamics of neural networks trained on navigational tasks as a pattern-forming dynamical system. This theory provides insight into the optimal solutions of diverse formulations of the normative task, and shows that symmetries in the representation of space correctly predict the structure of learned firing fields in trained neural networks. Further, our theory proves that a nonnegativity constraint on firing rates induces a symmetry-breaking mechanism which favors hexagonal firing fields. We extend this theory to the case of learning multiple grid maps and demonstrate that optimal solutions consist of a hierarchy of maps with increasing length scales. These results unify previous accounts of grid cell firing and provide a novel framework for predicting the learned representations of recurrent neural networks.
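To make the mechanistic half of this picture concrete, the following is a minimal rate-model sketch in the spirit of standard continuous-attractor grid models. It is not the paper's code, and every parameter is an illustrative hand-tuned choice: a two-dimensional sheet of rectified (nonnegative) rate units with an inhibitory-surround recurrent kernel and uniform excitatory drive self-organizes into a periodic lattice of activity bumps, typically with hexagonal packing.

import numpy as np

# Minimal sketch (not the paper's code): pattern formation on a 2-D neural sheet.
# Units are rectified (nonnegative) rate neurons, recurrent connectivity is an
# inhibitory surround with zero self-coupling, and every unit receives the same
# excitatory drive b. All parameters are illustrative hand-tuned choices.
n, lam = 96, 13.0                 # sheet size and rough pattern length scale (in neurons)
beta = 3.0 / lam**2
gamma = 1.05 * beta
dt, tau, b = 0.5, 10.0, 1.0       # Euler step, time constant, uniform drive

# Periodic per-axis displacements on the sheet and the resulting squared distances.
d = (np.arange(n) + n // 2) % n - n // 2
dist2 = d[:, None] ** 2 + d[None, :] ** 2

# Inhibitory-surround kernel: zero at the center, negative elsewhere, so rates
# stay bounded by the feedforward drive b.
w = 1.5 * (np.exp(-gamma * dist2) - np.exp(-beta * dist2))
w_hat = np.fft.fft2(w)            # recurrent input computed as a circular convolution

rng = np.random.default_rng(0)
r = b * rng.uniform(0.9, 1.1, size=(n, n))        # near-uniform initial rates

for _ in range(3000):
    recurrent = np.real(np.fft.ifft2(w_hat * np.fft.fft2(r)))
    r += (dt / tau) * (-r + np.maximum(b + recurrent, 0.0))   # ReLU keeps rates nonnegative

# r now shows a periodic lattice of activity bumps (typically hexagonally packed);
# inspect it with matplotlib, e.g. plt.imshow(r).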
Award ID(s):
1845166
PAR ID:
10291261
Author(s) / Creator(s):
Date Published:
Journal Name:
Advances in Neural Information Processing Systems
Volume:
32
ISSN:
1049-5258
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Spatial periodicity in grid cell firing has been interpreted as a neural metric for space, providing animals with a coordinate system for navigating physical and mental spaces. However, the specific computational problem being solved by grid cells has remained elusive. Here, we provide mathematical proof that spatial periodicity in grid cell firing is the only possible solution to a neural sequence code of 2-D trajectories and that the hexagonal firing pattern of grid cells is the most parsimonious solution to such a sequence code. We thereby provide a likely teleological cause for the existence of grid cells and reveal the underlying nature of the global geometric organization in grid maps as a direct consequence of a simple local sequence code. A sequence code by grid cells provides intuitive explanations for many previously puzzling experimental observations and may transform our thinking about grid cells. (A toy construction of the idealized hexagonal firing field appears after this list.)
  2. Learned movements can be skillfully performed at different paces. What neural strategies produce this flexibility? Can they be predicted and understood by network modeling? We trained monkeys to perform a cycling task at different speeds, and trained artificial recurrent networks to generate the empirical muscle-activity patterns. Network solutions reflected the principle that smooth, well-behaved dynamics require low trajectory tangling. Network solutions had a consistent form, which yielded quantitative and qualitative predictions. To evaluate those predictions, we analyzed motor cortex activity recorded during the same task. Responses supported the hypothesis that the dominant neural signals reflect not muscle activity, but network-level strategies for generating muscle activity. Single-neuron responses were better accounted for by network activity than by muscle activity. Similarly, neural population trajectories shared their organization not with muscle trajectories, but with network solutions. Thus, cortical activity could be understood based on the need to generate muscle activity via dynamics that allow smooth, robust control over movement speed. (A sketch of one formulation of trajectory tangling appears after this list.)
  3. Aljadeff, Johnatan (Ed.)
    Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than with independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, the recurrent coupling strength, and the network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks. (A sketch of a numerical largest-Lyapunov-exponent estimate for a driven rate network appears after this list.)
  4. As a model of recurrent spiking neural networks, the Liquid State Machine (LSM) offers a powerful brain-inspired computing platform for pattern recognition and machine learning applications. Because it operates on neural spiking activity, the LSM naturally lends itself to an efficient hardware implementation that exploits the typically sparse firing patterns that emerge from the recurrent neural network and schedules computation around the firing events that occur at runtime. We explore these opportunities by presenting an LSM processor architecture with integrated on-chip learning and its FPGA implementation. Our LSM processor leverages the sparsity of firing activity to allow for efficient event-driven processing and activity-dependent clock gating. Using the spoken English letters adopted from the TI46 [1] speech recognition corpus as a benchmark, we show that the proposed FPGA-based neural processor system is up to 29% more energy efficient than a baseline LSM processor with little extra hardware overhead. (A minimal software LSM sketch appears after this list.)
  5. Sharpee, T (Ed.)
    Grid cells play a principal role in enabling cognitive representations of ambient environments. The key property of these cells, the regular arrangement of their firing fields, is commonly viewed as a means for establishing spatial scales or encoding specific locations. However, using grid cells' spiking outputs to deduce geometric orderliness proves to be a strenuous task due to the fairly irregular activation patterns triggered by the animal's sporadic visits to the grid fields. This article addresses the statistical mechanisms enabling emergent regularity of grid cell firing activity from the perspective of percolation theory. Using percolation phenomena to model the effect of the rat's moves through the lattices of firing fields sheds new light on the mechanisms of spatial information processing, spatial learning, path integration, and establishing spatial metrics. It is also shown that the physiological parameters required for spiking percolation match the experimental range, including the characteristic 2/3 ratio between the grid fields' size and the grid spacing, pointing to the biological viability of the approach. (A toy site-percolation sketch appears after this list.)
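Regarding item 1: as a purely illustrative aside (not taken from that article), the idealized hexagonal firing field it refers to can be written in closed form as a rectified sum of three plane waves whose wavevectors are 60 degrees apart. The spacing, arena size, and resolution below are arbitrary choices.

import numpy as np

# Toy construction (not from the article): an idealized hexagonal grid-cell
# firing field as a rectified sum of three plane waves at 60-degree angles.
spacing = 0.5                                   # grid period (arbitrary units)
k = 4 * np.pi / (np.sqrt(3) * spacing)          # wavevector magnitude for that period
angles = np.deg2rad([0, 60, 120])
ks = k * np.stack([np.cos(angles), np.sin(angles)], axis=1)   # three wavevectors

xs = np.linspace(0, 2.0, 200)                   # a 2 x 2 arena sampled on a grid
X, Y = np.meshgrid(xs, xs)
pos = np.stack([X, Y], axis=-1)                 # (200, 200, 2) positions

rate = np.cos(pos @ ks.T).sum(axis=-1)          # sum of three cosines
rate = np.maximum(rate, 0.0)                    # rectify: firing rates are nonnegative

# rate now peaks on a triangular (hexagonal) lattice with period `spacing`;
# visualize with matplotlib's imshow to see the familiar grid-cell map.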
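Regarding item 2: "trajectory tangling" is not defined in the abstract. The sketch below implements one formulation used in the motor-cortex literature, q(t) = max over t' of ||xdot(t) - xdot(t')||^2 / (||x(t) - x(t')||^2 + eps), purely to make the referenced quantity concrete; the function name and the choice of eps are my own illustrative assumptions, not the authors'.

import numpy as np

def trajectory_tangling(x, dt=1.0, alpha=0.1):
    """Tangling q(t) = max_{t'} ||xdot(t) - xdot(t')||^2 / (||x(t) - x(t')||^2 + eps),
    for a trajectory x of shape (T, N); eps is set to alpha times the total variance of x."""
    xdot = np.gradient(x, dt, axis=0)          # finite-difference time derivative
    eps = alpha * x.var(axis=0).sum()
    q = np.empty(len(x))
    for t in range(len(x)):
        num = ((xdot[t] - xdot) ** 2).sum(axis=1)
        den = ((x[t] - x) ** 2).sum(axis=1) + eps
        q[t] = (num / den).max()
    return q

# Example: a circle has low tangling, while a figure-eight revisits the same
# state with different derivatives and so has much higher tangling.
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
figure8 = np.stack([np.sin(t), np.sin(t) * np.cos(t)], axis=1)
print(trajectory_tangling(circle, dt=t[1] - t[0]).max(),
      trajectory_tangling(figure8, dt=t[1] - t[0]).max())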
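Regarding item 3: the sketch below shows only generic numerical machinery, a Benettin-style estimate of the largest Lyapunov exponent for a driven random rate network, with a switch between common and independent sinusoidal drive. It is not the paper's balanced-network model, and reproducing the specific common-versus-independent contrast described above would require that balanced excitatory-inhibitory architecture; here a positive return value simply indicates chaos and a negative one indicates that the drive has suppressed it.

import numpy as np

def lyapunov_estimate(common_input, n=200, g=2.0, amp=2.0, freq=0.1,
                      dt=0.05, steps=20000, d0=1e-8, seed=0):
    """Crude largest-Lyapunov-exponent estimate for a driven random rate network,
    from the growth of a small perturbation with per-step renormalization."""
    rng = np.random.default_rng(seed)
    J = g * rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))   # strong random coupling (chaotic when undriven)
    phases = np.zeros(n) if common_input else rng.uniform(0, 2 * np.pi, n)
    x = rng.normal(0.0, 1.0, n)
    v = rng.normal(0.0, 1.0, n)
    y = x + d0 * v / np.linalg.norm(v)                  # perturbed twin trajectory
    log_growth = 0.0
    for t in range(steps):
        drive = amp * np.sin(2 * np.pi * freq * t * dt + phases)
        x = x + dt * (-x + J @ np.tanh(x) + drive)      # Euler step for both copies
        y = y + dt * (-y + J @ np.tanh(y) + drive)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (d0 / d) * (y - x)                      # renormalize the separation
    return log_growth / (steps * dt)

# print(lyapunov_estimate(common_input=True), lyapunov_estimate(common_input=False))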
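Regarding item 4: a minimal software liquid state machine, to make the overall structure concrete: a fixed, sparse, random spiking reservoir whose low-pass-filtered activity feeds a trained linear readout. Names, sizes, and parameters are illustrative assumptions, not taken from the paper or its FPGA implementation; the reservoir's sparse firing is the property an event-driven hardware design can exploit.

import numpy as np

# Minimal software LSM sketch (illustrative only; not the processor described above):
# a fixed, sparse, random leaky integrate-and-fire reservoir driven by input spike
# trains, with low-pass-filtered reservoir spikes ("traces") used as features for
# a ridge-regression readout.
rng = np.random.default_rng(1)
n_in, n_res = 16, 200
w_in = rng.normal(0, 0.6, (n_res, n_in)) * (rng.random((n_res, n_in)) < 0.3)
w_res = rng.normal(0, 0.12, (n_res, n_res)) * (rng.random((n_res, n_res)) < 0.1)

def run_reservoir(in_spikes, v_thresh=1.0, leak=0.95, trace_decay=0.97):
    """in_spikes: (n_steps, n_in) binary array of input spikes.
    Returns the final trace vector, the 'liquid state' fed to the readout."""
    v = np.zeros(n_res)          # membrane potentials
    spikes = np.zeros(n_res)     # reservoir spikes from the previous step
    trace = np.zeros(n_res)      # low-pass-filtered spike history
    for t in range(in_spikes.shape[0]):
        v = leak * v + w_in @ in_spikes[t] + w_res @ spikes
        spikes = (v >= v_thresh).astype(float)
        v[spikes > 0] = 0.0                       # reset neurons that fired
        trace = trace_decay * trace + spikes
    return trace

def train_readout(features, labels, reg=1e-3):
    """Ridge regression from reservoir traces (n_examples, n_res) to one-hot labels."""
    A = features.T @ features + reg * np.eye(features.shape[1])
    return np.linalg.solve(A, features.T @ labels)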
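Regarding item 5: a toy site-percolation sketch, not the article's model, that treats grid firing fields as sites of a triangular lattice which the animal occupies at random and asks whether the occupied fields form a cluster spanning the arena. The triangular lattice is represented as a square grid with two extra diagonal neighbors, and the spanning probability rises sharply near the exact site-percolation threshold of 1/2.

import numpy as np
from collections import deque

# Toy site percolation on a triangular lattice (square grid plus two diagonal
# neighbors, six neighbors per site); illustrative only, not the article's model.
NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (-1, -1)]

def spans(occupied):
    """True if occupied sites form a cluster connecting the top row to the bottom row."""
    n = occupied.shape[0]
    seen = np.zeros_like(occupied, dtype=bool)
    queue = deque((0, j) for j in range(n) if occupied[0, j])
    for i, j in queue:
        seen[i, j] = True
    while queue:
        i, j = queue.popleft()
        if i == n - 1:
            return True
        for di, dj in NEIGHBORS:
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and occupied[a, b] and not seen[a, b]:
                seen[a, b] = True
                queue.append((a, b))
    return False

def spanning_probability(p, n=60, trials=200, seed=0):
    """Monte-Carlo estimate of the spanning probability at occupation probability p."""
    rng = np.random.default_rng(seed)
    hits = sum(spans(rng.random((n, n)) < p) for _ in range(trials))
    return hits / trials

# for p in (0.40, 0.45, 0.50, 0.55, 0.60): print(p, spanning_probability(p))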