Abstract: Objective. Neural decoding is an important tool in neural engineering and neural data analysis. Of the various machine learning algorithms adopted for neural decoding, recently introduced deep learning methods are especially promising. We therefore sought to apply deep learning to decode movement trajectories from the activity of motor cortical neurons. Approach. We assessed the performance of deep learning methods in three decoding schemes: concurrent, time-delay, and spatiotemporal. In the concurrent decoding scheme, where the input to the network is the neural activity coincident with the movement, deep learning networks, including an artificial neural network (ANN) and a long short-term memory (LSTM) network, were applied to decode movement and compared with traditional machine learning algorithms. Both the ANN and the LSTM were further evaluated in the time-delay decoding scheme, in which temporal delays are allowed between neural signals and movements. Lastly, in the spatiotemporal decoding scheme, we trained a convolutional neural network (CNN) to extract movement information from images representing the spatial arrangement of neurons, their activity, and connectomes (i.e. the relative strengths of connectivity between neurons), and combined the CNN and ANN into a hybrid spatiotemporal network. To reveal the input features of the CNN in the hybrid network that deep learning discovered for movement decoding, we performed a sensitivity analysis and identified specific regions in the spatial domain. Main results. Deep learning networks (ANN and LSTM) outperformed traditional machine learning algorithms in the concurrent decoding scheme. The results of the ANN and LSTM in the time-delay decoding scheme showed that including neural data from time points preceding movement enabled the decoders to perform more robustly when the temporal relationship between neural activity and movement changes dynamically over time. In the spatiotemporal decoding scheme, the hybrid spatiotemporal network containing the concurrent ANN decoder outperformed single-network concurrent decoders. Significance. Taken together, our study demonstrates that deep learning can be a robust and effective method for the neural decoding of behavior.
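For a concrete picture of one of the decoder types named above, here is a minimal PyTorch sketch of an LSTM decoder that maps binned spike counts to two-dimensional movement velocity. The network size, data shapes, and training settings are illustrative assumptions, not the architecture or hyperparameters used in the study.

```python
# Minimal sketch of an LSTM movement decoder (assumed shapes and settings,
# not the study's exact architecture): binned spike counts -> 2D velocity.
import torch
import torch.nn as nn

class LSTMDecoder(nn.Module):
    def __init__(self, n_neurons, hidden_size=64, out_dim=2):
        super().__init__()
        self.lstm = nn.LSTM(n_neurons, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, out_dim)

    def forward(self, spikes):               # spikes: (batch, time, n_neurons)
        h, _ = self.lstm(spikes)             # hidden state at every time bin
        return self.readout(h)               # (batch, time, 2) predicted velocity

# Toy training loop on random placeholder data, just to show the fitting procedure.
n_neurons, T = 100, 50
model = LSTMDecoder(n_neurons)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
spikes = torch.randn(32, T, n_neurons)       # placeholder neural activity
velocity = torch.randn(32, T, 2)             # placeholder hand velocity
for _ in range(10):
    opt.zero_grad()
    loss = loss_fn(model(spikes), velocity)
    loss.backward()
    opt.step()
```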
Low-dimensional dynamics for working memory and time encoding
Our decisions often depend on multiple sensory experiences separated by time delays. The brain can remember these experiences and, simultaneously, estimate the timing between events. To understand the mechanisms underlying working memory and time encoding, we analyze neural activity recorded during delays in four experiments on nonhuman primates. To disambiguate potential mechanisms, we propose two analyses, namely, decoding the passage of time from neural data and computing the cumulative dimensionality of the neural trajectory over time. Time can be decoded with high precision in tasks where timing information is relevant and with lower precision when irrelevant for performing the task. Neural trajectories are always observed to be low-dimensional. In addition, our results further constrain the mechanisms underlying time encoding as we find that the linear “ramping” component of each neuron’s firing rate strongly contributes to the slow timescale variations that make decoding time possible. These constraints rule out working memory models that rely on constant, sustained activity and neural networks with high-dimensional trajectories, like reservoir networks. Instead, recurrent networks trained with backpropagation capture the time-encoding properties and the dimensionality observed in the data.
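As a rough illustration of the two proposed analyses, the following Python sketch decodes elapsed time from simulated population activity with a cross-validated linear regression and tracks the cumulative dimensionality of the trial-averaged trajectory using a participation-ratio measure. The synthetic data, the added ramping term, and the participation-ratio definition are stand-ins, not the paper's recordings or exact estimators.

```python
# Illustrative sketch (not the paper's exact estimators): decode elapsed time
# from population activity and track how the dimensionality of the neural
# trajectory grows over the delay period.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_bins, n_neurons = 200, 40, 80
rate = rng.normal(size=(n_trials, n_bins, n_neurons))      # placeholder firing rates
rate += np.linspace(0, 1, n_bins)[None, :, None]           # add a linear "ramping" component

# 1) Time decoding: regress elapsed time (bin index) from single-bin population vectors.
X = rate.reshape(-1, n_neurons)
y = np.tile(np.arange(n_bins), n_trials)
r2 = cross_val_score(Ridge(alpha=1.0), X, y, cv=5).mean()
print(f"cross-validated R^2 for decoding time: {r2:.2f}")

# 2) Cumulative dimensionality: participation ratio of the trial-averaged
#    trajectory restricted to the first t bins, for increasing t.
traj = rate.mean(axis=0)                                   # (n_bins, n_neurons)

def participation_ratio(segment):
    lam = np.linalg.eigvalsh(np.cov(segment.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

cum_dim = [participation_ratio(traj[: t + 1]) for t in range(2, n_bins)]
print("cumulative dimensionality over time:", np.round(cum_dim[-5:], 2))
```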
- Award ID(s): 1707398
- PAR ID: 10247305
- Date Published:
- Journal Name: Proceedings of the National Academy of Sciences
- Volume: 117
- Issue: 37
- ISSN: 0027-8424
- Page Range / eLocation ID: 23021 to 23032
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
In the natural environment, we often form stable perceptual experiences from ambiguous and fleeting sensory inputs. Which neural activity underlies the content of perception and which neural activity supports perceptual stability remains an open question. We used a bistable perception paradigm involving ambiguous images to behaviorally dissociate perceptual content from perceptual stability, and magnetoencephalography to measure whole-brain neural dynamics in humans. Combining multivariate decoding and neural state-space analyses, we found frequency-band-specific neural signatures that underlie the content of perception and promote perceptual stability, respectively. Across different types of images, non-oscillatory neural activity in the slow cortical potential (<5 Hz) range supported the content of perception. Perceptual stability was additionally influenced by the amplitude of alpha and beta oscillations. In addition, neural activity underlying perceptual memory, which supports perceptual stability when sensory input is temporally removed from view, also encodes elapsed time. Together, these results reveal distinct neural mechanisms that support the content versus stability of visual perception.
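A hedged sketch of what frequency-band-specific decoding can look like in practice: band-pass sensor-level data into slow (<5 Hz), alpha, and beta ranges, then decode the reported percept with a cross-validated linear classifier. The data here are random placeholders, and the band edges, filter order, and classifier are illustrative choices, not the study's pipeline.

```python
# Illustrative band-specific decoding sketch (placeholder data, assumed settings).
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
fs, n_trials, n_sensors, n_times = 200, 120, 64, 400
meg = rng.normal(size=(n_trials, n_sensors, n_times))    # placeholder sensor data
percept = rng.integers(0, 2, n_trials)                   # reported percept (A vs. B)

def bandpass(x, lo, hi):
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=-1)

bands = {"slow (<5 Hz)": (0.5, 5.0), "alpha": (8.0, 12.0), "beta": (13.0, 30.0)}
for name, (lo, hi) in bands.items():
    filtered = bandpass(meg, lo, hi)
    t = n_times // 2                                      # decode at one example time point
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          filtered[:, :, t], percept, cv=5).mean()
    print(f"{name}: decoding accuracy {acc:.2f}")
```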
Kurtz, Jurgen (Ed.) In neuroscience, delayed synaptic activity plays a pivotal and pervasive role in influencing synchronization, oscillation, and information-processing properties of neural networks. In small rhythm-generating networks, such as central pattern generators (CPGs), time-delays may regulate and determine the stability and variability of rhythmic activity, enabling organisms to adapt to environmental changes, and coordinate diverse locomotion patterns in both function and dysfunction. Here, we examine the dynamics of a three-cell CPG model in which time-delays are introduced into reciprocally inhibitory synapses between constituent neurons. We employ computational analysis to investigate the multiplicity and robustness of various rhythms observed in such multi-modal neural networks. Our approach involves deriving exhaustive two-dimensional Poincaré return maps for phase-lags between constituent neurons, where stable fixed points and invariant curves correspond to various phase-locked and phase-slipping/jitter rhythms. These rhythms emerge and disappear through various local (saddle-node, torus) and non-local (homoclinic) bifurcations, highlighting the multi-functionality (modality) observed in such small neural networks with fast inhibitory synapses.
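To make the role of synaptic time-delays concrete, here is a heavily simplified sketch: three phase oscillators with delayed, mutually inhibitory coupling, integrated with Euler steps and a history buffer, from which steady-state phase lags between cells can be read off. The phase-oscillator reduction, coupling function, and parameter values are illustrative assumptions, not the paper's conductance-based CPG model or its Poincaré return-map analysis.

```python
# Simplified three-cell rhythm generator with delayed reciprocal inhibition
# (phase-oscillator caricature; all parameters are illustrative).
import numpy as np

dt, T, delay = 0.1, 2000.0, 15.0           # ms; `delay` is the synaptic time-delay
steps, d = int(T / dt), int(delay / dt)
n = 3
W = 2.0 * (np.ones((n, n)) - np.eye(n))    # reciprocal inhibition, no self-coupling
theta = np.zeros((steps, n))               # oscillator phases (radians)
theta[0] = np.array([0.0, 2.0, 4.0])       # initial phase offsets
omega = 2 * np.pi / 100.0                  # intrinsic period of roughly 100 ms

for t in range(1, steps):
    past = theta[max(t - 1 - d, 0)]        # delayed phases seen through the synapses
    # Inhibitory coupling pushes neighbours apart (anti-phase tendency).
    coupling = (W * np.sin(past[None, :] - theta[t - 1][:, None] + np.pi)).sum(axis=1)
    theta[t] = theta[t - 1] + dt * (omega + 0.01 * coupling)

# Phase lags of cells 2 and 3 relative to cell 1 once transients decay:
lags = (theta[-1, 1:] - theta[-1, 0]) % (2 * np.pi) / (2 * np.pi)
print("steady-state phase lags (fraction of cycle):", np.round(lags, 2))
```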
Neurophysiological recordings in behaving rodents demonstrate neuronal response properties that may code space and time for episodic memory and goal-directed behaviour. Here, we review recordings from hippocampus, entorhinal cortex, and retrosplenial cortex to address the problem of how neurons encode multiple overlapping spatiotemporal trajectories and disambiguate these for accurate memory-guided behaviour. The solution could involve neurons in the entorhinal cortex and hippocampus that show mixed selectivity, coding both time and location. Some grid cells and place cells that code space also respond selectively as time cells, allowing differentiation of time intervals when a rat runs in the same location during a delay period. Cells in these regions also develop new representations that differentially code the context of prior or future behaviour allowing disambiguation of overlapping trajectories. Spiking activity is also modulated by running speed and head direction, supporting the coding of episodic memory not as a series of snapshots but as a trajectory that can also be distinguished on the basis of speed and direction. Recent data also address the mechanisms by which sensory input could distinguish different spatial locations. Changes in firing rate reflect running speed on long but not short time intervals, and few cells code movement direction, arguing against path integration for coding location. Instead, new evidence for neural coding of environmental boundaries in egocentric coordinates fits with a modelling framework in which egocentric coding of barriers combined with head direction generates distinct allocentric coding of location. The egocentric input can be used both for coding the location of spatiotemporal trajectories and for retrieving specific viewpoints of the environment. Overall, these different patterns of neural activity can be used for encoding and disambiguation of prior episodic spatiotemporal trajectories or for planning of future goal-directed spatiotemporal trajectories.
Information coding by precise timing of spikes can be faster and more energy efficient than traditional rate coding. However, spike-timing codes are often brittle, which has limited their use in theoretical neuroscience and computing applications. Here, we propose a type of attractor neural network in complex state space and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping. Building on Hebbian neural associative memories, like Hopfield networks, we first propose threshold phasor associative memory (TPAM) networks. Complex phasor patterns whose components can assume continuous-valued phase angles and binary magnitudes can be stored and retrieved as stable fixed points in the network dynamics. TPAM achieves high memory capacity when storing sparse phasor patterns, and we derive the energy function that governs its fixed-point attractor dynamics. Second, we construct two spiking neural networks to approximate the complex algebraic computations in TPAM: a reductionist model with resonate-and-fire neurons and a biologically plausible network of integrate-and-fire neurons with synaptic delays and recurrently connected inhibitory interneurons. The fixed points of TPAM correspond to stable periodic states of precisely timed spiking activity that are robust to perturbation. The link established between rhythmic firing patterns and complex attractor dynamics has implications for the interpretation of spike patterns seen in neuroscience and can serve as a framework for computation in emerging neuromorphic devices.
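The following sketch illustrates the flavor of a TPAM-style phasor memory: sparse complex phasor patterns are stored with a Hebbian outer-product rule and retrieved by iterating the complex dynamics under a simple activity constraint. The top-k activity rule, normalization, and parameter choices below are simplifications for illustration, not the exact threshold nonlinearity or energy function derived in the paper.

```python
# Rough sketch of a threshold-phasor-style associative memory (simplified):
# store sparse complex phasor patterns with a Hebbian rule, retrieve by
# iterating the complex dynamics and keeping only the strongest units active.
import numpy as np

rng = np.random.default_rng(2)
N, P, K = 256, 5, 32                        # neurons, stored patterns, active units per pattern

def sparse_phasor(N, K):
    z = np.zeros(N, dtype=complex)
    idx = rng.choice(N, K, replace=False)
    z[idx] = np.exp(1j * rng.uniform(0, 2 * np.pi, K))   # unit-magnitude phasors
    return z

patterns = np.stack([sparse_phasor(N, K) for _ in range(P)])
W = patterns.T @ patterns.conj() / K        # Hebbian storage of phase relations
np.fill_diagonal(W, 0)

def update(z, k=K):
    u = W @ z                               # complex postsynaptic sum
    out = np.zeros_like(z)
    top = np.argsort(np.abs(u))[-k:]        # simplified threshold: keep the k strongest units
    out[top] = np.exp(1j * np.angle(u[top]))
    return out

# Retrieval from a corrupted cue: overlap with the stored pattern should grow toward 1.
z = patterns[0] * (rng.random(N) < 0.7)     # drop roughly 30% of the active units
for _ in range(10):
    z = update(z)
print("overlap with stored pattern:", abs(np.vdot(patterns[0], z)) / K)
```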