

Search for: All records

Creators/Authors contains: "Bazhenov, Maxim"


  1. Abstract

    Brain rhythms of sleep reflect neuronal activity underlying sleep‐associated memory consolidation. The modulation of brain rhythms, such as the sleep slow oscillation (SO), is used both to investigate neurophysiological mechanisms and to measure the impact of sleep on presumed functional correlates. Previously, closed‐loop acoustic stimulation in humans targeted to the SO Up‐state successfully enhanced the slow oscillation rhythm and phase‐dependent spindle activity, although effects on memory retention have varied. Here, we aim to disclose relations between stimulation‐induced hippocampo‐thalamo‐cortical activity and retention performance on a hippocampus‐dependent object‐place recognition task in mice by applying acoustic stimulation at four estimated SO phases, compared with a sham condition. Across the 3‐h retention interval at the beginning of the light phase, closed‐loop stimulation failed to improve retention significantly over sham. However, retention during SO Up‐state stimulation was significantly higher than for another SO phase. At all SO phases, acoustic stimulation was accompanied by a sharp increase in ripple activity followed by an approximately second‐long suppression of hippocampal sharp wave ripples and a longer maintained suppression of thalamo‐cortical spindle activity. Importantly, the dynamics of SO‐coupled hippocampal ripple activity distinguished SO Up‐state stimulation. Non‐rapid eye movement (NREM) sleep was not impacted by stimulation, yet preREM sleep duration was affected. Results reveal the complex effect of stimulation on brain dynamics and support the use of closed‐loop acoustic stimulation in mice to investigate the inter‐regional mechanisms underlying memory consolidation.
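The closed-loop phase targeting described above can be sketched offline: band-pass filter the signal into the SO band, estimate the instantaneous phase with the Hilbert transform, and flag samples at the target phase. This is a minimal illustration, not the authors' implementation; the function name, filter order, band edges, and phase tolerance are all assumptions here, and whether 0 rad corresponds to the Up-state depends on recording polarity.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def so_phase_triggers(eeg, fs, target_phase=0.0, band=(0.5, 4.0), tol=0.1):
    """Mark samples where the estimated SO phase crosses target_phase.

    Band-pass filter into the SO band, take the Hilbert analytic phase,
    and flag samples within `tol` radians of the target phase (0 rad is
    the positive peak of the filtered signal). Returns the first sample
    of each contiguous run of hits, i.e. one trigger per SO cycle.
    """
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, eeg)))
    hits = np.where(np.abs(np.angle(np.exp(1j * (phase - target_phase)))) < tol)[0]
    if hits.size == 0:
        return hits
    return hits[np.insert(np.diff(hits) > 1, 0, True)]
```

In a real closed-loop setting the phase would have to be predicted causally from the ongoing signal; the acausal `filtfilt`/`hilbert` pair used here only works on recorded data.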

     
    Free, publicly-accessible full text available August 21, 2024
  2. Cymbalyuk, Gennady S. (Ed.)

    Cortical slow oscillations (SOs) and thalamocortical sleep spindles are two prominent EEG rhythms of slow wave sleep. These EEG rhythms play an essential role in memory consolidation. In humans, sleep spindles are categorized into slow spindles (8–12 Hz) and fast spindles (12–16 Hz), with different properties. Slow spindles that couple with the up-to-down phase of the SO require more experimental and computational investigation to disclose their origin, functional relevance and, most importantly, their relation to SOs in memory consolidation. To examine slow spindles, we propose a biophysical thalamocortical model with two independent thalamic networks (one for slow and the other for fast spindles). Our modeling results show that fast spindles lead to faster cortical cell firing and subsequently increase the amplitude of the cortical local field potential (LFP) during the SO down-to-up phase. Slow spindles also facilitate cortical cell firing, but the response is slower, thereby increasing the cortical LFP amplitude later, at the up-to-down phase of the SO cycle. Neither the SO rhythm nor the duration of the SO down state is affected by slow spindle activity. Furthermore, at a more hyperpolarized membrane potential level of fast thalamic subnetwork cells, fast spindle activity decreases, while slow spindle activity increases. Together, our model results suggest that slow spindles may facilitate the initiation of the following SO cycle without, however, affecting the expression of the SO Up and Down states.
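The spindle bands quoted above (slow 8–12 Hz, fast 12–16 Hz) can be used to label a detected spindle segment by comparing band power. A minimal sketch assuming a Welch periodogram; the function name and segment handling are illustrative and not part of the paper's biophysical model:

```python
import numpy as np
from scipy.signal import welch

SLOW_BAND = (8.0, 12.0)   # slow spindles (Hz), as defined in the abstract
FAST_BAND = (12.0, 16.0)  # fast spindles (Hz)

def classify_spindle(segment, fs):
    """Label a detected spindle segment 'slow' or 'fast' by band power."""
    f, pxx = welch(segment, fs=fs, nperseg=min(len(segment), 256))

    def bandpower(lo, hi):
        mask = (f >= lo) & (f < hi)
        return pxx[mask].sum()  # uniform bins, so the sum tracks the integral

    return "slow" if bandpower(*SLOW_BAND) >= bandpower(*FAST_BAND) else "fast"
```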

     
  3. Abstract Artificial neural networks are known to suffer from catastrophic forgetting: when learning multiple tasks sequentially, they perform well on the most recent task at the expense of previously learned tasks. In the brain, sleep is known to play an important role in incremental learning by replaying recent and old conflicting memory traces. Here we tested the hypothesis that implementing a sleep-like phase in artificial neural networks can protect old memories during new training and alleviate catastrophic forgetting. Sleep was implemented as off-line training with local unsupervised Hebbian plasticity rules and noisy input. In an incremental learning framework, sleep was able to recover old tasks that were otherwise forgotten. Previously learned memories were replayed spontaneously during sleep, forming unique representations for each class of inputs. Representational sparseness and neuronal activity corresponding to the old tasks increased while new-task-related activity decreased. The study suggests that spontaneous replay simulating sleep-like dynamics can alleviate catastrophic forgetting in artificial neural networks.
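The sleep phase described above, off-line training with a local unsupervised Hebbian rule and noisy input, can be caricatured for a single weight layer. This is a minimal sketch, not the study's network: the ReLU activation, the Oja-style decay term (swapped in here simply to keep weights bounded), the learning rate and the noise scale are all assumptions.

```python
import numpy as np

def sleep_phase(W, steps=500, eta=0.01, noise_scale=1.0, seed=0):
    """Off-line 'sleep' for one weight layer.

    Noisy input elicits activity shaped by the current weights, so
    previously learned patterns are spontaneously replayed and reinforced
    by the Hebbian term; the Oja-style decay keeps row norms bounded.
    """
    rng = np.random.default_rng(seed)
    W = W.copy()
    for _ in range(steps):
        x = rng.normal(0.0, noise_scale, W.shape[1])         # noisy input
        y = np.maximum(W @ x, 0.0)                           # ReLU activity
        W += eta * (np.outer(y, x) - (y[:, None] ** 2) * W)  # Oja-Hebbian update
    return W
```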
  4. Bush, Daniel (Ed.)
    Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
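The geometric picture above, sleep pulling the weights back toward the old task's manifold while new training pulls toward the new one until the two meet at their intersection, can be illustrated with alternating projections onto two affine "manifolds" in a toy weight space. This is a cartoon of the stated geometry, not the spiking model; the dimensions and constraint matrices are arbitrary.

```python
import numpy as np

def project_affine(w, C, d):
    """Project w onto the affine set {w : C w = d}, a cartoon 'task manifold'."""
    return w - np.linalg.pinv(C) @ (C @ w - d)

# Two random affine 'task manifolds' in a 6-dimensional weight space.
rng = np.random.default_rng(1)
C_old, d_old = rng.normal(size=(2, 6)), rng.normal(size=2)  # old-task constraints
C_new, d_new = rng.normal(size=(2, 6)), rng.normal(size=2)  # new-task constraints

w0 = rng.normal(size=6)
w = w0.copy()
for _ in range(2000):                    # interleave 'new training' and 'sleep'
    w = project_affine(w, C_new, d_new)  # training step pulls toward the new task
    w = project_affine(w, C_old, d_old)  # sleep replay pulls back to the old task
```

Alternating projections onto two affine sets with nonempty intersection converge to a point in that intersection, so the final `w` satisfies both tasks' constraints, mirroring the abstract's claim.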
  5. Skoulakis, Efthimios M. (Ed.)
    Animals are constantly bombarded with stimuli, which presents the fundamental problem of distinguishing pervasive, uninformative stimuli from novel, possibly meaningful ones. We evaluated novelty detection behaviorally in honey bees as they position their antennae differentially in an air stream carrying familiar or novel odors. We then characterized neuronal responses to familiar and novel odors in the first synaptic integration center in the brain, the antennal lobes. We found that the neurons that exhibited stronger initial responses to the odor that was to be familiarized are the same units that later distinguish familiar and novel odors, independently of chemical identity. These units, including both tentative projection neurons and local neurons, showed a decreased response to the familiar odor but an increased response to the novel odor. Our results suggest that the antennal lobe may represent the familiarity or novelty of an odor stimulus in addition to its chemical identity code. Therefore, the mechanisms for novelty detection may be present in early sensory processing, either as a result of local synaptic interaction or via feedback from higher brain centers.
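The decreased response to a familiar odor can be caricatured as per-channel habituation. A toy depression-only sketch (the function name and decay factor are assumptions; it captures only the weakening to familiar odors, not the reported response increase to novel odors, which would need an additional gain term):

```python
def habituating_response(stimuli, tau=0.8):
    """Return the response gain evoked by each odor presentation in sequence.

    Each odor channel starts at gain 1.0 and is depressed by a factor `tau`
    on every exposure, so a familiar odor evokes a progressively weaker
    response while a novel odor still evokes the full response.
    """
    gains, responses = {}, []
    for odor in stimuli:
        g = gains.get(odor, 1.0)
        responses.append(g)
        gains[odor] = g * tau
    return responses
```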
  6. Abstract Replay is the reactivation of one or more neural patterns that are similar to the activation patterns experienced during past waking experiences. Replay was first observed in biological neural networks during sleep, and it is now thought to play a critical role in memory formation, retrieval, and consolidation. Replay-like mechanisms have been incorporated in deep artificial neural networks that learn over time to avoid catastrophic forgetting of previous knowledge. Replay algorithms have been successfully used in a wide range of deep learning methods within supervised, unsupervised, and reinforcement learning paradigms. In this letter, we provide the first comprehensive comparison between replay in the mammalian brain and replay in artificial neural networks. We identify multiple aspects of biological replay that are missing in deep learning systems and hypothesize how they could be used to improve artificial neural networks. 
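In deep learning, the replay-like mechanisms mentioned above are commonly realized as an experience replay buffer that mixes stored old examples into new training batches. A minimal reservoir-sampling sketch (illustrative only; the letter does not prescribe this particular data structure):

```python
import random

class ReplayBuffer:
    """Fixed-capacity replay buffer using reservoir sampling, which keeps
    a uniform random subset of everything seen so far."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.items = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = self.rng.randrange(self.seen)  # replace a random slot with
            if j < self.capacity:              # probability capacity / seen
                self.items[j] = example

    def sample(self, k):
        """Draw k stored examples to interleave with the current batch."""
        return self.rng.sample(self.items, min(k, len(self.items)))
```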
  7. null (Ed.)