Habituation and sensitization represent nonassociative learning mechanisms in both non-neural and neural organisms. They are essential for a range of functions, from survival to adaptation in dynamic environments. Hardware design for neuroinspired computing strives to emulate such features under electric bias, and the same features can be incorporated into neural network algorithms. Herein, cellular-like learning in oxygen-deficient NiOx devices is demonstrated. Both habituation learning and a sensitization response can be achieved in a single device simply by controlling the magnitude of the electric field. Spontaneous memory relaxation and dynamic redistribution of oxygen vacancies under electric bias enable this learning behavior of NiOx under sequential training. These characteristics are implemented in simple device arrays to learn alphabets and to demonstrate simulated algorithmic use cases in digit recognition. Transition metal oxides with carefully prepared defect concentrations can be highly sensitive to electronic-structure perturbations under moderate electrical stimulus and can serve as building blocks for next-generation neuroinspired computing hardware.
- Award ID(s): 1904097
- PAR ID: 10407100
- Date Published:
- Journal Name: Proceedings of the National Academy of Sciences
- Volume: 118
- Issue: 39
- ISSN: 0027-8424
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
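The bias-magnitude-dependent switch between habituation and sensitization reported for the NiOx devices above can be pictured with a toy model. Everything here is an assumption for illustration: the scalar state variable, the field threshold, and all parameter values are invented, not taken from the paper.

```python
import math

def run_pulses(n_pulses, field, g0=1.0, alpha=0.2, beta=0.15,
               tau=5.0, gap=1.0, threshold=2.0):
    """Toy response of a device to a train of identical bias pulses.

    Below `threshold`, each pulse depresses the internal state
    (habituation); above it, each pulse potentiates the state
    (sensitization). Between pulses the state relaxes spontaneously
    toward its resting value g0 with time constant tau, mimicking
    spontaneous memory relaxation. All numbers are illustrative.
    """
    g = g0
    responses = []
    for _ in range(n_pulses):
        responses.append(g * field)                  # response to this pulse
        if field < threshold:
            g -= alpha * (g - 0.2)                   # habituation toward a floor
        else:
            g += beta * (3.0 - g)                    # sensitization toward a ceiling
        g += (g0 - g) * (1 - math.exp(-gap / tau))   # spontaneous relaxation
    return responses

weak = run_pulses(10, field=1.0)    # habituating: successive responses shrink
strong = run_pulses(10, field=3.0)  # sensitizing: successive responses grow
```

The single `field` argument selecting between the two regimes mirrors the paper's claim that one device exhibits both behaviors depending only on the magnitude of the applied field.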
How animals respond to repeatedly applied stimuli, and to mechanical stimuli in particular, are important questions in behavioral neuroscience. We study adaptation to repeated mechanical agitation in the Drosophila larva. Vertical vibration stimuli elicit a discrete set of responses in crawling larvae: continuation, pause, turn, and reversal. Through high-throughput larva tracking, we characterize how the likelihood of each response depends on vibration intensity and on the timing of repeated vibration pulses. By examining transitions between behavioral states at the population and individual levels, we investigate how the animals habituate to the stimulus patterns. We identify time constants associated with desensitization to prolonged vibration, with re-sensitization during removal of a stimulus, and with additional layers of habituation that operate in the overall response. Known memory-deficient mutants exhibit distinct behavior profiles and habituation time constants. An analogous simple electrical circuit suggests possible neural and molecular processes behind the adaptive behavior.
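The desensitization and re-sensitization time constants identified in the larva study can be sketched with a minimal two-exponential model. This is a sketch under assumed parameter values; the actual constants and functional form come from fits in the paper, not from this code.

```python
import math

def response_prob(t_on, t_off, p0=0.9, p_floor=0.2, tau_d=10.0, tau_r=30.0):
    """Toy habituation model with two time constants (illustrative values).

    Response probability decays toward p_floor with time constant tau_d
    during t_on seconds of sustained vibration (desensitization), then
    recovers toward the naive level p0 with time constant tau_r during
    t_off seconds without the stimulus (re-sensitization).
    """
    p_end = p_floor + (p0 - p_floor) * math.exp(-t_on / tau_d)  # after stimulation
    return p0 + (p_end - p0) * math.exp(-t_off / tau_r)         # after recovery

habituated = response_prob(t_on=30.0, t_off=0.0)    # strongly desensitized
recovered = response_prob(t_on=30.0, t_off=120.0)   # mostly re-sensitized
```

Fitting tau_d and tau_r separately to stimulation and rest epochs is one way such time constants can be extracted from population response curves; mutant lines would then show up as shifted fitted constants.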
Point defects, such as oxygen vacancies, control the physical properties of complex oxides relevant to active areas of research, from superconductivity to resistive memory to catalysis. In most oxide semiconductors, electrons associated with oxygen vacancies occupy the conduction band, increasing the electrical conductivity. Here we demonstrate, in contrast, that in the correlated-electron perovskite rare-earth nickelates RNiO3 (R is a rare-earth element such as Sm or Nd), electrons associated with oxygen vacancies strongly localize, decreasing the electrical conductivity by several orders of magnitude. This unusual behavior stems from the combination of crystal field splitting and filling-controlled Mott–Hubbard electron–electron correlations in the Ni 3d orbitals. Furthermore, we show that the distribution of oxygen vacancies in NdNiO3 can be controlled via an electric field, leading to analog resistance-switching behavior. This study demonstrates the potential of nickelates as testbeds for better understanding emergent physics in oxide heterostructures, as well as candidate systems in the emerging field of artificial intelligence.
Rainey, Larry B.; Holland, O. Thomas (Eds.) Biological neural networks offer some of the most striking and complex examples of emergence ever observed in natural or man-made systems. Individually, the behavior of a single neuron is rather simple, yet these basic building blocks are connected through synapses to form neural networks capable of sophisticated functions such as pattern recognition and navigation. Lower-level functionality provided by a given network is combined with other networks to produce more sophisticated capabilities. These capabilities manifest emergently at two vastly different, yet interconnected, time scales. At the time scale of neural dynamics, neural networks are responsible for turning noisy external stimuli and internal signals into signals capable of supporting complex computations. A key component in this process is the structure of the network, which itself forms emergently, over much longer time scales, from the outputs of its constituent neurons, a process called learning. The analysis and interpretation of the behavior of these interconnected dynamical systems of neurons should account for the network structure and the collective behavior of the network. The field of graph signal processing (GSP) combines signal processing with network science to study signals defined on irregular network structures. Here, we show that GSP can be a valuable tool in the analysis of emergence in biological neural networks. Beyond purely scientific pursuits, understanding emergence in biological neural networks directly impacts the design of more effective artificial neural networks for machine learning and artificial intelligence tasks across domains, and motivates additional design motifs for novel emergent systems of systems.
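The core GSP machinery mentioned above can be shown in a few lines: the graph Fourier basis is the eigenvector set of the graph Laplacian, and smooth signals on the graph concentrate in its low-frequency modes. The 4-node graph here is a toy assumption, not any network from the chapter.

```python
import numpy as np

# Adjacency matrix of a small connected 4-node network (toy example)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A      # combinatorial graph Laplacian L = D - A
evals, evecs = np.linalg.eigh(L)    # graph "frequencies" and Fourier basis

x = np.ones(4)                      # a perfectly smooth (constant) graph signal
x_hat = evecs.T @ x                 # graph Fourier transform of the signal
# All of x's energy sits in the zero-frequency (lambda = 0) mode, the graph
# analogue of the DC component in classical Fourier analysis.
```

For neural data, `x` would be a vector of per-neuron activity and `A` the measured connectivity; the spread of `x_hat` across modes then quantifies how aligned the activity is with the network structure.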
Adaptive "life-long" learning at the edge and during online task performance is an aspirational goal of artificial intelligence research. Neuromorphic hardware implementing spiking neural networks (SNNs) is particularly attractive in this regard, as its real-time, event-based, local computing paradigm makes it suitable for edge implementations and fast learning. However, the long and iterative learning that characterizes state-of-the-art SNN training is incompatible with the physical nature and real-time operation of neuromorphic hardware. Bi-level learning, such as meta-learning, is increasingly used in deep learning to overcome these limitations. In this work, we demonstrate gradient-based meta-learning in SNNs using the surrogate gradient method, which approximates the spiking threshold function for gradient estimation. Because surrogate gradients can be made twice differentiable, well-established and effective second-order meta-learning methods such as model-agnostic meta-learning (MAML) can be used. We show that SNNs meta-trained using MAML perform comparably to conventional artificial neural networks meta-trained with MAML on event-based meta-datasets. Furthermore, we demonstrate the specific advantages that accrue from meta-learning: fast learning without the requirement of high-precision weights or gradients, training to learn with quantization, and mitigating the effects of approximate synaptic plasticity rules. Our results emphasize how meta-learning techniques can become instrumental for deploying neuromorphic learning technologies on real-world problems.
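The surrogate gradient method named in this abstract can be sketched generically: the forward pass keeps the non-differentiable Heaviside spike function, while the backward pass substitutes the derivative of a smooth surrogate so gradients can flow through the threshold. The fast-sigmoid surrogate below is a common choice and an assumption here; the paper's exact surrogate and implementation may differ.

```python
import numpy as np

def spike_forward(v, v_th=1.0):
    """Forward pass: the hard, non-differentiable Heaviside spike function."""
    return (v >= v_th).astype(float)

def spike_surrogate_grad(v, v_th=1.0, slope=10.0):
    """Backward pass: derivative of a fast-sigmoid surrogate.

    d/dv [fast-sigmoid] = 1 / (1 + slope * |v - v_th|)^2, which is smooth,
    peaks at the threshold, and is itself differentiable, enabling the
    second-order gradients that MAML-style meta-learning requires.
    """
    return 1.0 / (1.0 + slope * np.abs(v - v_th)) ** 2

v = np.array([0.2, 0.9, 1.0, 1.5])   # membrane potentials of four neurons
spikes = spike_forward(v)            # binary spike outputs
grads = spike_surrogate_grad(v)      # pseudo-derivatives, peaked at v = v_th
```

In a full SNN training loop the surrogate derivative replaces the true (zero-almost-everywhere) derivative of `spike_forward` during backpropagation, while the forward computation remains genuinely spiking.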