Neuroscience-Inspired Artificial Intelligence

Developing next-generation learning algorithms that draw inspiration from biological nervous systems to revolutionize intelligent systems

Our Contribution

Researchers in APL’s Intelligent Systems Center (ISC) are extracting and translating design principles of neural connectivity and function from multiple species to create the next generation of robust, efficient intelligent systems operating in the real world.


Novel Neural Network Architectures Informed by Connectomes

Biological and artificial neural networks. Cell morphology from the heading direction circuit in the central complex of the fruit fly (left); realization of this biological circuit as an artificial recurrent neural network (right).

Connectomes (detailed maps of neural connections in biological brains) can inform novel artificial recurrent neural network architectures, as demonstrated by the ISC's work modeling the heading-direction circuit of the fruit fly. ISC researchers have successfully tested fruit-fly-inspired algorithms on robotic platforms.
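One common way to realize a biological circuit as an artificial recurrent network is to constrain the trainable weights to the synapses present in the connectome. The following is a minimal sketch of that idea, assuming a small randomly generated adjacency matrix in place of real reconstructed circuit data; the matrix, sizes, and `rnn_step` helper are all illustrative inventions, not APL's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical connectome: a binary adjacency matrix A, where A[i, j] = 1
# means a synapse exists from neuron j onto neuron i. Here it is random;
# in practice it would come from reconstructed circuit data.
n = 8
A = (rng.random((n, n)) < 0.3).astype(float)

# Weights are masked by the biological wiring: entries where the
# connectome has no synapse are clamped to zero.
W = rng.normal(0.0, 0.5, size=(n, n)) * A
W_in = rng.normal(0.0, 0.5, size=(n, 2))

def rnn_step(h, x, W, W_in):
    """One update of a recurrent network restricted to connectome edges."""
    return np.tanh(W @ h + W_in @ x)

h = np.zeros(n)
h = rnn_step(h, np.array([1.0, 0.0]), W, W_in)

# Zero entries of A stay zero in W, so the network can only use
# connections present in the biological circuit.
assert np.all(W[A == 0] == 0)
```

During training, the same mask would be reapplied after each gradient update so that learning never introduces connections absent from the biological circuit.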

The ISC plays a key role hosting large-scale neuroscience datasets for the community, focusing in particular on nanoscale connectomes. Teams from the ISC also develop open-source analysis tools for extracting patterns from these datasets, such as repeated structural patterns, or motifs. In addition to extracting insight from the neuron-synapse connectome of fruit flies, ISC researchers extrapolate insights about neural connectivity from a range of species—including humans—and scale these insights to improve network robustness and performance.
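Motif extraction of the kind described above amounts to counting small recurring connectivity patterns in a directed graph. Below is a simplified sketch, assuming a toy connectivity matrix where `A[i, j] = 1` means an edge from neuron i to neuron j; it counts two textbook motifs (reciprocal pairs and feed-forward triads) and is not one of the ISC's open-source tools.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(1)

# Toy directed connectivity matrix standing in for a connectome graph.
n = 6
A = (rng.random((n, n)) < 0.4).astype(int)
np.fill_diagonal(A, 0)  # no self-loops

# Motif 1: reciprocally connected pairs (i -> j and j -> i).
# A * A.T marks both directions of each pair, so divide by two.
reciprocal_pairs = int(np.sum(A * A.T) // 2)

# Motif 2: feed-forward triads (i -> j, j -> k, i -> k, with no
# reciprocal edges). Each triad has a unique source, middle, and sink,
# so iterating over ordered triples counts each one exactly once.
feedforward_triads = 0
for i, j, k in permutations(range(n), 3):
    if A[i, j] and A[j, k] and A[i, k]:
        if not (A[j, i] or A[k, j] or A[k, i]):
            feedforward_triads += 1
```

Comparing such counts against randomized graphs with the same degree distribution is the standard way to decide whether a motif is over-represented in the biological circuit.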

Brian Robinson discusses “Informing generative replay for continual learning with long-term memory formation in the fruit fly.”

Neuroscience-Inspired Learning for Artificial Neural Networks

Artificial sleep in neural networks. By replicating the cellular processes of sleep in an artificial neural network (A), we were able to decrease “catastrophic forgetting” of image labels learned in the distant past while preserving the ability to learn new information (B, C).

Building on the ISC’s expertise in sleep research, researchers at APL are translating the benefits of biological sleep, which is ubiquitous across intelligent organisms, to artificial neural networks. This work has improved methods for overcoming catastrophic forgetting, a key obstacle to continual learning in classification networks. APL has also developed an efficient generative-replay approach for continual learning based on fruit-fly memory consolidation, and is investigating replay mechanisms for acquiring semantic knowledge through associative learning. Additionally, APL has built complex, robust, and adaptive swarming robotic controllers based on neural dynamics.
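The core mechanic of generative replay is simple: while training on a new task, each batch interleaves fresh data with pseudo-samples of earlier tasks drawn from a generative model, so old knowledge keeps being rehearsed. The sketch below illustrates only that batch-mixing loop; the "generator" is a stand-in that samples from stored Gaussian summaries of old classes, an assumption for illustration rather than APL's fruit-fly-inspired consolidation mechanism.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in generator state: per-class means for previously learned
# classes 0 and 1 (a real system would use a trained generative model).
old_class_means = {0: np.array([-2.0, 0.0]), 1: np.array([2.0, 0.0])}

def replay_samples(k):
    """Draw k pseudo-samples of previously learned classes."""
    labels = rng.choice(list(old_class_means), size=k)
    data = np.stack([old_class_means[y] + rng.normal(0, 0.3, 2) for y in labels])
    return data, labels

def make_batch(new_x, new_y, replay_frac=0.5):
    """Interleave new-task data with generated replay of old tasks."""
    k = int(len(new_x) * replay_frac)
    rx, ry = replay_samples(k)
    return np.concatenate([new_x, rx]), np.concatenate([new_y, ry])

# New task: 8 examples of a previously unseen class 2.
new_x = rng.normal([0.0, 3.0], 0.3, size=(8, 2))
new_y = np.full(8, 2)
x, y = make_batch(new_x, new_y)  # 12 examples: 8 new + 4 replayed
```

Because the classifier sees old and new classes together in every batch, gradient updates for the new task no longer overwrite the decision boundaries learned earlier, which is what produces the reduction in catastrophic forgetting shown in the figure.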

Related Publications

  • Robinson, B. S., C. W. Lau, A. New, S. M. Nichols, E. C. Johnson, M. Wolmetz, and W. G. Coon, “Continual learning benefits from multiple sleep stages: NREM, REM, and synaptic downscaling,” 2022 International Joint Conference on Neural Networks (IJCNN), pp. 1–9, IEEE (July 2022).
  • Robinson, B., A. Polevoy, S. McDaniel, W. Coon, C. Scholl, M. McLean, and E. Johnson, “A spiking network model for semantic representation and replay-based association acquisition,” International Conference on Neuromorphic Systems 2021, pp. 1–8 (July 2021).
  • Robinson, B. S., J. Joyce, R. Norman-Tenazas, G. K. Vallabha, and E. C. Johnson, “Informing generative replay for continual learning with long-term memory formation in the fruit fly,” NeurIPS MemARI Workshop 2022 (2023).
  • Monaco, J. D., G. M. Hwang, K. M. Schultz, and K. Zhang, “Cognitive swarming in complex environments with attractor dynamics and oscillatory computing,” Biological Cybernetics, vol. 114, no. 2, pp. 269–284 (2020).