Continual learning benefits from multiple sleep mechanisms: NREM, REM, and Synaptic Downscaling

09/09/2022
by   Brian S. Robinson, et al.

Learning new tasks and skills in succession without losing prior learning (i.e., catastrophic forgetting) is a computational challenge for both artificial and biological neural networks, yet artificial systems struggle to achieve parity with their biological analogues. Mammalian brains employ numerous neural operations in support of continual learning during sleep. These are ripe for artificial adaptation. Here, we investigate how modeling three distinct components of mammalian sleep together affects continual learning in artificial neural networks: (1) a veridical memory replay process observed during non-rapid eye movement (NREM) sleep; (2) a generative memory replay process linked to REM sleep; and (3) a synaptic downscaling process that has been proposed to tune signal-to-noise ratios and support neural upkeep. We find benefits from the inclusion of all three sleep components when evaluating performance on a continual learning CIFAR-100 image classification benchmark. Maximum accuracy improved during training, and catastrophic forgetting was reduced during later tasks. While some catastrophic forgetting persisted over the course of network training, higher levels of synaptic downscaling led to better retention of early tasks and further facilitated the recovery of early task accuracy during subsequent training. One key takeaway is that there is a trade-off in choosing the level of synaptic downscaling: more aggressive downscaling better protects early tasks, but less downscaling enhances the ability to learn new tasks. Intermediate levels can strike a balance, yielding the highest overall accuracies during training. Overall, our results both provide insight into how to adapt sleep components to enhance artificial continual learning systems and highlight areas for future neuroscientific sleep research to further such systems.
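The abstract does not specify how synaptic downscaling is implemented. As a minimal, hypothetical sketch (the function name, the multiplicative form, and the `factor` parameter are assumptions, not the paper's method), a between-task downscaling step could be modeled as uniformly scaling all weights toward zero, where a smaller factor corresponds to more aggressive downscaling:

```python
import numpy as np

def downscale_weights(weights, factor):
    """Hypothetical synaptic downscaling: multiply every weight
    matrix by a constant factor in [0, 1] between tasks.

    A factor closer to 0 is more aggressive downscaling (better
    protection of early tasks); a factor closer to 1 preserves
    more plasticity for learning new tasks.
    """
    return {name: w * factor for name, w in weights.items()}

# Toy example: one layer's weights, downscaled by an intermediate factor.
weights = {"layer1": np.array([[1.0, -2.0], [0.5, 0.0]])}
scaled = downscale_weights(weights, factor=0.8)
print(scaled["layer1"])
```

In this framing, the trade-off the authors describe becomes a single tunable hyperparameter, with intermediate values balancing retention of old tasks against acquisition of new ones.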


research
02/20/2018

Continual Reinforcement Learning with Complex Synapses

Unlike humans, who are capable of continual learning over their lifetime...
research
11/13/2020

Continual Learning with Deep Artificial Neurons

Neurons in real brains are enormously complex computational units. Among...
research
10/28/2020

A Study on Efficiency in Continual Learning Inspired by Human Learning

Humans are efficient continual learning systems; we continually learn ne...
research
03/13/2017

Continual Learning Through Synaptic Intelligence

While deep learning has led to remarkable advances across diverse applic...
research
08/26/2021

Continual learning under domain transfer with sparse synaptic bursting

Existing machines are functionally specific tools that were made for eas...
research
11/13/2022

NREM and REM: cognitive and energetic gains in thalamo-cortical sleeping and awake spiking model

Sleep is essential for learning and cognition, but the mechanisms by whi...
research
06/08/2022

SYNERgy between SYNaptic consolidation and Experience Replay for general continual learning

Continual learning (CL) in the brain is facilitated by a complex set of ...
