Improving Performance in Continual Learning Tasks using Bio-Inspired Architectures

08/08/2023 · by Sandeep Madireddy, et al.

The ability to learn continuously from an incoming data stream without catastrophic forgetting is critical to designing intelligent systems. Many approaches to continual learning rely on stochastic gradient descent and its variants, which employ global error updates and hence must adopt strategies such as memory buffers or replay to work around their limitations in stability, greediness, and short-term memory. To address these limitations, we have developed a biologically inspired, lightweight neural network architecture that incorporates synaptic plasticity mechanisms and neuromodulation, and thus learns through local error signals to enable online continual learning without stochastic gradient descent. Our approach yields superior online continual learning performance on the Split-MNIST, Split-CIFAR-10, and Split-CIFAR-100 datasets compared with other memory-constrained learning approaches, and matches that of state-of-the-art memory-intensive replay-based approaches. We further demonstrate the effectiveness of our approach by integrating key design concepts into other backpropagation-based continual learning algorithms, significantly improving their accuracy. Our results provide compelling evidence for the importance of incorporating biological principles into machine learning models and offer insights into how to leverage them to design more efficient and robust systems for online continual learning.
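To give a concrete feel for learning from local error signals gated by neuromodulation, rather than backpropagated global gradients, here is a minimal sketch of a three-factor plasticity rule. The layer sizes, the tanh activation, and the `neuromod` scalar below are illustrative assumptions for a single-layer toy model, not the paper's actual architecture or update rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-layer setup (sizes chosen arbitrarily for illustration).
n_in, n_out = 784, 10
W = 0.01 * rng.standard_normal((n_out, n_in))

def local_plasticity_step(x, y_target, W, lr=0.05):
    """One online update using only locally available signals.

    A scalar neuromodulatory signal gates a Hebbian-style weight change,
    so no global backpropagated gradient is required.
    """
    y = np.tanh(W @ x)                # post-synaptic activity
    error = y_target - y              # local, per-neuron error signal
    neuromod = np.linalg.norm(error)  # scalar "surprise" acting as a modulator
    # Three-factor rule: pre-synaptic x, post-synaptic error, modulatory gate.
    W += lr * neuromod * np.outer(error, x)
    return W

# Online stream: each sample is seen once and triggers an immediate update,
# with no memory buffer or replay.
for _ in range(100):
    x = rng.standard_normal(n_in)
    y_target = np.eye(n_out)[rng.integers(n_out)]
    W = local_plasticity_step(x, y_target, W)
```

Because each update depends only on the pre-synaptic input, the post-synaptic error, and a broadcast modulatory scalar, it can be applied sample by sample on a stream, which is the property the abstract highlights for online continual learning.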

