Lifelong Neural Predictive Coding: Sparsity Yields Less Forgetting when Learning Cumulatively

05/25/2019
by   Alexander Ororbia, et al.

In lifelong learning systems, especially those based on artificial neural networks, one of the biggest obstacles is the severe inability to retain old knowledge as new information is encountered. This phenomenon is known as catastrophic forgetting. In this paper, we present a new connectionist model, the Sequential Neural Coding Network, and its learning procedure, grounded in the neurocognitive theory of predictive coding. The architecture experiences significantly less forgetting than standard neural models and outperforms a variety of previously proposed remedies when trained across multiple task datasets in a stream-like fashion. The promising performance demonstrated in our experiments suggests that directly incorporating mechanisms prominent in real neuronal systems, such as competition, sparse activation patterns, and iterative input processing, can open viable pathways for tackling the challenge of lifelong machine learning.
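To make the mechanisms named in the abstract concrete, here is a minimal sketch of predictive-coding-style iterative inference combined with k-winners-take-all sparsity. This is a generic illustration of the two ingredients (iterative error-driven state updates and competitive, sparse activations), not the paper's actual Sequential Neural Coding Network; the function names, the single-layer setup, and all parameter values are assumptions for illustration only.

```python
import numpy as np

def kwta(z, k):
    """Keep the k largest activations, zero the rest (competition / sparsity)."""
    out = np.zeros_like(z)
    idx = np.argsort(z)[-k:]
    out[idx] = z[idx]
    return out

def iterative_inference(x, W, k=5, steps=20, lr=0.1):
    """Iteratively settle a latent code z so that W @ z predicts the input x.

    Each step: compute the prediction error, nudge z along the gradient of
    the squared error, then apply k-winners-take-all to keep the code sparse.
    """
    z = np.zeros(W.shape[1])
    for _ in range(steps):
        error = x - W @ z            # prediction error: input minus top-down prediction
        z = z + lr * (W.T @ error)   # error-driven update of the latent state
        z = kwta(z, k)               # competition: only k units remain active
    return z, error

rng = np.random.default_rng(0)
W = rng.standard_normal((16, 32)) * 0.1  # random generative weights (illustration only)
x = rng.standard_normal(16)              # a single input vector
z, err = iterative_inference(x, W)
print(np.count_nonzero(z))               # at most k = 5 units are active
```

The intuition, in line with the abstract, is that sparse competitive codes assign largely non-overlapping sets of units to different tasks, so updating the weights for a new task disturbs fewer of the units that encode older tasks.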


Related research

- 08/07/2017: Measuring Catastrophic Forgetting in Neural Networks. "Deep neural networks are used in many state-of-the-art systems for machi..."
- 02/22/2018: Overcoming Catastrophic Forgetting in Convolutional Neural Networks by Selective Network Augmentation. "Lifelong learning aims to develop machine learning systems that can lear..."
- 02/15/2021: Does Standard Backpropagation Forget Less Catastrophically Than Adam? "Catastrophic forgetting remains a severe hindrance to the broad applicat..."
- 12/21/2013: An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks. "Catastrophic forgetting is a problem faced by many machine learning mode..."
- 11/21/2021: Learning by Active Forgetting for Neural Networks. "Remembering and forgetting mechanisms are two sides of the same coin in ..."
- 02/01/2022: Fortuitous Forgetting in Connectionist Networks. "Forgetting is often seen as an unwanted characteristic in both human and..."
- 01/07/2020: Intrinsic Motivation and Episodic Memories for Robot Exploration of High-Dimensional Sensory Spaces. "This work presents an architecture that generates curiosity-driven goal-..."
