Beneficial perturbation network for continual learning

06/22/2019
by Shixian Wen et al.

Sequential learning of multiple tasks in artificial neural networks using gradient descent leads to catastrophic forgetting, whereby previously learned knowledge is erased while new, disjoint knowledge is acquired. Here, we propose a fundamentally new type of method: the Beneficial Perturbation Network (BPN). We add task-dependent memory (biasing) units that allow the network to operate in different regimes for different tasks. We compute the most beneficial directions in which to train these units, in a manner inspired by recent work on adversarial examples. At test time, the beneficial perturbations for a given task bias the network toward that task, overcoming catastrophic forgetting. BPN is not only more parameter-efficient than network-expansion methods, but also, unlike episodic-memory methods, requires no storage of data from previous tasks. Experiments on variants of the MNIST, CIFAR-10, and CIFAR-100 datasets demonstrate strong performance of BPN compared to the state of the art.
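The abstract leaves the mechanism implicit, so here is a minimal PyTorch sketch of the idea as described: per-task bias ("memory") units attached to a layer, updated along the sign of the negative loss gradient, i.e., an FGSM-style step with the sign flipped so the perturbation decreases rather than increases the loss. The names (BPNLayer, task_bias, beneficial_update) and the exact update rule are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BPNLayer(nn.Module):
    """Linear layer augmented with task-dependent bias (memory) units.

    Hypothetical sketch: one bias vector per task; only the active
    task's vector is added to the layer's pre-activation.
    """
    def __init__(self, in_dim, out_dim, n_tasks):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)
        self.task_bias = nn.Parameter(torch.zeros(n_tasks, out_dim))

    def forward(self, x, task_id):
        # The selected bias vector shifts the network into the
        # operating regime associated with task_id.
        return F.relu(self.fc(x) + self.task_bias[task_id])

def beneficial_update(layer, task_id, loss, lr=0.01):
    """FGSM-inspired 'beneficial' step (an assumed update rule):
    move along the *negative* sign of the gradient so the
    perturbation reduces the loss instead of increasing it."""
    grad, = torch.autograd.grad(loss, layer.task_bias, retain_graph=True)
    with torch.no_grad():
        layer.task_bias[task_id] -= lr * grad[task_id].sign()
```

In this reading, the shared weights are trained normally while the per-task bias units absorb task identity; at test time, passing the same task_id selects the stored bias vector and nudges activations back toward that task's regime, without replaying or storing any data from previous tasks.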

