Conditional Computation for Continual Learning

06/16/2019
by Min Lin, et al.

Catastrophic forgetting in connectionist neural networks is caused by the global sharing of parameters among all training examples. In this study, we analyze parameter sharing under the conditional computation framework, where the parameters of a neural network are conditioned on each input example. At one extreme, if each input example uses a disjoint set of parameters, there is no parameter sharing and thus no catastrophic forgetting. At the other extreme, if the parameters are the same for every example, the model reduces to a conventional neural network. We then introduce a clipped version of maxout networks that lies in the middle, i.e., parameters are shared only partially among examples. Based on this parameter-sharing analysis, we can locate the limited set of stored examples that are interfered with when learning a new example. We propose to rehearse on this set to prevent forgetting, a strategy we term conditional rehearsal. Finally, we demonstrate the effectiveness of the proposed method in an online non-stationary setup, where an update is made after each new example and the distribution of incoming examples shifts over time.
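To make the mechanism concrete, below is a minimal sketch of the two ideas in the abstract: a maxout layer whose per-unit winning pieces determine which parameters an example touches, and a rehearsal buffer that retrieves exactly the stored examples whose winners overlap with a new example's. It is written in PyTorch; the names and the training loop are assumptions for illustration, not the paper's implementation, and the clipping the paper applies to maxout is omitted.

```python
# Illustrative sketch, not the paper's code: MaxoutLayer, RehearsalBuffer,
# and online_step are hypothetical names; maxout clipping is omitted.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaxoutLayer(nn.Module):
    """Maxout: each output unit takes the max over k linear pieces.
    Only the winning piece receives gradient for a given input, so two
    examples share parameters in this layer only at units where they
    select the same piece."""

    def __init__(self, d_in, d_out, k):
        super().__init__()
        self.d_out, self.k = d_out, k
        self.linear = nn.Linear(d_in, d_out * k)

    def forward(self, x):
        z = self.linear(x).view(-1, self.d_out, self.k)
        out, winners = z.max(dim=-1)  # winners: index of the active piece per unit
        return out, winners


class RehearsalBuffer:
    """Stores past examples with the pieces they activated. Given the
    winners of a new example, returns the indices of stored examples
    whose active parameters overlap -- the only ones the coming update
    can interfere with."""

    def __init__(self):
        self.examples, self.targets, self.winners = [], [], []

    def add(self, x, y, w):
        self.examples.append(x)
        self.targets.append(y)
        self.winners.append(w)

    def interfered(self, w_new):
        return [i for i, w in enumerate(self.winners)
                if (w == w_new).any()]  # any unit where the same piece wins


def online_step(model, opt, buffer, x, y):
    """One online update: learn the new example while rehearsing only
    the stored examples it interferes with."""
    with torch.no_grad():
        _, winners = model(x)  # route the new example to its pieces
    idx = buffer.interfered(winners)
    batch_x = torch.cat([x] + [buffer.examples[i] for i in idx])
    batch_y = torch.cat([y] + [buffer.targets[i] for i in idx])
    logits, _ = model(batch_x)
    loss = F.cross_entropy(logits, batch_y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    buffer.add(x, y, winners)


# Hypothetical usage on a stream of (x, y) pairs, x: (1, 784), y: (1,):
#   model = MaxoutLayer(d_in=784, d_out=10, k=4)
#   opt = torch.optim.SGD(model.parameters(), lr=0.1)
#   buffer = RehearsalBuffer()
#   for x, y in stream:
#       online_step(model, opt, buffer, x, y)
```

Because a gradient step touches only the winning pieces, stored examples with no shared winners are untouched by the update; in this sketch, that is what makes rehearsing on the overlapping subset alone sufficient.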


