Continual Learning by Modeling Intra-Class Variation

10/11/2022
by Longhui Yu, et al.

It has been observed that neural networks perform poorly when data or tasks are presented sequentially. Unlike humans, neural networks suffer severely from catastrophic forgetting, which makes lifelong learning impractical. To address this issue, memory-based continual learning has been actively studied and stands out as one of the best-performing approaches. We examine memory-based continual learning and identify that large variation in the representation space is crucial for avoiding catastrophic forgetting. Motivated by this, we propose to diversify representations with two types of perturbations: model-agnostic variation (i.e., variation generated without knowledge of the learned neural network) and model-based variation (i.e., variation conditioned on the learned neural network). We demonstrate that enlarging representational variation serves as a general principle for improving continual learning. Finally, we present empirical studies showing that our method, as a simple plug-and-play component, consistently improves a number of memory-based continual learning methods by a large margin.
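The abstract only sketches the two perturbation types, so the snippet below is a minimal, hypothetical illustration (not the authors' implementation) of how a rehearsal-based learner might diversify the representations of memory samples: Gaussian noise supplies a model-agnostic variation, while a gradient step on the representation supplies a model-based one. All names here (backbone, head, perturbed_replay_loss, noise_std, step_size) are invented for this sketch.

```python
# Illustrative sketch only: inject "model-agnostic" and "model-based" variation
# into the representations of replay-memory samples. Hyperparameters and the
# exact perturbation construction are assumptions, not the paper's method.
import torch
import torch.nn as nn
import torch.nn.functional as F

backbone = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 128), nn.ReLU())
head = nn.Linear(128, 10)
optimizer = torch.optim.SGD(
    list(backbone.parameters()) + list(head.parameters()), lr=0.03
)

def perturbed_replay_loss(mem_x, mem_y, noise_std=0.1, step_size=0.5):
    """Cross-entropy on memory samples whose representations are diversified."""
    z = backbone(mem_x)  # representations of memory samples

    # Model-agnostic variation: random noise, independent of the learned network.
    z_agnostic = z + noise_std * torch.randn_like(z)

    # Model-based variation: move each representation along the gradient of the
    # classification loss w.r.t. the representation (conditioned on the network).
    z_probe = z.detach().requires_grad_(True)
    probe_loss = F.cross_entropy(head(z_probe), mem_y)
    grad = torch.autograd.grad(probe_loss, z_probe)[0]
    z_based = z + step_size * F.normalize(grad, dim=1)

    logits = torch.cat([head(z_agnostic), head(z_based)], dim=0)
    targets = torch.cat([mem_y, mem_y], dim=0)
    return F.cross_entropy(logits, targets)

# Toy usage: one batch drawn from the replay memory.
mem_x = torch.randn(16, 3, 32, 32)
mem_y = torch.randint(0, 10, (16,))
optimizer.zero_grad()
loss = perturbed_replay_loss(mem_x, mem_y)
loss.backward()
optimizer.step()
```

In a full training loop this replay loss would be added to the loss on the current task's batch; the point of the sketch is only that both perturbed copies of each memory representation are trained with the original label, which enlarges the variation seen by the classifier.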


Related research

Few-Shot Self Reminder to Overcome Catastrophic Forgetting (12/03/2018)
Deep neural networks are known to suffer the catastrophic forgetting pro...

A Neural Network Model of Continual Learning with Cognitive Control (02/09/2022)
Neural networks struggle in continual learning settings from catastrophi...

Continual learning with hypernetworks (06/03/2019)
Artificial neural networks suffer from catastrophic forgetting when they...

Topological Continual Learning with Wasserstein Distance and Barycenter (10/06/2022)
Continual learning in neural networks suffers from a phenomenon called c...

Advancing continual lifelong learning in neural information retrieval: definition, dataset, framework, and empirical evaluation (08/16/2023)
Continual learning refers to the capability of a machine learning model ...

Cooperative data-driven modeling (11/23/2022)
Data-driven modeling in mechanics is evolving rapidly based on recent ma...

Measuring Information Transfer in Neural Networks (09/16/2020)
Estimation of the information content in a neural network model can be p...
