Continual Learning from the Perspective of Compression

06/26/2020
by Xu He, et al.

Connectionist models such as neural networks suffer from catastrophic forgetting. In this work, we study this problem from the perspective of information theory and define forgetting as the increase of description lengths of previous data when they are compressed with a sequentially learned model. In addition, we show that continual learning approaches based on variational posterior approximation and generative replay can be considered as approximations to two prequential coding methods in compression, namely, the Bayesian mixture code and maximum likelihood (ML) plug-in code. We compare these approaches in terms of both compression and forgetting and empirically study the reasons that limit the performance of continual learning methods based on variational posterior approximation. To address these limitations, we propose a new continual learning method that combines ML plug-in and Bayesian mixture codes.
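To make the paper's definition of forgetting concrete, the following is a minimal, hypothetical sketch (not the authors' code): a toy maximum-likelihood plug-in model is fitted sequentially on two tasks, and forgetting of the first task is measured as the increase in its description length (code length in bits) under the later model. The toy categorical data, the Laplace smoothing, and all function names are illustrative assumptions.

```python
# Sketch: forgetting measured as the increase in description length of
# earlier data after further sequential training (toy example, not the
# paper's implementation).

import numpy as np

def description_length_bits(data, probs):
    """Code length (in bits) of `data` under a categorical model `probs`."""
    return float(-np.sum(np.log2(probs[data])))

def ml_plugin_fit(data, num_symbols, alpha=1.0):
    """Maximum-likelihood plug-in estimate with Laplace smoothing."""
    counts = np.bincount(data, minlength=num_symbols).astype(float) + alpha
    return counts / counts.sum()

rng = np.random.default_rng(0)
num_symbols = 4

# Two "tasks" with different symbol statistics.
task1 = rng.choice(num_symbols, size=1000, p=[0.7, 0.1, 0.1, 0.1])
task2 = rng.choice(num_symbols, size=1000, p=[0.1, 0.1, 0.1, 0.7])

# Model after task 1, then after naively continuing on task 2 only.
model_after_t1 = ml_plugin_fit(task1, num_symbols)
model_after_t2 = ml_plugin_fit(task2, num_symbols)  # loses task 1 statistics

len_before = description_length_bits(task1, model_after_t1)
len_after = description_length_bits(task1, model_after_t2)

# Forgetting of task 1 = increase in its description length (in bits).
print(f"Task 1 code length after task 1: {len_before:.1f} bits")
print(f"Task 1 code length after task 2: {len_after:.1f} bits")
print(f"Forgetting (increase): {len_after - len_before:.1f} bits")
```

A neural-network version would follow the same pattern, with the per-symbol probabilities replaced by the model's predictive likelihoods on the earlier task's data.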
