Generative Kernel Continual Learning

Kernel continual learning by <cit.> has recently emerged as a strong continual learner due to its non-parametric ability to tackle task interference and catastrophic forgetting. Unfortunately, its success comes at the expense of an explicit memory that stores samples from past tasks, which hampers scalability to continual learning settings with a large number of tasks. In this paper, we introduce generative kernel continual learning, which explores and exploits the synergies between generative models and kernels for continual learning. The generative model produces representative samples for kernel learning, which removes the dependence on memory in kernel continual learning. Moreover, as we replay only through the generative model, we avoid task interference while being computationally more efficient than previous methods that need to replay through the entire model. We further introduce a supervised contrastive regularization, which enables our model to generate even more discriminative samples for better kernel-based classification performance. We conduct extensive experiments on three widely-used continual learning benchmarks that demonstrate the abilities and benefits of our contributions. Most notably, on the challenging SplitCIFAR100 benchmark, with just a simple linear kernel we obtain the same accuracy as kernel continual learning with variational random features at one tenth of the memory, or a 10.1% accuracy gain for the same memory budget.
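
To make the two ingredients above concrete — a kernel classifier fit on generated samples instead of a stored memory, and a supervised contrastive regularizer — here is a minimal PyTorch sketch. It assumes kernel ridge regression as the kernel classifier and a Khosla-style supervised contrastive loss; the function names, the ridge strength `lam`, the temperature `tau`, and the `generator.sample` call are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def linear_kernel(a, b):
    """k(x, y) = <x, y>; the simple linear kernel used in the headline result."""
    return a @ b.t()

def fit_kernel_classifier(feats, labels, num_classes, lam=1e-3):
    """Closed-form kernel ridge regression on a (generated) coreset:
    alpha = (K + lam * I)^{-1} Y, with Y the one-hot labels.
    `lam` is an assumed regularization strength, not a value from the paper."""
    Y = F.one_hot(labels, num_classes).float()
    K = linear_kernel(feats, feats)
    eye = torch.eye(K.shape[0], dtype=K.dtype, device=K.device)
    return torch.linalg.solve(K + lam * eye, Y)

def kernel_predict(query_feats, support_feats, alpha):
    """Score queries against the coreset; argmax over columns gives the class."""
    return linear_kernel(query_feats, support_feats) @ alpha

def supervised_contrastive_loss(feats, labels, tau=0.1):
    """Supervised contrastive regularizer in the style of Khosla et al. (2020):
    same-class features are pulled together, different classes pushed apart.
    `tau` is an assumed temperature."""
    z = F.normalize(feats, dim=1)
    sim = z @ z.t() / tau
    n = len(labels)
    self_mask = torch.eye(n, dtype=torch.bool, device=feats.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    # log-softmax over all non-self pairs, then average over the positives
    logits = sim.masked_fill(self_mask, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    return -(log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_count).mean()

# Usage sketch (hypothetical generator API): at inference, sample a
# pseudo-coreset from the generative model instead of reading real
# samples from an explicit memory.
# feats, labels = generator.sample(task_id=t, n=256)   # hypothetical call
# alpha = fit_kernel_classifier(feats, labels, num_classes)
# preds = kernel_predict(query_feats, feats, alpha).argmax(dim=1)
```

The closed-form ridge solve is what makes the classifier non-parametric per task: only the coreset and `alpha` are task-specific, and with a generative model the coreset itself no longer needs to be stored.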
