
Energy-Based Models for Continual Learning

by Shuang Li, et al.

We motivate Energy-Based Models (EBMs) as a promising model class for continual learning. Instead of tackling continual learning with external memory, growing models, or regularization, EBMs naturally support a dynamically growing number of tasks or classes while causing less interference with previously learned information. We find that EBMs outperform baseline methods by a large margin on several continual learning benchmarks. We also show that EBMs adapt to a more general continual learning setting in which the data distribution changes without explicitly delineated tasks. These observations point to EBMs as a class of models naturally suited to the continual learning regime.
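To make the claim about dynamically growing classes concrete, here is a minimal sketch (not the paper's actual model; all names and the dot-product energy are illustrative assumptions) of how an EBM classifier can admit new classes without resizing any output layer: each class is scored by an energy E(x, y), and adding a class only introduces a new label embedding, leaving existing ones untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8  # feature dimension (illustrative)

def f(x):
    # stand-in feature extractor (identity here for simplicity)
    return x

# class id -> label embedding g(y); the energy couples input and label
label_embeddings = {}

def add_class(y):
    """Register a new class by creating its label embedding."""
    label_embeddings[y] = rng.normal(size=D)

def energy(x, y):
    # hypothetical energy: E(x, y) = -f(x) . g(y)
    return -f(x) @ label_embeddings[y]

def predict(x):
    # prediction is the currently known class with lowest energy
    return min(label_embeddings, key=lambda y: energy(x, y))

# start with two classes
add_class(0)
add_class(1)

# later in the task sequence a third class arrives: no weight matrix is
# resized, and the embeddings of classes 0 and 1 are not modified
add_class(2)

x = rng.normal(size=D)
print(predict(x))
```

By contrast, a softmax head has a fixed output dimension and a normalizer that couples all classes, so growing the label set requires surgery on the final layer; here the class set is just the set of registered embeddings.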



