On Sequential Bayesian Inference for Continual Learning

01/04/2023
by Samuel Kessler, et al.

Sequential Bayesian inference can be used for continual learning to prevent catastrophic forgetting of past tasks and to provide an informative prior when learning new tasks. We revisit sequential Bayesian inference and test whether having access to the true posterior is guaranteed to prevent catastrophic forgetting in Bayesian neural networks. To do this, we perform sequential Bayesian inference using Hamiltonian Monte Carlo and propagate the posterior as a prior for new tasks by fitting a density estimator on Hamiltonian Monte Carlo samples. We find that this approach fails to prevent catastrophic forgetting, demonstrating the difficulty of performing sequential Bayesian inference in neural networks. From there, we study simple analytical examples of sequential Bayesian inference and continual learning, and highlight the issue of model misspecification, which can lead to sub-optimal continual learning performance despite exact inference. Furthermore, we discuss how task data imbalances can cause forgetting. Given these limitations, we argue that we need probabilistic models of the continual learning generative process rather than relying on sequential Bayesian inference over Bayesian neural network weights. In this vein, we also propose a simple baseline called Prototypical Bayesian Continual Learning, which is competitive with state-of-the-art Bayesian continual learning methods on class-incremental continual learning vision benchmarks.
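
The procedure described in the abstract follows the standard Bayesian recursion p(theta | D_1:t) proportional to p(D_t | theta) p(theta | D_1:t-1): sample the posterior for the current task, fit a density estimator to those samples, and reuse the fit as the prior for the next task. Below is a minimal, illustrative sketch of that recursion on a toy one-dimensional Gaussian model; it uses a random-walk Metropolis sampler in place of Hamiltonian Monte Carlo and a single Gaussian fit as the density estimator, so it is an assumption-laden stand-in rather than the paper's actual method or code.

```python
import numpy as np

# Minimal sketch (not the paper's code): sequential Bayesian inference for the
# mean of a 1D Gaussian across two "tasks". The posterior after each task is
# approximated by a density estimator (here, a single Gaussian fit to the
# samples) and reused as the prior for the next task. A random-walk Metropolis
# sampler stands in for Hamiltonian Monte Carlo.

rng = np.random.default_rng(0)

def log_likelihood(theta, data, noise_std=1.0):
    # Gaussian observation model with known noise scale.
    return -0.5 * np.sum((data - theta) ** 2) / noise_std ** 2

def metropolis_samples(log_post, init, n_samples=6000, n_burn=1000, step=0.3):
    # Simple random-walk Metropolis sampler (HMC stand-in).
    theta, lp, samples = init, log_post(init), []
    for _ in range(n_samples):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples[n_burn:])

# Prior for the first task: a standard normal over theta.
prior_mean, prior_std = 0.0, 1.0

for task, true_mean in enumerate([1.0, -2.0], start=1):
    data = true_mean + rng.standard_normal(50)

    def log_post(theta, m=prior_mean, s=prior_std, d=data):
        # log p(theta | D_1:t) up to a constant: likelihood of the current
        # task's data plus the propagated prior from previous tasks.
        return log_likelihood(theta, d) - 0.5 * ((theta - m) / s) ** 2

    samples = metropolis_samples(log_post, init=prior_mean)

    # "Density estimation" on posterior samples: fit a Gaussian and carry it
    # forward as the prior for the next task.
    prior_mean, prior_std = samples.mean(), samples.std()
    print(f"task {task}: posterior mean {prior_mean:+.2f}, std {prior_std:.2f}")
```

In the paper's setting theta is the weight vector of a Bayesian neural network and the density estimator is fit to Hamiltonian Monte Carlo samples; even there, the abstract reports that propagating the approximate posterior in this way fails to prevent catastrophic forgetting.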

Related research

10/29/2017 - Variational Continual Learning
This paper develops variational continual learning (VCL), a simple but g...

12/03/2019 - Overcoming Catastrophic Forgetting by Generative Regularization
In this paper, we propose a new method to overcome catastrophic forgetti...

08/22/2023 - Variational Density Propagation Continual Learning
Deep Neural Networks (DNNs) deployed to the real world are regularly sub...

05/28/2019 - Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition
We introduce a unified probabilistic approach for deep continual learnin...

04/21/2020 - Bayesian Nonparametric Weight Factorization for Continual Learning
Naively trained neural networks tend to experience catastrophic forgetti...

01/16/2021 - Bayesian Inference Forgetting
The right to be forgotten has been legislated in many countries but the ...

02/25/2020 - Training Binary Neural Networks using the Bayesian Learning Rule
Neural networks with binary weights are computation-efficient and hardwa...
