Uncertainty-guided Continual Learning with Bayesian Neural Networks

06/06/2019
by   Sayna Ebrahimi, et al.
Continual learning aims to learn new tasks without forgetting previously learned ones. This is especially challenging when one cannot access data from previous tasks and when the model has a fixed capacity. Current regularization-based continual learning algorithms need an external representation and extra computation to measure the parameters' importance. In contrast, we propose Uncertainty-guided Continual Bayesian Neural Networks (UCB), in which the learning rate adapts according to the uncertainty defined in the probability distribution over the weights of the network. Uncertainty is a natural way to identify what to remember and what to change as we continually learn, which helps mitigate catastrophic forgetting. We also show a variant of our model that uses uncertainty for weight pruning and retains task performance after pruning by saving a binary mask per task. We evaluate our UCB approach extensively on diverse object classification datasets with short and long sequences of tasks and report superior or on-par performance compared to existing approaches. Additionally, we show that our model does not necessarily need task information at test time, i.e., it does not presume knowledge of which task a sample belongs to.
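The core idea described above, scaling each weight's learning rate by its posterior uncertainty so that confident (low-variance) weights change slowly, can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the variational posterior parameters (`mu`, `rho`), the softplus parameterization of the standard deviation, and the normalization of the per-weight learning rate are common Bayes-by-Backprop conventions assumed here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    # Standard softplus; keeps the per-weight std strictly positive.
    return np.log1p(np.exp(x))

# Variational posterior over 4 weights: mean mu, std sigma = softplus(rho).
mu = rng.normal(0.0, 0.1, size=4)
rho = np.array([-4.0, -3.0, -2.0, -1.0])  # increasingly uncertain weights
sigma = softplus(rho)                      # per-weight uncertainty

# UCB-style adaptation (sketch): a weight's importance is inversely related
# to its uncertainty, so its effective learning rate is scaled by sigma.
base_lr = 0.1
lr = base_lr * sigma / sigma.max()

# One gradient step on the posterior means with the adapted learning rates.
grad = np.ones(4)  # placeholder gradient of the loss w.r.t. mu
mu_updated = mu - lr * grad
```

In this sketch the most certain weight (smallest `sigma`) receives the smallest update, which is how uncertainty identifies "what to remember": parameters the model is confident about are protected from being overwritten by new tasks.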


