Meta-learnt priors slow down catastrophic forgetting in neural networks

09/09/2019
by Giacomo Spigler, et al.

Current training regimes for deep learning usually involve exposure to a single task or dataset at a time. We start from the observation that, in this setting, the trained model is given no knowledge of anything outside its single training distribution, and therefore has no way to learn parameters (i.e., feature detectors or policies) that could help solve other tasks or limit future interference with the knowledge it has already acquired, and thus catastrophic forgetting. We show that catastrophic forgetting can be mitigated in a meta-learning context by exposing a neural network to multiple tasks sequentially during training. Finally, we present SeqFOMAML, a meta-learning algorithm that implements these principles, and we evaluate it on a sequential learning problem composed of Omniglot classification tasks.
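
For concreteness, the sketch below shows what a first-order MAML-style meta-update over a sequence of tasks could look like in PyTorch. This is not the paper's implementation: the function and argument names (seq_fomaml_step, task_sequence, inner_lr, outer_lr, inner_steps) and the choice to compute the outer loss on the query sets of all tasks in the sequence are illustrative assumptions.

```python
import copy
import torch


def seq_fomaml_step(meta_model, task_sequence, loss_fn,
                    inner_lr=0.01, outer_lr=0.001, inner_steps=5):
    """One first-order meta-update over a sequence of tasks.

    task_sequence is a list of (support_x, support_y, query_x, query_y)
    tuples, presented one after the other so that the inner loop
    experiences sequential learning. All hyperparameter values here are
    placeholders.
    """
    # Clone the meta-parameters; only the clone is adapted in the inner loop.
    adapted = copy.deepcopy(meta_model)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)

    # Inner loop: plain SGD on each task's support set, one task at a time.
    for support_x, support_y, _, _ in task_sequence:
        for _ in range(inner_steps):
            inner_opt.zero_grad()
            loss_fn(adapted(support_x), support_y).backward()
            inner_opt.step()

    # Outer loss: evaluate the adapted model on the query sets of all tasks
    # in the sequence, so that earlier tasks must not have been forgotten.
    adapted.zero_grad()
    outer_loss = sum(loss_fn(adapted(qx), qy)
                     for _, _, qx, qy in task_sequence)
    outer_loss.backward()

    # First-order approximation: apply the gradient taken at the adapted
    # parameters directly to the meta-parameters, ignoring second-order
    # terms arising from the inner-loop adaptation.
    with torch.no_grad():
        for p_meta, p_adapted in zip(meta_model.parameters(),
                                     adapted.parameters()):
            p_meta -= outer_lr * p_adapted.grad
    return outer_loss.item()
```

In a meta-training loop one would call such a step repeatedly on task sequences sampled from, e.g., different Omniglot alphabets.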


Related research

04/30/2020  Bayesian Online Meta-Learning with Laplace Approximation
Neural networks are known to suffer from catastrophic forgetting when tr...

10/10/2019  Learning to Remember from a Multi-Task Teacher
Recent studies on catastrophic forgetting during sequential learning typ...

12/08/2020  Continual Adaptation of Visual Representations via Domain Randomization and Meta-learning
Most standard learning approaches lead to fragile models which are prone...

08/09/2021  Some thoughts on catastrophic forgetting and how to learn an algorithm
The work of McCloskey and Cohen popularized the concept of catastrophic ...

06/07/2023  Meta-Learning in Spiking Neural Networks with Reward-Modulated STDP
The human brain constantly learns and rapidly adapts to new situations b...

04/12/2018  Combating catastrophic forgetting with developmental compression
Generally intelligent agents exhibit successful behavior across problems...

01/07/2020  Frosting Weights for Better Continual Training
Training a neural network model can be a lifelong learning process and i...
