Bayesian Online Meta-Learning with Laplace Approximation

04/30/2020
by Pau Ching Yap, et al.

Neural networks are known to suffer from catastrophic forgetting when trained on sequential datasets. While there have been numerous attempts to solve this problem for large-scale supervised classification, little has been done to overcome catastrophic forgetting for few-shot classification problems. We demonstrate that the popular gradient-based few-shot meta-learning algorithm Model-Agnostic Meta-Learning (MAML) indeed suffers from catastrophic forgetting and introduce a Bayesian online meta-learning framework that tackles this problem. Our framework incorporates MAML into a Bayesian online learning algorithm with Laplace approximation. This framework enables few-shot classification on a range of sequentially arriving datasets with a single meta-learned model. The experimental evaluations demonstrate that our framework can effectively prevent forgetting in various few-shot classification settings compared to applying MAML sequentially.
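The core idea of Bayesian online learning with a Laplace approximation is that each task's posterior is approximated by a Gaussian centered at the task's solution, so knowledge from earlier tasks enters later training as a quadratic penalty whose precision accumulates per-task curvature. The sketch below illustrates this recursive update on toy 1-D quadratic losses; the helper name, the toy task parameters, and the diagonal (scalar) curvature are invented for illustration and are not the paper's MAML-based algorithm.

```python
import numpy as np

def laplace_online_update(mean, prec, task_mean, task_hess):
    """Combine a Gaussian posterior (mean, prec) with a new task whose
    Laplace approximation has mode task_mean and curvature task_hess.
    The result is the standard precision-weighted Gaussian product."""
    new_prec = prec + task_hess
    new_mean = (prec * mean + task_hess * task_mean) / new_prec
    return new_mean, new_prec

# Two toy 1-D tasks with quadratic losses 0.5 * h * (theta - m)**2,
# processed sequentially starting from a broad prior.
mean, prec = 0.0, 1e-3
for m, h in [(2.0, 4.0), (-1.0, 1.0)]:
    mean, prec = laplace_online_update(mean, prec, m, h)

print(mean, prec)
```

Because the posterior precision grows with each task's curvature, parameters that earlier tasks constrained strongly move little when new tasks arrive, which is the mechanism that mitigates catastrophic forgetting in this family of methods.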


Related research

09/09/2019 · Meta-learnt priors slow down catastrophic forgetting in neural networks
Current training regimes for deep learning usually involve exposure to a...

05/20/2018 · Online Structured Laplace Approximations For Overcoming Catastrophic Forgetting
We introduce the Kronecker factored online Laplace approximation for ove...

12/08/2020 · Continual Adaptation of Visual Representations via Domain Randomization and Meta-learning
Most standard learning approaches lead to fragile models which are prone...

09/16/2019 · AdaBoost-assisted Extreme Learning Machine for Efficient Online Sequential Classification
In this paper, we propose an AdaBoost-assisted extreme learning machine ...

05/22/2023 · Mitigating Catastrophic Forgetting for Few-Shot Spoken Word Classification Through Meta-Learning
We consider the problem of few-shot spoken word classification in a sett...

10/10/2019 · Learning to Remember from a Multi-Task Teacher
Recent studies on catastrophic forgetting during sequential learning typ...

05/15/2018 · Continuous Learning in a Hierarchical Multiscale Neural Network
We reformulate the problem of encoding a multi-scale representation of a...
