Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition

11/09/2021
by Kai Wang, et al.

Most meta-learning approaches assume the existence of a very large set of labeled data available for episodic meta-learning of base knowledge. This contrasts with the more realistic continual learning paradigm, in which data arrive incrementally in the form of tasks containing disjoint classes. In this paper we consider this problem of Incremental Meta-Learning (IML), in which classes are presented incrementally in discrete tasks. We propose an approach to IML, which we call Episodic Replay Distillation (ERD), that mixes classes from the current task with class exemplars from previous tasks when sampling episodes for meta-learning. These episodes are then used for knowledge distillation to minimize catastrophic forgetting. Experiments on four datasets demonstrate that ERD surpasses the state of the art. In particular, in the more challenging one-shot, long-task-sequence incremental meta-learning scenarios, we reduce the gap between IML and the joint-training upper bound by 3.5 to 10.1 points relative to the previous state-of-the-art method on Tiered-ImageNet / Mini-ImageNet / CIFAR100.
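The two ingredients described in the abstract, mixing exemplar classes from earlier tasks into each sampled episode and distilling from a snapshot of the previous-task model, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the `exemplar_ratio` parameter, and the temperature-softened cross-entropy form of the distillation loss are all assumptions made for the sketch.

```python
import math
import random

def sample_mixed_episode(current_classes, exemplar_classes,
                         n_way=5, exemplar_ratio=0.5, rng=None):
    """Pick the n_way classes for one meta-learning episode, mixing
    classes from the current task with exemplar classes retained from
    previous tasks (the episode-mixing idea behind ERD)."""
    rng = rng or random.Random(0)
    n_old = min(int(round(n_way * exemplar_ratio)), len(exemplar_classes))
    n_new = n_way - n_old
    # Support/query images for each chosen class would then be drawn from
    # the exemplar memory or the current task's training data.
    return rng.sample(exemplar_classes, n_old) + rng.sample(current_classes, n_new)

def softmax(logits, temperature=1.0):
    m = max(logits)
    exps = [math.exp((z - m) / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between temperature-softened teacher and student
    predictions on an episode; penalizing drift from the frozen
    previous-task model is what counteracts catastrophic forgetting."""
    p = softmax(teacher_logits, temperature)   # teacher = old-model snapshot
    q = softmax(student_logits, temperature)   # student = model being trained
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

In a training loop, each sampled episode would be scored with both the usual few-shot classification loss and `distillation_loss` against the frozen copy of the model from the previous task, with the two terms combined via a weighting hyperparameter.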


Related research

- Incremental Object Detection via Meta-Learning (03/17/2020)
- Online Continual Learning via the Meta-learning Update with Multi-scale Knowledge Distillation and Data Augmentation (09/12/2022)
- Knowledge Consolidation based Class Incremental Online Learning with Limited Data (06/12/2021)
- Mitigating Catastrophic Forgetting for Few-Shot Spoken Word Classification Through Meta-Learning (05/22/2023)
- Meta-Learning with Sparse Experience Replay for Lifelong Language Learning (09/10/2020)
- Confusable Learning for Large-class Few-Shot Classification (11/06/2020)
- Dataset Distillation using Neural Feature Regression (06/01/2022)
