Memory-Based Label-Text Tuning for Few-Shot Class-Incremental Learning

07/03/2022
by   Jinze Li, et al.

Few-shot class-incremental learning (FSCIL) focuses on designing learning algorithms that can continually learn a sequence of new tasks from only a few samples without forgetting old ones. The difficulty is that training on a sequence of limited data from new tasks leads to severe overfitting and causes the well-known catastrophic forgetting problem. Existing research mainly exploits image information, for example by storing image knowledge of previous tasks or restricting classifier updates, but it overlooks the informative and less noisy text information of class labels. In this work, we propose leveraging label-text information by adopting a memory prompt, which learns new data sequentially while storing previous knowledge. Furthermore, to optimize the memory prompt without undermining the stored knowledge, we propose a stimulation-based training strategy: it updates the memory prompt according to the image embedding stimulation, i.e., the distribution of the image embedding elements. Experiments show that our proposed method outperforms prior state-of-the-art approaches, significantly mitigating catastrophic forgetting and overfitting.
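To make the stimulation idea concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation): the "stimulation" is taken as a distribution over the image-embedding elements, and only the most stimulated prompt dimensions are updated, so the untouched dimensions can keep previously stored knowledge. All names, shapes, and the top-fraction gating rule are illustrative assumptions.

```python
import numpy as np

def stimulation(image_emb):
    """Softmax over embedding elements: a distribution (the 'stimulation')."""
    e = np.exp(image_emb - image_emb.max())
    return e / e.sum()

def update_memory_prompt(prompt, grad, image_emb, lr=0.1, top_frac=0.25):
    """Apply the gradient only to the prompt dimensions most stimulated by
    the current image; the remaining dimensions stay unchanged."""
    s = stimulation(image_emb)
    k = max(1, int(top_frac * s.size))
    mask = np.zeros_like(s)
    mask[np.argsort(s)[-k:]] = 1.0          # keep the top-k stimulated elements
    return prompt - lr * grad * mask        # masked gradient step

# Toy usage: an 8-d prompt, a uniform gradient, and an image embedding that
# strongly stimulates dimensions 0 and 7.
prompt = np.zeros(8)
grad = np.ones(8)
img = np.array([3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0])
new_prompt = update_memory_prompt(prompt, grad, img)
# With top_frac=0.25 and 8 dimensions, only 2 dimensions are updated.
```

The masking is the point of the sketch: dimensions that the current image barely stimulates receive no update, which is one plausible way to keep new-task learning from overwriting old-task knowledge stored in the prompt.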


Related research

- 03/24/2023 · Two-level Graph Network for Few-Shot Class-Incremental Learning
- 06/12/2021 · Knowledge Consolidation based Class Incremental Online Learning with Limited Data
- 04/07/2021 · Few-Shot Incremental Learning with Continually Evolved Classifiers
- 05/30/2019 · A Hippocampus Model for Online One-Shot Storage of Pattern Sequences
- 12/16/2022 · MEIL-NeRF: Memory-Efficient Incremental Learning of Neural Radiance Fields
- 03/29/2021 · ClaRe: Practical Class Incremental Learning By Remembering Previous Class Representations
- 08/04/2020 · Memory Efficient Class-Incremental Learning for Image Classification
