
Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting

08/18/2021
by Anna Kukleva, et al.

Both generalized and incremental few-shot learning have to deal with three major challenges: learning novel classes from only a few samples per class, preventing catastrophic forgetting of base classes, and calibrating the classifier across novel and base classes. In this work we propose a three-stage framework that addresses these challenges explicitly and effectively. The first phase learns the base classes from many samples; the second phase learns a calibrated classifier for the novel classes from few samples while also preventing catastrophic forgetting; in the final phase, calibration is achieved across all classes. We evaluate the proposed framework on four challenging benchmark datasets for image and video few-shot classification and obtain state-of-the-art results for both generalized and incremental few-shot learning.
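To make the three-phase structure concrete, the following is a minimal PyTorch sketch of one possible realization of such a pipeline, not the authors' implementation: stage 1 trains the backbone and base classifier on abundant data, stage 2 fits a separate novel-class head on few samples with the backbone frozen (one simple way to avoid forgetting), and stage 3 learns scalar scale/bias terms to calibrate the joint base-plus-novel logits. All function names, hyperparameters, and the specific calibration scheme are illustrative assumptions.

```python
# Hypothetical sketch of a three-stage (base / novel / calibration) pipeline.
# Not the paper's method; names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def train_base(backbone, base_head, base_loader, epochs=90, lr=0.1):
    """Stage 1: learn the backbone and base classifier from many samples."""
    params = list(backbone.parameters()) + list(base_head.parameters())
    opt = torch.optim.SGD(params, lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in base_loader:
            loss = F.cross_entropy(base_head(backbone(x)), y)
            opt.zero_grad()
            loss.backward()
            opt.step()


def train_novel(backbone, novel_head, novel_loader, epochs=100, lr=0.01):
    """Stage 2: fit a novel-class head on few samples.

    The backbone stays frozen, so base-class representations cannot be
    overwritten -- a simple guard against catastrophic forgetting.
    """
    backbone.eval()
    for p in backbone.parameters():
        p.requires_grad_(False)
    opt = torch.optim.SGD(novel_head.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in novel_loader:
            with torch.no_grad():
                feats = backbone(x)
            loss = F.cross_entropy(novel_head(feats), y)
            opt.zero_grad()
            loss.backward()
            opt.step()


def calibrate(backbone, base_head, novel_head, val_loader, epochs=10, lr=0.01):
    """Stage 3: learn scale/bias terms that align base and novel logits.

    Only the two calibration scalars are trained, on data whose labels are
    indexed jointly over base + novel classes.
    """
    scale = nn.Parameter(torch.ones(1))
    bias = nn.Parameter(torch.zeros(1))
    opt = torch.optim.SGD([scale, bias], lr=lr)
    backbone.eval()
    for _ in range(epochs):
        for x, y in val_loader:
            with torch.no_grad():
                feats = backbone(x)
                base_logits = base_head(feats)
                novel_logits = novel_head(feats)
            joint = torch.cat([base_logits, scale * novel_logits + bias], dim=1)
            loss = F.cross_entropy(joint, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return scale.detach(), bias.detach()
```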

Related research

07/05/2023
S3C: Self-Supervised Stochastic Classifiers for Few-Shot Class-Incremental Learning

10/01/2022
Learnable Distribution Calibration for Few-Shot Class-Incremental Learning

06/18/2022
Demystifying the Base and Novel Performances for Few-shot Class-incremental Learning

10/27/2018
Low-shot Learning via Covariance-Preserving Adversarial Augmentation Networks

05/30/2022
Few-shot Class-incremental Learning for 3D Point Cloud Objects

02/06/2023
Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class Incremental Learning