Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class Incremental Learning

02/06/2023
by Yibo Yang, et al.

Few-shot class-incremental learning (FSCIL) is a challenging problem because only a few training samples are accessible for each novel class in the new sessions. Fine-tuning the backbone or adjusting the classifier prototypes trained in prior sessions inevitably misaligns the features and classifier prototypes of old classes, which underlies the well-known catastrophic forgetting problem. In this paper, we address this misalignment dilemma in FSCIL, inspired by the recently discovered phenomenon of neural collapse: the last-layer features of each class collapse to a single vertex, and the vertices of all classes align with the classifier prototypes, together forming a simplex equiangular tight frame (ETF). This geometry is optimal for classification because it maximizes the Fisher discriminant ratio. We propose a neural-collapse-inspired framework for FSCIL. A group of classifier prototypes is pre-assigned as a simplex ETF for the whole label space, covering the base session and all incremental sessions. During training, the classifier prototypes are kept fixed (not learnable), and a novel loss function drives the features toward their corresponding prototypes. Theoretical analysis shows that our method preserves the neural collapse optimality and does not break the feature-classifier alignment across incremental sessions. Experiments on the miniImageNet, CUB-200, and CIFAR-100 datasets demonstrate that our framework outperforms state-of-the-art methods. Code address: https://github.com/NeuralCollapseApplications/FSCIL

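To make the construction concrete, the sketch below shows one way the framework described in the abstract could be instantiated in PyTorch: a simplex ETF is generated once for the full label space and kept frozen, and a dot-regression-style loss pulls each L2-normalized feature toward its pre-assigned prototype. The dimensions (512-dimensional features, 100 total classes) and the exact loss form are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (assumed PyTorch-style code, not the authors' release):
# fix classifier prototypes as a simplex ETF over the full label space and
# train the backbone so normalized features regress onto their prototypes.
import math
import torch
import torch.nn.functional as F


def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Columns form a simplex ETF: unit norm, pairwise cosine -1/(K-1).

    Requires feat_dim >= num_classes so an orthonormal basis exists.
    """
    U, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))  # orthonormal columns
    center = torch.eye(num_classes) - torch.ones(num_classes, num_classes) / num_classes
    return math.sqrt(num_classes / (num_classes - 1)) * U @ center  # (feat_dim, K)


# Prototypes for the WHOLE label space (base + all incremental sessions),
# pre-assigned once and never updated during training.
prototypes = simplex_etf(num_classes=100, feat_dim=512)
prototypes.requires_grad_(False)


def alignment_loss(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Dot-regression-style loss pulling each normalized feature onto its fixed prototype."""
    f = F.normalize(features, dim=1)          # (B, feat_dim), unit-norm features
    target = prototypes[:, labels].t()        # (B, feat_dim), fixed ETF targets
    return 0.5 * ((f * target).sum(dim=1) - 1.0).pow(2).mean()
```

Because the prototypes are pre-assigned and never updated, incremental sessions cannot rotate or shift the classifier of old classes, which is what keeps the feature-classifier alignment intact as new classes arrive.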
