Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants

08/03/2023
by   Yibo Yang, et al.

How to enable learnability for new classes while preserving performance on old classes has been a crucial challenge for class incremental learning (CIL). Beyond the normal case, long-tail class incremental learning (LTCIL) and few-shot class incremental learning (FSCIL) have also been proposed to account for data imbalance and data scarcity, respectively, both of which are common in real-world deployments and further exacerbate the well-known problem of catastrophic forgetting. Existing methods target only one of these three tasks. In this paper, we offer a unified solution to the misalignment dilemma underlying all three. Concretely, we propose the neural collapse terminus: a fixed structure with maximal equiangular inter-class separation spanning the whole label space. It serves as a consistent target throughout incremental training, avoiding the need to divide the feature space session by session. For CIL and LTCIL, we further propose a prototype evolving scheme that drives the backbone features toward the neural collapse terminus smoothly. Our method also works for FSCIL with only minor adaptations. Theoretical analysis indicates that our method preserves neural collapse optimality in an incremental fashion regardless of data imbalance or data scarcity. We also design a generalized setting, in which neither the total number of classes nor the data distribution of each coming session (normal, long-tail, or few-shot) is known in advance, to test the generalizability of our method. Extensive experiments on multiple datasets demonstrate the effectiveness of our unified solution on all three tasks and the generalized setting.
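The "fixed structure with maximal equiangular inter-class separation" described above is, in the neural collapse literature, a simplex equiangular tight frame (ETF): K unit-norm class prototypes whose pairwise cosine similarity is the minimal possible value -1/(K-1). The sketch below, which is an illustration based on the standard ETF construction rather than the authors' released code, builds such a fixed classifier for the whole label space; the function name `simplex_etf` and the random-orthogonal-basis step are assumptions of this sketch.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Fixed classifier prototypes forming a simplex equiangular tight frame.

    Returns a (feat_dim, num_classes) matrix M whose columns are unit-norm
    and whose pairwise cosine similarity is exactly -1/(num_classes - 1),
    the maximal equiangular separation achievable for K directions.
    """
    assert feat_dim >= num_classes, "this construction needs feat_dim >= K"
    rng = np.random.default_rng(seed)

    # Random partial-orthogonal matrix U (feat_dim x K) via SVD: U^T U = I_K.
    a = rng.standard_normal((feat_dim, num_classes))
    u, _, vt = np.linalg.svd(a, full_matrices=False)
    U = u @ vt

    # M = sqrt(K/(K-1)) * U * (I_K - (1/K) 11^T): center, then rescale to unit norm.
    K = num_classes
    center = np.eye(K) - np.ones((K, K)) / K
    return np.sqrt(K / (K - 1)) * U @ center
```

Because the prototypes are fixed before training begins and already cover the full label space, each incremental session only needs to align new-class features with their pre-assigned prototypes, rather than re-partitioning the feature space; this is what makes the target consistent across sessions irrespective of how many samples each class provides.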


Related research

- 04/02/2020 — Learning to Segment the Tail: Real-world visual recognition requires handling the extreme sample imbal...
- 02/06/2023 — Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class Incremental Learning: Few-shot class-incremental learning (FSCIL) has been a challenging probl...
- 06/30/2022 — Multi-Granularity Regularized Re-Balancing for Class Incremental Learning: Deep learning models suffer from catastrophic forgetting when learning n...
- 05/20/2019 — Label Mapping Neural Networks with Response Consolidation for Class Incremental Learning: Class incremental learning refers to a special multi-class classificatio...
- 06/28/2020 — Few-Shot Class-Incremental Learning via Feature Space Composition: As a challenging problem in machine learning, few-shot class-incremental...
- 05/27/2022 — Geometer: Graph Few-Shot Class-Incremental Learning via Prototype Representation: With the tremendous expansion of graphs data, node classification shows ...
- 11/20/2020 — Sequential Targeting: an incremental learning approach for data imbalance in text classification: Classification tasks require a balanced distribution of data to ensure t...
