Constrained Few-shot Class-incremental Learning

03/30/2022
by Michael Hersche, et al.

Continually learning new classes from fresh data without forgetting previous knowledge of old classes is a very challenging research problem. Moreover, such learning must respect certain memory and computational constraints: (i) training samples are limited to only a few per class, (ii) the computational cost of learning a novel class remains constant, and (iii) the memory footprint of the model grows at most linearly with the number of classes observed. To meet these constraints, we propose C-FSCIL, which is architecturally composed of a frozen meta-learned feature extractor, a trainable fixed-size fully connected layer, and a rewritable, dynamically growing memory that stores as many vectors as the number of encountered classes. C-FSCIL provides three update modes that trade off accuracy against the compute and memory cost of learning novel classes. C-FSCIL exploits hyperdimensional embedding, which allows many more classes than the fixed dimensions of the vector space to be expressed continually with minimal interference. The quality of the class vector representations is further improved by aligning them quasi-orthogonally to one another by means of novel loss functions. Experiments on the CIFAR100, miniImageNet, and Omniglot datasets show that C-FSCIL outperforms the baselines with remarkable accuracy and compression. It also scales to the largest problem size ever tried in this few-shot setting, learning 423 novel classes on top of 1200 base classes with less than a 1.6% accuracy drop. Our code is available at https://github.com/IBM/constrained-FSCIL.
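The architecture lends itself to a compact sketch. Below is a minimal, PyTorch-style illustration of the averaged-prototype idea behind the cheapest of the three update modes: a frozen feature extractor feeds a trainable fully connected projection into a hyperdimensional embedding space, and the rewritable memory grows by one normalized mean vector per new class, with cosine similarity used for classification. The class and parameter names (e.g. `PrototypeMemoryClassifier`, `hd_dim`) and dimensions are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): prototype-based classification in a
# hyperdimensional embedding space, in the spirit of C-FSCIL's averaged-prototype mode.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PrototypeMemoryClassifier(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int, hd_dim: int = 512):
        super().__init__()
        self.backbone = backbone.eval()              # frozen meta-learned feature extractor
        for p in self.backbone.parameters():
            p.requires_grad_(False)
        self.project = nn.Linear(feat_dim, hd_dim)   # fixed-size fully connected layer
        # Rewritable memory: one prototype per class seen so far (grows linearly).
        self.register_buffer("prototypes", torch.empty(0, hd_dim))

    def embed(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            feats = self.backbone(x)
        return F.normalize(self.project(feats), dim=-1)

    @torch.no_grad()
    def add_classes(self, support_x: torch.Tensor, support_y: torch.Tensor) -> None:
        """Average the few-shot embeddings of each new class and append the
        normalized mean as that class's prototype (constant cost per class)."""
        emb = self.embed(support_x)
        for c in support_y.unique(sorted=True):
            proto = F.normalize(emb[support_y == c].mean(dim=0), dim=-1)
            self.prototypes = torch.cat([self.prototypes, proto[None]], dim=0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between query embeddings and all stored class prototypes.
        return self.embed(x) @ self.prototypes.t()


# Illustrative usage for one few-shot session:
#   model.add_classes(support_images, support_labels)
#   scores = model(query_images)   # shape: (batch, num_classes_seen)
```

The paper's higher-accuracy update modes go beyond this constant-cost case, further adapting the learned representations and, as the abstract states, aligning the class vectors quasi-orthogonally via novel loss functions; the sketch above covers only the averaged-prototype memory update.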

Related research

04/02/2023 · Learning with Fantasy: Semantic-Aware Virtual Contrastive Constraint for Few-Shot Class-Incremental Learning
03/24/2023 · Class-Incremental Exemplar Compression for Class-Incremental Learning
03/14/2022 · Forward Compatible Few-Shot Class-Incremental Learning
04/08/2023 · MASIL: Towards Maximum Separable Class Representation for Few Shot Class Incremental Learning
02/06/2023 · Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class Incremental Learning
06/11/2023 · Compositional Prototypical Networks for Few-Shot Classification
08/18/2023 · NAPA-VQ: Neighborhood Aware Prototype Augmentation with Vector Quantization for Continual Learning
