Exemplar-Based Contrastive Self-Supervised Learning with Few-Shot Class Incremental Learning
Humans are capable of learning new concepts from only a few (labeled) exemplars, incrementally and continually. This happens within a context in which we can differentiate among the exemplars, and between the exemplars and large amounts of other data (both unlabeled and labeled). This suggests that, in human learning, supervised learning of concepts from exemplars takes place within the larger context of contrastive self-supervised learning (CSSL) based on unlabeled and labeled data. We discuss extending CSSL (1) to rely mainly on exemplars and only secondarily on data augmentation, and (2) to apply to both unlabeled data (generally available in large amounts) and labeled data (a few exemplars carrying valuable supervised knowledge). A major benefit of these extensions is that exemplar-based CSSL, combined with supervised finetuning, supports few-shot class incremental learning (CIL). Specifically, we discuss three forms of exemplar-based CSSL: nearest-neighbor CSSL, neighborhood CSSL with supervised pretraining, and exemplar CSSL with supervised finetuning. We further discuss how exemplar-based CSSL can facilitate few-shot learning and, in particular, few-shot CIL.
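To make the nearest-neighbor variant of CSSL concrete, the following is a minimal sketch (not the paper's implementation) of an InfoNCE-style contrastive loss in which each anchor's positive is replaced by the nearest neighbor of its second view inside a support set of exemplar embeddings, in the spirit of NNCLR-style nearest-neighbor contrastive learning. All function names, shapes, and the temperature value are illustrative assumptions.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Project embeddings onto the unit sphere so dot products are cosine similarities.
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def nn_contrastive_loss(z1, z2, support, temperature=0.1):
    """InfoNCE-style loss where each anchor in z1 is paired with the
    nearest neighbor of its other view z2 drawn from a support set of
    exemplar embeddings (a sketch of nearest-neighbor CSSL)."""
    z1, z2, support = map(l2_normalize, (z1, z2, support))
    # For each view-2 embedding, find its nearest neighbor in the support set.
    sims = z2 @ support.T                      # (B, S) cosine similarities
    nn_pos = support[np.argmax(sims, axis=1)]  # (B, D) nearest-neighbor positives
    # Softmax over each anchor's similarities to all positives in the batch;
    # the diagonal entry is the anchor's own nearest-neighbor positive.
    logits = (z1 @ nn_pos.T) / temperature     # (B, B)
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z1 = rng.normal(size=(4, 8))        # view-1 embeddings of a small batch
z2 = rng.normal(size=(4, 8))        # view-2 embeddings of the same inputs
support = rng.normal(size=(16, 8))  # stored exemplar (support-set) embeddings
loss = nn_contrastive_loss(z1, z2, support)
print(float(loss))
```

Swapping the augmented positive for a support-set neighbor is what lets a few labeled exemplars steer the representation: the support set can be seeded with the exemplars of new classes as they arrive, which is the bridge to few-shot CIL discussed above.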