Shot in the Dark: Few-Shot Learning with No Base-Class Labels

10/06/2020
by Zitian Chen, et al.

Few-shot learning aims to learn classifiers for new objects from a small number of labeled examples. But it does not do this in a vacuum. Usually, a strong inductive bias is borrowed from the supervised learning of base classes. This inductive bias enables more statistically efficient learning of the new classes. In this work, we show that no labels are needed to develop such an inductive bias, and that self-supervised learning can provide a powerful inductive bias for few-shot learning. This is particularly effective when the unlabeled data for learning such a bias contains not only examples of the base classes, but also examples of the novel classes. The setting in which unlabeled examples of the novel classes are available is known as the transductive setting. Our method outperforms state-of-the-art few-shot learning methods, including other transductive learning methods, by 3.9% on miniImageNet without using any base-class labels. By benchmarking unlabeled-base-class (UBC) few-shot learning and UBC transductive few-shot learning, we demonstrate the great potential of self-supervised feature learning: self-supervision alone is sufficient to create a remarkably good inductive bias for few-shot learning. This motivates a rethinking of whether base-class labels are necessary at all for few-shot learning. We also explore the relationship between self-supervised features and supervised features, comparing both their transferability and their complementarity in the non-transductive setting. By combining supervised and self-supervised features learned from base classes, we also achieve a new state-of-the-art in the non-transductive setting, outperforming all previous methods.
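To make the setup concrete, here is a minimal sketch of a standard few-shot evaluation episode built on frozen features: a feature extractor embeds images, class centroids are computed from the few labeled support examples, and each query is assigned to its nearest centroid. The paper learns such features with self-supervision on unlabeled base-class images; the `embed` function below is a hypothetical stand-in for that frozen encoder so the sketch stays runnable.

```python
import numpy as np

def embed(x):
    # Hypothetical stand-in for a frozen self-supervised encoder;
    # in practice this would be a pretrained network's feature output.
    return np.asarray(x, dtype=float)

def nearest_centroid_episode(support_x, support_y, query_x):
    """Classify each query by its nearest class centroid in feature space."""
    feats = embed(support_x)
    labels = np.asarray(support_y)
    classes = np.unique(labels)
    # One centroid per class, averaged over its few support examples.
    centroids = np.stack([feats[labels == c].mean(axis=0) for c in classes])
    q = embed(query_x)
    # Euclidean distance from each query to each centroid.
    d = np.linalg.norm(q[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

# Tiny 2-way, 2-shot episode with toy 2-D "features".
support_x = [[0.0, 0.1], [0.1, 0.0],   # class 0 near the origin
             [1.0, 1.1], [1.1, 1.0]]   # class 1 near (1, 1)
support_y = [0, 0, 1, 1]
preds = nearest_centroid_episode(support_x, support_y,
                                 [[0.05, 0.05], [0.9, 1.0]])
print(preds.tolist())  # → [0, 1]
```

The quality of the inductive bias shows up entirely in `embed`: with good self-supervised features, even this label-free centroid rule classifies novel classes well.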


Related research

05/25/2021
Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images
Few-shot learning is a challenging task since only few instances are giv...

08/10/2020
Cooperative Bi-path Metric for Few-shot Learning
Given base classes with sufficient labeled samples, the target of few-sh...

11/18/2022
Weighted Ensemble Self-Supervised Learning
Ensembling has proven to be a powerful technique for boosting model perf...

03/10/2021
Multi-Pretext Attention Network for Few-shot Learning with Self-supervision
Few-shot learning is an interesting and challenging study, which enables...

12/26/2020
Few Shot Learning With No Labels
Few-shot learners aim to recognize new categories given only a small num...

04/16/2021
Pareto Self-Supervised Training for Few-Shot Learning
While few-shot learning (FSL) aims for rapid generalization to new conce...

11/15/2021
Multimodal Generalized Zero Shot Learning for Gleason Grading using Self-Supervised Learning
Gleason grading from histopathology images is essential for accurate pro...
