Are Fewer Labels Possible for Few-shot Learning?

12/10/2020
by Suichan Li, et al.

Few-shot learning is challenging due to its very limited data and labels. Recent studies in big transfer (BiT) show that few-shot learning can greatly benefit from pretraining on a large-scale labeled dataset in a different domain. This paper asks a more challenging question: "can we use as few labels as possible for few-shot learning, in both pretraining (with no labels) and finetuning (with fewer labels)?". Our key insight is that the clustering of target samples in the feature space is all we need for few-shot finetuning. This explains why vanilla unsupervised pretraining (poor clustering) is worse than supervised pretraining. In this paper, we propose transductive unsupervised pretraining, which achieves better clustering by involving the target data even though its amount is very limited. The improved clustering is of great value for identifying the most representative samples ("eigen-samples") for users to label, and in return, continued finetuning with the labeled eigen-samples further improves the clustering. We therefore propose eigen-finetuning, which enables fewer-shot learning by leveraging the co-evolution of clustering and eigen-samples during finetuning. We conduct experiments on 10 different few-shot target datasets, and our average few-shot performance outperforms both vanilla inductive unsupervised transfer and supervised transfer by a large margin. For instance, when each target category has only 10 labeled samples, the mean accuracy gain over the above two baselines is 9.2%.
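The abstract does not spell out how eigen-samples are selected; a minimal sketch of one plausible realization, assuming k-means clustering of the pretrained target features and taking the sample nearest each centroid as that cluster's "eigen-sample" (the names `select_eigen_samples` and `encoder` are hypothetical, not from the paper):

```python
# Hypothetical sketch: the paper's actual selection rule may differ.
# Cluster the target embeddings, then pick, for each cluster, the
# sample closest to its centroid as the "eigen-sample" to be labeled.
import numpy as np
from sklearn.cluster import KMeans

def select_eigen_samples(features: np.ndarray, n_clusters: int) -> np.ndarray:
    """Return indices of one representative target sample per cluster.

    features : (N, D) array of target-sample embeddings produced by a
               (transductively) pretrained encoder.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(features)
    eigen_idx = []
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        # distance of each member to its cluster centroid
        dists = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
        eigen_idx.append(members[np.argmin(dists)])
    return np.array(eigen_idx)

# Usage (hypothetical encoder): label only the returned samples,
# finetune, re-embed, and repeat -- the co-evolution loop the abstract
# calls "eigen-finetuning".
# feats = encoder(target_images)                       # (N, D)
# to_label = select_eigen_samples(feats, n_clusters=num_target_classes)
```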


Related research

07/26/2021 - Improve Unsupervised Pretraining for Few-label Transfer
Unsupervised pretraining has achieved great success and many recent work...

10/07/2022 - Unsupervised Few-shot Learning via Deep Laplacian Eigenmaps
Learning a new task from a handful of examples remains an open challenge...

06/21/2022 - Few-Max: Few-Shot Domain Adaptation for Unsupervised Contrastive Representation Learning
Contrastive self-supervised learning methods learn to map data points su...

02/26/2019 - Assume, Augment and Learn: Unsupervised Few-Shot Meta-Learning via Random Labels and Data Augmentation
The field of few-shot learning has been laboriously explored in the supe...

07/22/2021 - External-Memory Networks for Low-Shot Learning of Targets in Forward-Looking-Sonar Imagery
We propose a memory-based framework for real-time, data-efficient target...

02/24/2021 - Enabling the Network to Surf the Internet
Few-shot learning is challenging due to the limited data and labels. Exi...

04/13/2023 - LSFSL: Leveraging Shape Information in Few-shot Learning
Few-shot learning (FSL) techniques seek to learn the underlying patterns...
