Improve Unsupervised Pretraining for Few-label Transfer

07/26/2021
by Suichan Li, et al.

Unsupervised pretraining has achieved great success, and many recent works have shown that it can achieve comparable or even slightly better transfer performance than supervised pretraining on downstream target datasets. In this paper, however, we find this conclusion may not hold when the target dataset has very few labeled samples for finetuning, i.e., few-label transfer. We analyze the possible reason from the clustering perspective: 1) the clustering quality of target samples is of great importance to few-label transfer; 2) although contrastive learning is essential for learning how to cluster, its clustering quality is still inferior to that of supervised pretraining due to the lack of label supervision. Based on this analysis, we interestingly discover that merely involving some unlabeled target data in the unsupervised pretraining can improve the clustering quality and thereby reduce the transfer performance gap with supervised pretraining. This finding also motivates us to propose a new progressive few-label transfer algorithm for real applications, which aims to maximize the transfer performance under a limited annotation budget. To support our analysis and the proposed method, we conduct extensive experiments on nine different target datasets. Experimental results show that our method can significantly boost the few-label transfer performance of unsupervised pretraining.
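The key observation, that mixing some unlabeled target-domain data into the unsupervised pretraining pool improves the clustering quality of target features, amounts in practice to changing the pretraining dataset. Below is a minimal PyTorch sketch of that data-mixing step; the directory paths, augmentation choices, and batch size are illustrative assumptions, and the contrastive training loop itself (e.g., a MoCo- or SimCLR-style objective over two augmented views) is omitted.

```python
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms

# Contrastive-style augmentation; a full MoCo/SimCLR setup would produce
# two augmented views per image, which is omitted here for brevity.
aug = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

# Hypothetical paths: a large unlabeled source corpus plus the unlabeled
# target-domain images (a single dummy class subfolder suffices, since no
# labels are used during pretraining).
source = datasets.ImageFolder("data/source_unlabeled", transform=aug)
target = datasets.ImageFolder("data/target_unlabeled", transform=aug)

# The point of the paper's finding: adding even a modest amount of unlabeled
# target data to the pretraining pool improves the clustering quality of
# target features, narrowing the gap to supervised pretraining.
pretrain_set = ConcatDataset([source, target])
loader = DataLoader(pretrain_set, batch_size=256, shuffle=True, num_workers=8)
```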
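The abstract does not spell out the progressive algorithm, but its clustering analysis suggests a natural reading: spend the annotation budget on the most representative sample of each feature cluster, a few rounds at a time. The sketch below is our illustrative interpretation under that assumption, not the authors' exact procedure; `select_for_annotation` and the toy data are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_for_annotation(features, budget):
    """Pick `budget` representative target samples to send for labeling:
    cluster the features into `budget` groups and take the sample nearest
    each cluster center, so every annotation covers one cluster."""
    km = KMeans(n_clusters=budget, n_init=10).fit(features)
    picks = []
    for c in range(budget):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
        picks.append(members[int(np.argmin(dists))])
    return np.array(picks)

# Toy demo: 1000 unlabeled target samples with 128-dim features, standing in
# for embeddings from an unsupervised-pretrained encoder.
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 128)).astype(np.float32)
picks = select_for_annotation(features, budget=10)
print("Indices to annotate this round:", picks)

# A progressive scheme would repeat this: annotate `picks`, finetune the
# encoder on the labeled set, re-extract `features`, and select again, so
# each round's improved clustering guides the next round's selection.
```

The design choice this illustrates is why a progressive scheme can beat a one-shot selection: early in training the clustering is poor, so deferring part of the budget lets later rounds exploit the improved feature space.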


