Unsupervised Meta-Learning via Few-shot Pseudo-supervised Contrastive Learning

03/02/2023
by Huiwon Jang, et al.

Unsupervised meta-learning aims to learn generalizable knowledge across a distribution of tasks constructed from unlabeled data. The main challenge here is how to construct diverse tasks for meta-learning without label information; recent works have proposed, e.g., pseudo-labeling via pretrained representations or creating synthetic samples via generative models. However, such task construction strategies are fundamentally limited by their heavy reliance on immutable pseudo-labels during meta-learning and on the quality of the representations or generated samples. To overcome these limitations, we propose a simple yet effective unsupervised meta-learning framework, coined Pseudo-supervised Contrast (PsCo), for few-shot classification. Inspired by the recent self-supervised learning literature, PsCo utilizes a momentum network and a queue of previous batches to improve pseudo-labeling and construct diverse tasks in a progressive manner. Our extensive experiments demonstrate that PsCo outperforms existing unsupervised meta-learning methods on various in-domain and cross-domain few-shot classification benchmarks. We also validate that PsCo scales easily to a large-scale benchmark, while recent prior-art meta-learning schemes do not.
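The momentum-network-and-queue mechanism the abstract alludes to can be illustrated with a minimal sketch. This is an illustrative assumption based on MoCo-style self-supervised learning, not the paper's implementation: linear maps stand in for deep encoders, and pseudo-labeling is reduced to picking each query's top-k nearest neighbours in the queue as a pseudo-positive "support set"; all names (`MomentumQueue`, `pseudo_support`, `shots`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

class MomentumQueue:
    """Toy sketch of a momentum ("key") encoder plus a FIFO queue of past
    embeddings. The nearest-neighbour pseudo-labeling rule below is an
    illustrative stand-in for the paper's task-construction step."""

    def __init__(self, dim, queue_size, momentum=0.99):
        self.momentum = momentum
        # Linear "encoders" stand in for deep networks in this sketch.
        self.w_online = rng.standard_normal((dim, dim)) * 0.1
        self.w_key = self.w_online.copy()
        self.queue = l2_normalize(rng.standard_normal((queue_size, dim)))

    def update_key_encoder(self):
        # EMA update: the key encoder slowly tracks the online encoder.
        self.w_key = self.momentum * self.w_key + (1 - self.momentum) * self.w_online

    def enqueue(self, keys):
        # FIFO: newest embeddings replace the oldest queue entries.
        self.queue = np.concatenate([self.queue[len(keys):], keys])

    def pseudo_support(self, batch, shots=3):
        # Embed the batch with the key encoder, then pick the `shots`
        # most similar queue entries per query as its pseudo-positive
        # support set for a few-shot task.
        keys = l2_normalize(batch @ self.w_key)
        sims = keys @ self.queue.T                      # cosine similarity
        support_idx = np.argsort(-sims, axis=1)[:, :shots]
        self.enqueue(keys)
        return support_idx

mq = MomentumQueue(dim=16, queue_size=64)
batch = rng.standard_normal((8, 16))
idx = mq.pseudo_support(batch, shots=3)  # (8, 3) support indices
mq.update_key_encoder()
```

Because the queue is refreshed every batch and the key encoder drifts with the online encoder, the pseudo-support sets change progressively over training rather than being fixed up front, which is the property the abstract contrasts with immutable pseudo-labels.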


research
07/14/2022

Pseudo-Labeling Based Practical Semi-Supervised Meta-Training for Few-Shot Learning

Most existing few-shot learning (FSL) methods require a large amount of ...
research
07/11/2020

Coarse-to-Fine Pseudo-Labeling Guided Meta-Learning for Few-Shot Classification

To endow neural networks with the potential to learn rapidly from a hand...
research
04/21/2023

Task-Adaptive Pseudo Labeling for Transductive Meta-Learning

Meta-learning performs adaptation through a limited amount of support se...
research
09/27/2022

Rethinking Clustering-Based Pseudo-Labeling for Unsupervised Meta-Learning

The pioneering method for unsupervised meta-learning, CACTUs, is a clust...
research
08/27/2019

MetaMixUp: Learning Adaptive Interpolation Policy of MixUp with Meta-Learning

MixUp is an effective data augmentation method to regularize deep neural...
research
08/09/2021

The Role of Global Labels in Few-Shot Classification and How to Infer Them

Few-shot learning (FSL) is a central problem in meta-learning, where lea...
research
05/27/2021

ProtAugment: Unsupervised diverse short-texts paraphrasing for intent detection meta-learning

Recent research considers few-shot intent detection as a meta-learning p...
