Function Contrastive Learning of Transferable Representations

10/14/2020
by Muhammad Waleed Gondal, et al.

Few-shot learning seeks models that are capable of fast adaptation to novel tasks. Unlike typical few-shot learning algorithms, we propose a contrastive learning method that is not trained to solve a set of tasks, but rather attempts to find a good representation of the underlying data-generating processes (functions). This allows for finding representations that are useful for an entire series of tasks sharing the same function. In particular, our training scheme is driven by a self-supervision signal indicating whether two sets of samples stem from the same underlying function. Our experiments on a number of synthetic and real-world datasets show that the representations we obtain can outperform strong baselines in terms of downstream performance and noise robustness, even when these baselines are trained in an end-to-end manner.
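
The sketch below illustrates, in rough terms, the kind of self-supervised signal the abstract describes: two sample sets drawn from the same underlying function are treated as a positive pair, while sets from different functions serve as negatives for an InfoNCE-style contrastive loss. This is not the authors' code; the permutation-invariant set encoder, the sine-wave task generator, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of function-contrastive pretraining (assumptions, not the paper's code).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SetEncoder(nn.Module):
    """Permutation-invariant encoder: embed (x, y) pairs, then mean-pool over the set."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, xy):                  # xy: (batch, n_points, 2)
        return self.net(xy).mean(dim=1)     # (batch, dim)

def sample_function_sets(batch=32, n_points=16):
    """Draw a batch of random sine functions; return two independent sample sets per function."""
    amp = torch.rand(batch, 1, 1) * 2 + 0.5
    phase = torch.rand(batch, 1, 1) * math.pi
    def draw():
        x = torch.rand(batch, n_points, 1) * 10 - 5
        return torch.cat([x, amp * torch.sin(x + phase)], dim=-1)
    return draw(), draw()

def function_contrastive_loss(z1, z2, temperature=0.1):
    """InfoNCE: matching rows of z1 and z2 (same function) are positives, all others negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0))
    return F.cross_entropy(logits, targets)

encoder = SetEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
for step in range(100):
    s1, s2 = sample_function_sets()
    loss = function_contrastive_loss(encoder(s1), encoder(s2))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After such pretraining, the frozen set encoder could be used as a representation of the task (function) for downstream few-shot prediction; the specific downstream head is left out here.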

research · 08/23/2020 · Few-Shot Image Classification via Contrastive Self-Supervised Learning
Most previous few-shot learning algorithms are based on meta-training wi...

research · 05/03/2022 · Improving In-Context Few-Shot Learning via Self-Supervised Training
Self-supervised pretraining has made few-shot learning possible for many...

research · 02/08/2021 · End-to-end Generative Zero-shot Learning via Few-shot Learning
Contemporary state-of-the-art approaches to Zero-Shot Learning (ZSL) tra...

research · 09/30/2022 · Contrastive Graph Few-Shot Learning
Prevailing deep graph learning models often suffer from label sparsity i...

research · 07/15/2021 · Multi-Level Contrastive Learning for Few-Shot Problems
Contrastive learning is a discriminative approach that aims at grouping ...

research · 11/08/2022 · ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning
Prompt learning recently become an effective linguistic tool to motivate...

research · 10/14/2021 · Inverse Problems Leveraging Pre-trained Contrastive Representations
We study a new family of inverse problems for recovering representations...
