Multi-Pretext Attention Network for Few-shot Learning with Self-supervision

03/10/2021
by   Hainan Li, et al.

Few-shot learning is an interesting and challenging problem that enables machines to learn from a few samples, as humans do. Existing studies rarely exploit the auxiliary information available in large amounts of unlabeled data. Self-supervised learning has emerged as an efficient way to utilize unlabeled data. Existing self-supervised methods typically rely on combinations of geometric transformations applied to a single sample via augmentation, while largely neglecting the endogenous correlation information among different samples, which is equally important for the task. In this work, we propose Graph-driven Clustering (GC), a novel augmentation-free method for self-supervised learning that does not rely on any auxiliary sample and exploits the endogenous correlations among input samples. In addition, we propose the Multi-pretext Attention Network (MAN), which uses a dedicated attention mechanism to combine traditional augmentation-based methods with our GC, adaptively learning their optimal weights to improve performance and enabling the feature extractor to obtain more universal representations. We evaluate MAN extensively on the miniImageNet and tieredImageNet datasets, and the results demonstrate that the proposed method outperforms the relevant state-of-the-art (SOTA) methods.
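The attention-weighted combination of pretext tasks that the abstract describes might be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the task names, loss values, and attention scores below are all hypothetical, and the paper learns the scores jointly with the feature extractor rather than fixing them.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def combine_pretext_losses(losses, scores):
    """Combine per-pretext-task losses using attention weights
    obtained by softmax-normalizing learnable scores.

    Returns the weighted total loss and the weights themselves.
    """
    weights = softmax(np.asarray(scores, dtype=float))
    total = float(weights @ np.asarray(losses, dtype=float))
    return total, weights

# Hypothetical losses for three pretext tasks, e.g. rotation
# prediction, jigsaw, and graph-driven clustering (illustrative values).
losses = [0.9, 1.2, 0.7]
# Hypothetical learnable attention scores (in the paper these would be
# optimized end-to-end, not hand-set).
scores = [0.5, 0.1, 1.0]

total, w = combine_pretext_losses(losses, scores)
```

The softmax keeps the weights positive and summing to one, so the network can smoothly shift emphasis toward whichever pretext task is most useful for the downstream few-shot objective.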


