Is Support Set Diversity Necessary for Meta-Learning?

11/28/2020
by Amrith Setlur, et al.

Meta-learning is a popular framework for learning with limited data in which an algorithm is produced by training over multiple few-shot learning tasks. For classification problems, these tasks are typically constructed by sampling a small number of support and query examples from a subset of the classes. While conventional wisdom is that task diversity should improve the performance of meta-learning, in this work we find evidence to the contrary: we propose a modification to traditional meta-learning approaches in which we keep the support sets fixed across tasks, thus reducing task diversity. Surprisingly, we find that not only does this modification not result in adverse effects, it almost always improves the performance for a variety of datasets and meta-learning methods. We also provide several initial analyses to understand this phenomenon. Our work serves to: (i) more closely investigate the effect of support set construction for the problem of meta-learning, and (ii) suggest a simple, general, and competitive baseline for few-shot learning.
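
To make the core idea concrete, the sketch below contrasts a conventional N-way K-shot episode sampler with a fixed-support variant in the spirit of the abstract: the support examples for each class are drawn once and reused across all training tasks, so only the sampled classes and the query examples vary. This is an illustrative sketch, not the authors' implementation; the function names, the dataset layout (a dict mapping class labels to example lists), and the 5-way 1-shot defaults are assumptions.

```python
# Illustrative sketch (not the authors' code) of standard episodic sampling vs.
# the fixed-support-set variant described in the abstract. Dataset layout and
# parameter defaults below are assumptions for the example.
import random

def sample_standard_task(data_by_class, n_way=5, k_shot=1, q_query=15, rng=random):
    """Conventional episode: support AND query examples are re-drawn for every task."""
    classes = rng.sample(sorted(data_by_class), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = rng.sample(data_by_class[cls], k_shot + q_query)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

def make_fixed_support_sampler(data_by_class, n_way=5, k_shot=1, q_query=15, rng=random):
    """Fixed-support variant: draw one set of support examples per class up front,
    then reuse those examples in every training task; only the sampled class
    subset and the query examples change across tasks."""
    fixed_support = {cls: rng.sample(xs, k_shot) for cls, xs in data_by_class.items()}

    def sample_task():
        classes = rng.sample(sorted(data_by_class), n_way)
        support, query = [], []
        for label, cls in enumerate(classes):
            support += [(x, label) for x in fixed_support[cls]]
            remaining = [x for x in data_by_class[cls] if x not in fixed_support[cls]]
            query += [(x, label) for x in rng.sample(remaining, q_query)]
        return support, query

    return sample_task

# Example usage with toy data (integers standing in for images):
# data = {c: list(range(100 * c, 100 * c + 30)) for c in range(10)}
# sample_task = make_fixed_support_sampler(data)
# support, query = sample_task()
```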

Related research

01/27/2022 - The Effect of Diversity in Meta-Learning
02/23/2021 - Lessons from Chasing Few-Shot Learning Benchmarks: Rethinking the Evaluation of Meta-Learning Methods
10/30/2022 - Few-Shot Classification of Skin Lesions from Dermoscopic Images by Meta-Learning Representative Embeddings
11/30/2020 - Revisiting Unsupervised Meta-Learning: Amplifying or Compensating for the Characteristics of Few-Shot Tasks
07/18/2023 - Learning to Sample Tasks for Meta Learning
07/20/2022 - Bitwidth-Adaptive Quantization-Aware Neural Network Training: A Meta-Learning Approach
01/07/2021 - Few-Shot Learning with Class Imbalance
