The Curse of Low Task Diversity: On the Failure of Transfer Learning to Outperform MAML and Their Empirical Equivalence

08/02/2022
by Brando Miranda, et al.

Recently, it has been observed that a transfer learning solution might be all we need to solve many few-shot learning benchmarks, raising important questions about when and how meta-learning algorithms should be deployed. In this paper, we seek to clarify these questions by (1) proposing a novel metric, the diversity coefficient, to measure the diversity of tasks in a few-shot learning benchmark, and (2) comparing Model-Agnostic Meta-Learning (MAML) and transfer learning under fair conditions (same architecture, same optimizer, and all models trained to convergence). Using the diversity coefficient, we show that the popular MiniImageNet and CIFAR-FS few-shot learning benchmarks have low diversity. This novel insight contextualizes claims that transfer learning solutions are better than meta-learned solutions: they arise in the low-diversity regime under a fair comparison. Specifically, we find empirically that a low diversity coefficient correlates with a high similarity between transfer-learned and MAML-learned solutions, both in accuracy at meta-test time and in classification-layer similarity (measured with feature-based distance metrics such as SVCCA, PWCCA, CKA, and OPD). To further support our claim, we find that this equivalence in meta-test accuracy holds even as model size changes. Therefore, we conclude that in the low-diversity regime, MAML and transfer learning have equivalent meta-test performance when both are compared fairly. We also hope our work inspires more thoughtful constructions and quantitative evaluations of meta-learning benchmarks in the future.
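To make the two key quantities in the abstract concrete, the sketch below implements linear CKA (one of the feature-based similarity metrics the paper uses) and then an illustrative diversity score computed as the average pairwise CKA distance between per-task feature matrices. Note this is a simplified stand-in: the paper's actual diversity coefficient is defined over task embeddings with its own distance function, so the `diversity_score` helper here is a hypothetical illustration of the idea, not the paper's exact definition.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two representation
    matrices X (n x d1) and Y (n x d2) computed on the same n examples.
    Returns a value in [0, 1]; 1 means identical up to rotation/scale."""
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # CKA = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    cross = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, ord="fro")
    norm_y = np.linalg.norm(Y.T @ Y, ord="fro")
    return cross / (norm_x * norm_y)

def diversity_score(task_features):
    """Illustrative diversity measure: mean (1 - CKA) over all pairs of
    tasks, where each task is represented by a feature matrix extracted
    from a fixed probe network. Higher means more diverse tasks."""
    n = len(task_features)
    dists = [1.0 - linear_cka(task_features[i], task_features[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))
```

Under this sketch, a benchmark whose tasks all produce near-identical features (as the paper argues for MiniImageNet and CIFAR-FS) would yield a score near zero, which is the regime where MAML and transfer learning are observed to be equivalent.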

Related research

- The Curse of Zero Task Diversity: On the Failure of Transfer Learning to Outperform MAML and Their Empirical Equivalence (12/24/2021)
- Is Pre-training Truly Better Than Meta-Learning? (06/24/2023)
- Meta-free few-shot learning via representation learning with weight averaging (04/26/2022)
- Comparing Transfer and Meta Learning Approaches on a Unified Few-Shot Classification Benchmark (04/06/2021)
- Does MAML Only Work via Feature Re-use? A Data Centric Perspective (12/24/2021)
- DAMSL: Domain Agnostic Meta Score-based Learning (06/06/2021)
- Beyond Simple Meta-Learning: Multi-Purpose Models for Multi-Domain, Active and Continual Few-Shot Learning (01/13/2022)
