Meta-Learning of Neural Architectures for Few-Shot Learning

11/25/2019
by Thomas Elsken, et al.

Recent progress in neural architecture search (NAS) has made it possible to scale the automated design of neural architectures to real-world domains such as object detection and semantic segmentation. However, a prerequisite for applying NAS is a large amount of labeled data and substantial compute resources. This makes NAS hard to apply in few-shot learning scenarios, where many related tasks must be learned, each with limited data and compute time. Consequently, few-shot learning is typically done with a fixed neural architecture. To improve upon this, we propose MetaNAS, the first method that fully integrates NAS with gradient-based meta-learning. MetaNAS optimizes a meta-architecture along with the meta-weights during meta-training. During meta-testing, architectures can be adapted to a novel task with a few steps of the task optimizer; task adaptation thus becomes computationally cheap and requires only a small amount of data per task. Moreover, MetaNAS is agnostic in that it can be used with arbitrary model-agnostic meta-learning algorithms and arbitrary gradient-based NAS methods. Empirical results on standard few-shot classification benchmarks show that MetaNAS, instantiated with a combination of DARTS and REPTILE, yields state-of-the-art results.
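The core idea of meta-learning weights and architecture parameters together can be illustrated with a short sketch: a REPTILE-style outer update applied jointly to the network weights and DARTS-style architecture logits, so that a few inner gradient steps adapt both to a new task. This is a minimal illustration under assumed names and hyperparameters (MixedCell, TinyNet, sample_task, learning rates, synthetic tasks), not the authors' implementation.

```python
# Minimal sketch (not the MetaNAS reference code): REPTILE-style meta-update
# applied jointly to weights and DARTS-style architecture parameters.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedCell(nn.Module):
    """One DARTS-style mixed operation: softmax mixture over two candidate ops."""
    def __init__(self, dim):
        super().__init__()
        self.op_linear = nn.Linear(dim, dim)       # candidate op 1
        self.op_identity = nn.Identity()           # candidate op 2
        self.alpha = nn.Parameter(torch.zeros(2))  # architecture logits

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return w[0] * self.op_linear(x) + w[1] * self.op_identity(x)

class TinyNet(nn.Module):
    def __init__(self, dim=8, n_classes=5):
        super().__init__()
        self.cell = MixedCell(dim)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x):
        return self.head(torch.relu(self.cell(x)))

def sample_task(dim=8, n_classes=5, n_shots=5):
    """Synthetic few-shot task: random class prototypes plus noise (illustrative only)."""
    protos = torch.randn(n_classes, dim)
    x = protos.repeat_interleave(n_shots, dim=0) + 0.1 * torch.randn(n_classes * n_shots, dim)
    y = torch.arange(n_classes).repeat_interleave(n_shots)
    return x, y

meta_model = TinyNet()
meta_lr, task_lr, inner_steps = 0.1, 0.05, 5

for meta_step in range(100):
    # Clone meta-parameters (weights AND architecture logits) for task adaptation.
    task_model = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(task_model.parameters(), lr=task_lr)
    x, y = sample_task()
    for _ in range(inner_steps):                   # cheap task adaptation
        opt.zero_grad()
        F.cross_entropy(task_model(x), y).backward()
        opt.step()
    # REPTILE outer update: move meta-parameters toward the task-adapted parameters.
    with torch.no_grad():
        for p_meta, p_task in zip(meta_model.parameters(), task_model.parameters()):
            p_meta += meta_lr * (p_task - p_meta)
```

A first-order meta-learner such as REPTILE fits this setting because its outer update only needs the task-adapted parameter values, not second-order gradients, so architecture logits can be meta-learned with exactly the same update rule as the weights.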

Related research

- Across-Task Neural Architecture Search via Meta Learning (10/12/2021): Adequate labeled data and expensive compute resources are the prerequisi...
- Rapid Model Architecture Adaption for Meta-Learning (09/10/2021): Network Architecture Search (NAS) methods have recently gathered much at...
- MetAdapt: Meta-Learned Task-Adaptive Architecture for Few-Shot Classification (12/01/2019): Few-Shot Learning (FSL) is a topic of rapidly growing interest. Typicall...
- Meta Navigator: Search for a Good Adaptation Policy for Few-shot Learning (09/13/2021): Few-shot learning aims to adapt knowledge learned from previous tasks to...
- Auto-Meta: Automated Gradient Based Meta Learner Search (06/11/2018): Fully automating machine learning pipeline is one of the outstanding cha...
- Global Convergence of MAML and Theory-Inspired Neural Architecture Search for Few-Shot Learning (03/17/2022): Model-agnostic meta-learning (MAML) and its variants have become popular...
- Meta-Learning Initializations for Image Segmentation (12/13/2019): While meta-learning approaches that utilize neural network representatio...
