Auto-Meta: Automated Gradient Based Meta Learner Search

06/11/2018
by Jaehong Kim, et al.

Fully automating the machine learning pipeline is one of the outstanding challenges of general artificial intelligence, as practical machine learning often requires costly human-driven processes such as hyper-parameter tuning, algorithm selection, and model selection. In this work, we consider the problem of executing an automated, yet scalable, search for optimal gradient-based meta-learners in practice. As a solution, we apply progressive neural architecture search to proto-architectures, appealing to the model-agnostic nature of general gradient-based meta-learners. In light of the recent universality result of Finn et al. (arXiv:1710.11622), our search is a priori motivated: neural architecture search dynamics, automated or not, may differ substantially from those of the classical setting with the same target tasks, due to the presence of the gradient update operator. A posteriori, our search algorithm, given appropriately designed search spaces, finds gradient-based meta-learners with non-intuitive proto-architectures that are narrowly deep, unlike the inception-like structures previously observed in the architectures produced by traditional NAS algorithms. Along with these findings, the searched gradient-based meta-learner achieves state-of-the-art results on the few-shot classification problem on Mini-ImageNet with 76.29% accuracy, a 13.18% improvement over the results reported in the original MAML paper. To the best of our knowledge, this work is the first successful AutoML implementation in the context of meta-learning.
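For readers unfamiliar with the gradient-based meta-learning setup that each candidate proto-architecture is trained with, the following is a minimal sketch of a MAML-style inner/outer update. It illustrates the gradient update operator referred to above; it does not cover the progressive architecture search itself, and the names (`CandidateNet`, `sample_task`, `alpha`) and hyper-parameters are hypothetical placeholders rather than the authors' implementation.

```python
# Hypothetical sketch of a MAML-style update for a candidate proto-architecture.
# Placeholder names and hyper-parameters; not the authors' code.
import torch
import torch.nn.functional as F

def inner_adapt(model, support_x, support_y, alpha=0.01):
    """One inner gradient step on the support set; returns adapted ("fast") weights."""
    params = dict(model.named_parameters())
    loss = F.cross_entropy(model(support_x), support_y)
    grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
    # theta' = theta - alpha * grad(theta)
    return {name: p - alpha * g for (name, p), g in zip(params.items(), grads)}

def meta_step(model, optimizer, task_batch, alpha=0.01):
    """Outer update: average query-set loss of the adapted weights over a batch of tasks."""
    optimizer.zero_grad()
    meta_loss = 0.0
    for support_x, support_y, query_x, query_y in task_batch:
        fast_weights = inner_adapt(model, support_x, support_y, alpha)
        # Evaluate the candidate architecture with the adapted weights on the query set.
        logits = torch.func.functional_call(model, fast_weights, (query_x,))
        meta_loss = meta_loss + F.cross_entropy(logits, query_y)
    (meta_loss / len(task_batch)).backward()
    optimizer.step()
```

In an automated search of the kind described above, each candidate proto-architecture would be meta-trained with a loop like this before its few-shot validation accuracy is used to guide the search.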


