Inductive Transfer for Neural Architecture Optimization

03/08/2019
by Martin Wistuba, et al.

The recent advent of automated neural network architecture search has led to several methods that outperform state-of-the-art human-designed architectures. However, these approaches are computationally expensive, in extreme cases consuming GPU years. We propose two novel methods which aim to expedite this optimization by transferring knowledge acquired from previous tasks to new ones. First, we propose a novel neural architecture selection method which employs this knowledge to identify strong and weak characteristics of neural architectures across datasets. Thus, these characteristics do not need to be rediscovered in every search, which is a major weakness of current state-of-the-art methods. Second, we propose a method for learning curve extrapolation to determine whether a training process can be terminated early. In contrast to existing work, we propose to learn from learning curves of architectures trained on other datasets to improve the prediction accuracy for novel datasets. On five different image classification benchmarks, we empirically demonstrate that each of our two orthogonal contributions independently accelerates the search without any significant loss in accuracy.
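
The learning curve extrapolation idea from the abstract can be illustrated with a small sketch. The following is a hypothetical, minimal example (not the authors' actual model): a regressor is fitted on learning curves collected from architectures trained on other datasets and is then used to extrapolate the final accuracy of a new training run from its first few epochs, so that unpromising runs can be terminated early. All names, parameters, and the choice of regressor below are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact method): transfer-based learning curve
# extrapolation for early termination. A regressor is fit on partial learning
# curves from architectures trained on *other* datasets, then used to predict
# the final accuracy of a new run from its first observed epochs.
# Names such as `partial_epochs` and `should_terminate` are illustrative.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

partial_epochs = 5  # number of observed epochs used for extrapolation


def fit_curve_predictor(previous_curves):
    """previous_curves: list of full validation-accuracy curves (one per run)
    gathered on previously seen datasets. Returns a regressor mapping the
    first `partial_epochs` accuracies to the final accuracy."""
    X = np.array([c[:partial_epochs] for c in previous_curves])
    y = np.array([c[-1] for c in previous_curves])
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X, y)
    return model


def should_terminate(model, observed_curve, best_so_far, margin=0.0):
    """Stop the current run early if its extrapolated final accuracy is not
    expected to beat the best architecture found so far."""
    x = np.array(observed_curve[:partial_epochs]).reshape(1, -1)
    predicted_final = model.predict(x)[0]
    return predicted_final + margin < best_so_far


# Example usage with synthetic curves (saturating exponentials standing in
# for real validation-accuracy histories from other datasets).
rng = np.random.default_rng(0)
curves = [1 - np.exp(-rng.uniform(0.1, 0.5) * np.arange(1, 51)) for _ in range(200)]
predictor = fit_curve_predictor(curves)

new_run = list(1 - np.exp(-0.15 * np.arange(1, partial_epochs + 1)))
print(should_terminate(predictor, new_run, best_so_far=0.95))
```

Any curve model (probabilistic, ranking-based, or parametric) could replace the random forest here; the point the abstract makes is that the predictor is trained on curves from other datasets, so extrapolation is possible from only the first epochs of a run on a novel dataset.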
