Efficient Neural Architecture Search with Network Morphism

06/27/2018
by Haifeng Jin, et al.

While neural architecture search (NAS) has drawn increasing attention for automatically tuning deep neural networks, existing search algorithms usually suffer from high computational cost. Network morphism, which changes a network's architecture while preserving its functionality, can make NAS more efficient by enabling faster training during the search. However, network morphism based NAS remains computationally expensive because the process of selecting the proper morph operation for an existing architecture is inefficient. Bayesian optimization has been widely used to optimize functions from a limited number of observations, which motivates us to explore using it to accelerate the morph operation selection process. In this paper, we propose a novel framework that enables Bayesian optimization to guide network morphism for efficient neural architecture search, by introducing a neural network kernel and a tree-structured acquisition function optimization algorithm. With Bayesian optimization selecting the network morphism operations, the search space is explored more efficiently. Moreover, we have packaged our method into an open-source tool, Auto-Keras, so that people without a rich machine learning background can use it. Extensive experiments on real-world datasets demonstrate the superior performance of the developed framework over state-of-the-art baseline methods.
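The core loop the abstract describes (use Bayesian optimization over previously evaluated architectures to pick the next morph operation) can be sketched roughly as follows. This is an illustrative stand-in, not the paper's implementation: the edit-distance kernel over operation sequences, the UCB-style acquisition, and all function names here are simplifying assumptions; the paper's actual kernel is defined on network architectures, and its acquisition function is optimized with a tree-structured search.

```python
import math

# An "architecture" is abstracted as the tuple of morph operations applied
# so far. The surrogate model is a toy kernel regression over observed
# (architecture, accuracy) pairs.

def edit_distance(a, b):
    # Levenshtein distance between two operation sequences.
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def kernel(a, b, rho=1.0):
    # Similar architectures (small edit distance) get a kernel value near 1.
    return math.exp(-rho * edit_distance(a, b))

def ucb_score(candidate, observations, beta=2.0):
    # Kernel-weighted mean of observed accuracies, plus an exploration bonus
    # that grows when the candidate is far from everything seen so far.
    weights = [kernel(candidate, arch) for arch, _ in observations]
    total = sum(weights)
    if total == 0:
        return float("inf")  # completely unexplored: always worth trying
    mean = sum(w * acc for w, (_, acc) in zip(weights, observations)) / total
    uncertainty = 1.0 / (1.0 + total)
    return mean + beta * uncertainty

def select_morph(base_arch, morph_ops, observations):
    # Enumerate one-step morphs of the base architecture and pick the one
    # with the highest acquisition score.
    candidates = [base_arch + (op,) for op in morph_ops]
    return max(candidates, key=lambda c: ucb_score(c, observations))

# Example: two morphed networks have already been trained and evaluated.
observations = [(("wider",), 0.91), (("deeper",), 0.88)]
ops = ["wider", "deeper", "skip"]
best = select_morph(("wider",), ops, observations)
```

In the actual framework the selected morph is applied, the child network inherits weights and is trained briefly, and the new (architecture, accuracy) observation is fed back into the surrogate before the next selection.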

