HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search

07/03/2019
by Dounia Lakhmiri, et al.

The performance of a deep neural network is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process that is often described as a "dark art", which motivates automating the calibration of these hyperparameters. Derivative-free optimization is a field that develops methods designed to optimize time-consuming functions without relying on derivatives. This work introduces the HyperNOMAD package, an extension of the NOMAD software that applies the MADS algorithm [7] to simultaneously tune the hyperparameters responsible for both the architecture and the learning process of a deep neural network (DNN). By taking advantage of categorical variables, HyperNOMAD allows for considerable flexibility in the exploration of the search space. This new approach is tested on the MNIST and CIFAR-10 data sets and achieves results comparable to the current state of the art.
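The key idea in the abstract is the blackbox view of hyperparameter tuning: a derivative-free optimizer such as MADS only needs point-to-value evaluations of "train the network with these hyperparameters, return the validation error", and categorical variables (e.g. the optimizer type) can be part of that point. The sketch below illustrates this view with an entirely hypothetical hyperparameter space and a synthetic loss; it does not use HyperNOMAD's actual API, and random search stands in for MADS purely to show the evaluation interface.

```python
import random

# Hypothetical search space: names and ranges are illustrative,
# not HyperNOMAD's actual configuration format.
OPTIMIZERS = ["sgd", "adam", "rmsprop"]  # a categorical variable

def blackbox(log_lr, n_layers, optimizer):
    """Stand-in for 'train a DNN, return validation error'.

    A real blackbox would launch a full training run and report the
    final validation loss; a synthetic function keeps the sketch
    runnable. Lower is better."""
    penalty = {"sgd": 0.3, "adam": 0.0, "rmsprop": 0.1}[optimizer]
    return (log_lr + 3.0) ** 2 + 0.05 * (n_layers - 4) ** 2 + penalty

def search(budget=200, seed=0):
    """Random search as a placeholder for MADS: any derivative-free
    method only consumes (point, value) pairs from the blackbox."""
    rng = random.Random(seed)
    best = None
    for _ in range(budget):
        point = (
            rng.uniform(-6.0, 0.0),   # log10 of the learning rate
            rng.randint(1, 10),       # number of layers (integer variable)
            rng.choice(OPTIMIZERS),   # categorical choice
        )
        value = blackbox(*point)
        if best is None or value < best[1]:
            best = (point, value)
    return best
```

Because the optimizer never inspects the inside of `blackbox`, swapping random search for MADS (or any other derivative-free method) changes nothing in how the tuning problem is posed; this is what lets HyperNOMAD treat architecture and training hyperparameters uniformly.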


Related research

- 04/25/2016: CMA-ES for Hyperparameter Optimization of Deep Neural Networks ("Hyperparameters of deep neural networks are often optimized by grid sear...")
- 03/14/2021: Use of static surrogates in hyperparameter optimization ("Optimizing the hyperparameters and architecture of a neural network is a...")
- 01/17/2023: Scaling Deep Networks with the Mesh Adaptive Direct Search algorithm ("Deep neural networks are getting larger. Their implementation on edge an...")
- 01/23/2020: Chameleon: Adaptive Code Optimization for Expedited Deep Neural Network Compilation ("Achieving faster execution with shorter compilation time can foster furt...")
- 11/05/2018: Deep Genetic Network ("Optimizing a neural network's performance is a tedious and time taking p...")
- 09/06/2023: Split-Boost Neural Networks ("The calibration and training of a neural network is a complex and time-c...")
- 03/27/2020: Deep-n-Cheap: An Automated Search Framework for Low Complexity Deep Learning ("We present Deep-n-Cheap – an open-source AutoML framework to search for ...")
