Use of static surrogates in hyperparameter optimization

03/14/2021
by Dounia Lakhmiri, et al.

Optimizing the hyperparameters and architecture of a neural network is a lengthy but necessary phase in the development of any new application. This time-consuming process can benefit from strategies designed to quickly discard low-quality configurations and focus on more promising candidates. This work enhances HyperNOMAD, a library that adapts a direct-search derivative-free optimization algorithm to tune both the architecture and the training of a neural network simultaneously, by targeting two key steps of its execution: cheap approximations, in the form of static surrogates, are exploited to trigger the early stopping of the evaluation of a configuration and to rank pools of candidates. These additions are shown to reduce HyperNOMAD's resource consumption without harming the quality of the proposed solutions.
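To make these two mechanisms concrete, the following minimal sketch shows how a cheap static surrogate can rank a pool of poll candidates and discard unpromising ones before paying for a full evaluation inside a toy direct-search loop. This is not HyperNOMAD's code or API: all names (full_evaluation, surrogate, poll, search) are hypothetical, and a noisy quadratic stands in for training a neural network.

"""Minimal sketch (not HyperNOMAD's API) of surrogate-assisted direct search:
ranking a pool of candidates and discarding unpromising ones before a full
evaluation. A noisy quadratic stands in for training a neural network."""
import random

def full_evaluation(x):
    # Expensive oracle: in practice, fully train and validate the network
    # defined by configuration x.
    return (x - 0.3) ** 2

def surrogate(x):
    # Static surrogate: a cheap, biased approximation of the objective,
    # e.g. training for a few epochs on a subsample of the data.
    return (x - 0.3) ** 2 + 0.05 * random.gauss(0.0, 1.0)

def poll(incumbent, step):
    # Direct-search poll: trial points around the current best point.
    return [incumbent + step, incumbent - step]

def search(x0, step=0.5, iters=20, seed=0):
    random.seed(seed)
    best_x, best_f = x0, full_evaluation(x0)
    n_full = 1  # count of expensive evaluations
    for _ in range(iters):
        # (1) Rank the poll candidates by surrogate value so the most
        # promising one is tried first (opportunistic polling).
        scored = sorted((surrogate(x), x) for x in poll(best_x, step))
        improved = False
        for s, x in scored:
            # (2) Early discard: skip the expensive evaluation when the
            # surrogate already predicts no improvement over the incumbent.
            if s >= best_f:
                continue
            f = full_evaluation(x)
            n_full += 1
            if f < best_f:
                best_x, best_f, improved = x, f, True
                break  # opportunistic: stop polling after a success
        step = step * 2 if improved else step / 2  # mesh update
    return best_x, best_f, n_full

if __name__ == "__main__":
    x, f, n = search(x0=2.0)
    print(f"best x = {x:.4f}, f(x) = {f:.5f}, full evaluations = {n}")

In this sketch the surrogate is queried once per candidate and its score is reused for both the ranking and the discard test; in practice the discard threshold would be calibrated against the surrogate's bias rather than compared directly to the incumbent's true objective value.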

Related research

07/03/2019 · HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search
The performance of deep neural networks is highly sensitive to the choic...

07/19/2018 · Speeding up the Hyperparameter Optimization of Deep Convolutional Neural Networks
Most learning algorithms require the practitioner to manually set the va...

02/15/2021 · Online hyperparameter optimization by real-time recurrent learning
Conventional hyperparameter optimization methods are computationally int...

05/30/2017 · Accelerating Neural Architecture Search using Performance Prediction
Methods for neural network hyperparameter optimization and meta-modeling...

05/13/2019 · Tabular Benchmarks for Joint Architecture and Hyperparameter Optimization
Due to the high computational demands executing a rigorous comparison be...

06/05/2020 · Learning to Rank Learning Curves
Many automated machine learning methods, such as those for hyperparamete...