Deep Neural Network Hyperparameter Optimization with Orthogonal Array Tuning

07/31/2019
by Xiang Zhang, et al.

Deep learning algorithms have recently achieved excellent performance in a wide range of fields (e.g., computer vision). However, a severe challenge faced by deep learning is its high dependency on hyper-parameters: results may fluctuate dramatically under different hyper-parameter configurations. To address this issue, this paper presents an efficient Orthogonal Array Tuning Method (OATM) for deep learning hyper-parameter tuning. We describe the OATM approach in five detailed steps and elaborate on it using two widely used deep neural network structures (Recurrent Neural Networks and Convolutional Neural Networks). The proposed method is compared to state-of-the-art hyper-parameter tuning methods, both manual (e.g., grid search and random search) and automatic (e.g., Bayesian Optimization). The experimental results show that OATM significantly reduces tuning time compared to the state-of-the-art methods while preserving satisfactory performance.
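The core idea of orthogonal array tuning can be sketched in a few lines of Python. The sketch below uses the standard L9(3^4) Taguchi array, which covers four factors at three levels each in just 9 trials (instead of 3^4 = 81 for a full grid), and then selects each factor's best level by range analysis (averaging scores per level). The hyper-parameter names, their candidate values, and the `tune`/`evaluate` interface are illustrative assumptions, not the paper's exact implementation.

```python
from collections import defaultdict

# Standard L9(3^4) orthogonal array: 9 trials, 4 factors at 3 levels each.
# Every level of every factor appears 3 times, and every pair of factor
# levels appears together exactly once.
L9 = [
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 2, 2, 2],
    [1, 0, 1, 2],
    [1, 1, 2, 0],
    [1, 2, 0, 1],
    [2, 0, 2, 1],
    [2, 1, 0, 2],
    [2, 2, 1, 0],
]

# Hypothetical hyper-parameter levels (names and values are illustrative).
factors = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "batch_size": [32, 64, 128],
    "hidden_units": [64, 128, 256],
    "dropout": [0.1, 0.3, 0.5],
}

def tune(evaluate):
    """Run the 9 orthogonal trials, then pick each factor's best level
    by range analysis: for each factor, average the scores obtained at
    each of its levels and keep the level with the highest mean."""
    names = list(factors)
    trials = []
    for row in L9:
        config = {name: factors[name][lvl] for name, lvl in zip(names, row)}
        trials.append((row, evaluate(config)))  # evaluate returns a score to maximize
    best = {}
    for j, name in enumerate(names):
        level_scores = defaultdict(list)
        for row, score in trials:
            level_scores[row[j]].append(score)
        best_level = max(level_scores,
                         key=lambda lvl: sum(level_scores[lvl]) / len(level_scores[lvl]))
        best[name] = factors[name][best_level]
    return best
```

In practice `evaluate` would train the network with the given configuration and return a validation metric; because the array is balanced, each factor's effect can be estimated from only 9 runs rather than an exhaustive grid.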


