
Speeding up the Hyperparameter Optimization of Deep Convolutional Neural Networks

by   Tobias Hinz, et al.

Most learning algorithms require the practitioner to manually set the values of many hyperparameters before the learning process can begin. However, with modern algorithms, evaluating a single hyperparameter setting can take a considerable amount of time, and the search space is often very high-dimensional. We suggest using a lower-dimensional representation of the original data to quickly identify promising areas in the hyperparameter space. This information can then be used to initialize the optimization algorithm for the original, higher-dimensional data. We compare this approach with the standard procedure of optimizing the hyperparameters only on the original input. We perform experiments with several state-of-the-art hyperparameter optimization algorithms: random search, the Tree of Parzen Estimators (TPE), Sequential Model-based Algorithm Configuration (SMAC), and a genetic algorithm (GA). Our experiments indicate that the optimization process can be sped up by using lower-dimensional data representations at the beginning and increasing the dimensionality of the input later in the optimization process. This is independent of the underlying optimization procedure, making the approach promising for many existing hyperparameter optimization algorithms.
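The two-stage idea can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: a toy loss surface stands in for CNN training, where evaluations on low-dimensional inputs are assumed to be cheap but noisy, and the best configuration from the cheap stage seeds a short search on the full-dimensional inputs. All function names, hyperparameter ranges, and the loss surface itself are hypothetical.

```python
import math
import random

def evaluate(lr, dropout, input_dim):
    """Toy stand-in for training a CNN and returning its validation loss.

    The surface has its optimum near lr=1e-2, dropout=0.3; a smaller
    input_dim models a downsampled dataset: cheaper to evaluate, but the
    loss estimate is noisier.
    """
    noise = random.gauss(0.0, 1.0 / input_dim)
    return (math.log10(lr) + 2) ** 2 + (dropout - 0.3) ** 2 + noise

def random_config():
    # Sample a hyperparameter setting: log-uniform learning rate, uniform dropout.
    return 10 ** random.uniform(-5, 0), random.uniform(0.0, 0.9)

def random_search(n_trials, input_dim, seed_config=None):
    """Random search that can be seeded with a promising configuration."""
    configs = [seed_config] if seed_config is not None else []
    configs += [random_config() for _ in range(n_trials - len(configs))]
    best_cfg, best_loss = None, float("inf")
    for lr, dropout in configs:
        loss = evaluate(lr, dropout, input_dim)
        if loss < best_loss:
            best_cfg, best_loss = (lr, dropout), loss
    return best_cfg, best_loss

random.seed(0)
# Stage 1: many cheap evaluations on low-dimensional (downsampled) inputs.
coarse_best, _ = random_search(n_trials=50, input_dim=8)
# Stage 2: few expensive evaluations on full-dimensional inputs,
# initialized with the best configuration found in stage 1.
final_best, final_loss = random_search(n_trials=10, input_dim=64,
                                       seed_config=coarse_best)
print(final_best, final_loss)
```

The same seeding step applies to the model-based optimizers the paper evaluates (TPE, SMAC, GA): the configurations and losses gathered on the low-dimensional data initialize the optimizer's model or population before switching to the full-resolution input.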

