Bayesian Hyperparameter Optimization with BoTorch, GPyTorch and Ax

12/11/2019
by Daniel T Chang, et al.

Deep learning models are full of hyperparameters, which must be set manually before the learning process can start. Finding the best configuration for these hyperparameters in such a high-dimensional space, where model training and validation are time-consuming and expensive, is a non-trivial challenge. Bayesian optimization is a powerful tool for the joint optimization of hyperparameters, efficiently trading off exploration and exploitation of the hyperparameter space. In this paper, we discuss Bayesian hyperparameter optimization, including hyperparameter optimization, Bayesian optimization, and Gaussian processes. We also review BoTorch, GPyTorch, and Ax, the new open-source frameworks that we use for Bayesian optimization, Gaussian process inference, and adaptive experimentation, respectively. For experimentation, we apply Bayesian hyperparameter optimization to optimize the group weights in weighted group pooling, which couples unsupervised tiered graph autoencoder learning with supervised graph classification learning for molecular graphs. We find that Ax, BoTorch, and GPyTorch together provide a simple-to-use yet powerful framework for Bayesian hyperparameter optimization: Ax's high-level API constructs and runs a full optimization loop and returns the best hyperparameter configuration.
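The optimization loop that Ax runs at a high level follows the classic Bayesian optimization pattern: fit a Gaussian process surrogate to the evaluations so far, maximize an acquisition function to pick the next hyperparameter configuration, evaluate it, and repeat. As a library-free illustration (not the paper's code, and not the Ax/BoTorch API), here is a minimal NumPy sketch of that loop for a single toy hyperparameter, using an RBF-kernel GP and the expected-improvement acquisition; the objective, kernel length scale, and grid resolution are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def objective(x):
    # Toy validation-score surrogate for one hyperparameter in [0, 1];
    # its maximum is at x = 0.3 (illustrative stand-in for model training).
    return -(x - 0.3) ** 2

def rbf(a, b, length_scale=0.2):
    # Squared-exponential (RBF) kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X, y, X_query, jitter=1e-6):
    # Exact GP regression posterior mean and std at the query points.
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, X_query)
    Kss = rbf(X_query, X_query)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y
    var = np.diag(Kss - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for maximization: E[max(f - best, 0)] under the GP posterior.
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(n_init=3, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, n_init)      # random initial design
    y = objective(X)
    grid = np.linspace(0.0, 1.0, 200)      # candidate configurations
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        ei = expected_improvement(mu, sigma, y.max())
        x_next = grid[np.argmax(ei)]       # exploit/explore trade-off
        X = np.append(X, x_next)
        y = np.append(y, objective(x_next))
    return X[np.argmax(y)], y.max()

best_x, best_y = bayes_opt()
```

Maximizing expected improvement is what gives the loop its exploration/exploitation trade-off noted in the abstract: EI is large both where the posterior mean is high (exploitation) and where the posterior uncertainty is high (exploration). Ax, backed by BoTorch acquisition optimization and GPyTorch GP inference, automates exactly this kind of loop over realistic, higher-dimensional search spaces.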


