Optimizing Large-Scale Hyperparameters via Automated Learning Algorithm

02/17/2021
by Bin Gu, et al.

Modern machine learning algorithms usually involve tuning multiple (from one to thousands of) hyperparameters, which play a pivotal role in model generalizability. Black-box optimization and gradient-based algorithms are the two dominant approaches to hyperparameter optimization, yet they have quite distinct advantages. How to design a hyperparameter optimization technique that inherits the benefits of both approaches remains an open problem. To address this challenging problem, in this paper we propose a new hyperparameter optimization method with zeroth-order hyper-gradients (HOZOG). Specifically, we first formulate hyperparameter optimization exactly as an A-based constrained optimization problem, where A is a black-box optimization algorithm (such as the one used to train a deep neural network). Then, we use averaged zeroth-order hyper-gradients to update the hyperparameters. We provide a feasibility analysis of using HOZOG for hyperparameter optimization. Finally, experimental results on three representative hyperparameter optimization tasks (with hyperparameter dimensionality ranging from 1 to 1,250) demonstrate the benefits of HOZOG in terms of simplicity, scalability, flexibility, effectiveness, and efficiency compared with state-of-the-art hyperparameter optimization methods.
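The core idea is to treat the whole training run as a black box and estimate the hyper-gradient by finite differences along random directions, averaged over several samples. The sketch below illustrates a generic averaged two-point zeroth-order estimator and a plain gradient-descent loop on the hyperparameters; the function names (`zeroth_order_hypergradient`, `tune_hyperparameters`) and the particular estimator are illustrative assumptions, not the exact HOZOG algorithm from the paper.

```python
import numpy as np

def zeroth_order_hypergradient(f, lam, mu=1e-2, num_samples=10, rng=None):
    """Average two-point finite differences along random unit directions.

    f   : maps a hyperparameter vector to the validation loss obtained after
          running the black-box training algorithm A to completion.
    lam : current hyperparameter vector.
    mu  : smoothing radius of the finite differences.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = lam.shape[0]
    grad = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)                       # random unit direction
        grad += (f(lam + mu * u) - f(lam - mu * u)) / (2.0 * mu) * u
    return d * grad / num_samples                    # averaged zeroth-order estimate

def tune_hyperparameters(f, lam0, lr=0.1, steps=50):
    """Plain gradient descent on the hyperparameters using the estimate above
    (a stand-in for the paper's update rule, not a verbatim reimplementation)."""
    lam = np.array(lam0, dtype=float)
    for _ in range(steps):
        lam -= lr * zeroth_order_hypergradient(f, lam)
    return lam
```

Because each evaluation of f requires a full training run, the number of perturbation samples per update is the main cost knob; averaging more directions reduces the variance of the hyper-gradient estimate at the price of more training runs.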

