Stabilizing Bi-Level Hyperparameter Optimization using Moreau-Yosida Regularization

07/27/2020
by Sauptik Dhar et al.

This research proposes to use the Moreau-Yosida envelope to stabilize the convergence behavior of bi-level hyperparameter optimization (HPO) solvers, and introduces a new algorithm, Moreau-Yosida regularized Hyperparameter Optimization (MY-HPO). A theoretical analysis of the correctness of the MY-HPO solution and an initial convergence analysis are also provided. Our empirical results show significant improvement in loss values for a fixed computation budget, compared to state-of-the-art bi-level HPO solvers.
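For context, the Moreau-Yosida envelope of a function f with parameter lambda > 0 is M_lambda f(x) = min_y { f(y) + ||y - x||^2 / (2*lambda) }; for convex f it is a smooth approximation that preserves the minimizers of f, which is what makes it a natural stabilizer for the alternating updates used by gradient-based bi-level solvers. The sketch below is an illustrative toy, not the authors' MY-HPO algorithm: it applies proximal (Moreau-Yosida style) smoothing to the lower-level solve of a ridge-regression HPO problem, and the toy data, variable names, and step sizes are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch only (NOT the paper's MY-HPO algorithm): proximal /
# Moreau-Yosida smoothing of the lower-level solve inside an alternating
# bi-level hyperparameter-optimization loop. The toy ridge-regression
# problem, variable names, and step sizes below are assumptions.

rng = np.random.default_rng(0)
X_tr, y_tr = rng.normal(size=(40, 5)), rng.normal(size=40)
X_val, y_val = rng.normal(size=(20, 5)), rng.normal(size=20)

def val_loss(w):
    """Upper-level objective: validation loss of the fitted weights."""
    return 0.5 * np.mean((X_val @ w - y_val) ** 2)

w = np.zeros(5)      # lower-level variable (model weights)
lam = 1.0            # hyperparameter being tuned (ridge penalty)
rho = 10.0           # proximal weight; rho = 1/lambda_MY in the envelope
eta = 1e-2           # step size for the hyperparameter update
eps = 1e-4           # finite-difference perturbation for d(val_loss)/d(lam)

for t in range(200):
    # Lower-level step: evaluate the proximal map of the training objective
    #   min_v 0.5*||X_tr v - y_tr||^2 + 0.5*lam*||v||^2 + 0.5*rho*||v - w||^2,
    # which for ridge regression has the closed form solved below.
    b = X_tr.T @ y_tr + rho * w
    w_new = np.linalg.solve(X_tr.T @ X_tr + (lam + rho) * np.eye(5), b)

    # Upper-level step: crude finite-difference descent on the validation
    # loss with respect to the hyperparameter, using the same prox center w.
    w_pert = np.linalg.solve(X_tr.T @ X_tr + (lam + eps + rho) * np.eye(5), b)
    grad_lam = (val_loss(w_pert) - val_loss(w_new)) / eps
    lam = max(lam - eta * grad_lam, 0.0)
    w = w_new

print(f"lam = {lam:.4f}, validation loss = {val_loss(w):.4f}")
```

The proximal term 0.5*rho*||v - w||^2 limits how far the inner solution can move between outer updates, which is the stabilizing effect the abstract refers to; the paper's actual bi-level formulation and convergence guarantees are not reproduced here.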

Related research

- Gradient-based Bi-level Optimization for Deep Learning: A Survey (07/24/2022)
- ExperienceThinking: Hyperparameter Optimization with Budget Constraints (12/02/2019)
- Reducing The Search Space For Hyperparameter Optimization Using Group Sparsity (04/24/2019)
- Weighted Random Search for Hyperparameter Optimization (04/03/2020)
- A Bi-level Nonlinear Eigenvector Algorithm for Wasserstein Discriminant Analysis (11/21/2022)
- Iterative Deepening Hyperband (02/01/2023)
- BiLO-CPDP: Bi-Level Programming for Automated Model Discovery in Cross-Project Defect Prediction (08/31/2020)
