Reducing The Search Space For Hyperparameter Optimization Using Group Sparsity

04/24/2019
by Minsu Cho, et al.

We propose a new algorithm for hyperparameter selection in machine learning algorithms. The algorithm is a novel modification of Harmonica, a spectral hyperparameter selection approach based on sparse recovery. In particular, we show that a special encoding of the hyperparameter space enables a natural group-sparse recovery formulation, which, when coupled with HyperBand (a multi-armed bandit strategy), yields improvements over existing hyperparameter optimization methods such as Successive Halving and Random Search. Experimental results on image datasets such as CIFAR-10 confirm the benefits of our approach.
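To make the group-sparse ingredient concrete, here is a minimal, self-contained sketch (not the authors' code): each categorical hyperparameter is one-hot encoded into its own column group, a group lasso is fit to (configuration, validation loss) pairs by proximal gradient descent, and only hyperparameters whose groups survive are kept for a subsequent HyperBand run over the reduced space. The function name `fit_group_lasso`, the toy data, and the regularization strength `lam` are illustrative assumptions; Harmonica proper works in a Fourier basis over Boolean variables rather than the raw one-hot features used here.

```python
import numpy as np

def fit_group_lasso(X, y, groups, lam=0.05, n_iters=1000):
    """Proximal gradient (ISTA) for the group-lasso objective
        (1/2n) * ||Xw - y||^2 + lam * sum_g ||w_g||_2,
    where each g in `groups` indexes the columns encoding one hyperparameter."""
    n, d = X.shape
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the smooth part
    w = np.zeros(d)
    for _ in range(n_iters):
        w -= step * (X.T @ (X @ w - y)) / n  # gradient step on the least-squares term
        for g in groups:  # block soft-thresholding: whole groups shrink exactly to zero
            norm = np.linalg.norm(w[g])
            w[g] = 0.0 if norm <= step * lam else (1.0 - step * lam / norm) * w[g]
    return w

# Toy search space: 8 categorical hyperparameters with 4 levels each,
# one-hot encoded so each hyperparameter owns a contiguous group of 4 columns.
rng = np.random.default_rng(0)
n_configs, n_hparams, levels = 60, 8, 4
choices = rng.integers(0, levels, size=(n_configs, n_hparams))
X = np.zeros((n_configs, n_hparams * levels))
X[np.arange(n_configs)[:, None], np.arange(n_hparams) * levels + choices] = 1.0
# Synthetic validation loss that truly depends on hyperparameters 0 and 3 only.
y = 0.8 * choices[:, 0] - 0.5 * choices[:, 3] + 0.05 * rng.standard_normal(n_configs)
X, y = X - X.mean(axis=0), y - y.mean()  # center so no intercept term is needed

groups = [np.arange(j * levels, (j + 1) * levels) for j in range(n_hparams)]
w = fit_group_lasso(X, y, groups)
kept = [j for j, g in enumerate(groups) if np.linalg.norm(w[g]) > 0]
print("hyperparameters kept for the reduced search space:", kept)
```

With this synthetic data the surviving groups should concentrate on hyperparameters 0 and 3; in the paper's pipeline, a bandit-based method such as HyperBand would then be run only over the hyperparameters that survive, which is the search-space reduction the title refers to.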

Related research

07/07/2020 - Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery
In this paper, we study two important problems in the automated design o...

04/20/2023 - PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces
The recent rise in popularity of Hyperparameter Optimization (HPO) for d...

07/11/2020 - An Asymptotically Optimal Multi-Armed Bandit Algorithm and Hyperparameter Optimization
The evaluation of hyperparameters, neural architectures, or data augment...

12/16/2017 - NSML: A Machine Learning Platform That Enables You to Focus on Your Models
Machine learning libraries such as TensorFlow and PyTorch simplify model...

07/27/2020 - Stabilizing Bi-Level Hyperparameter Optimization using Moreau-Yosida Regularization
This research proposes to use the Moreau-Yosida envelope to stabilize th...

09/16/2019 - A Tsetlin Machine with Multigranular Clauses
The recently introduced Tsetlin Machine (TM) has provided competitive pa...

10/25/2020 - Hyperparameter Transfer Across Developer Adjustments
After developer adjustments to a machine learning (ML) algorithm, how ca...
