Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery

07/07/2020
by Minsu Cho, et al.

In this paper, we study two important problems in the automated design of neural networks, Hyperparameter Optimization (HPO) and Neural Architecture Search (NAS), through the lens of sparse recovery methods. In the first part of this paper, we establish a novel connection between HPO and structured sparse recovery. In particular, we show that a special encoding of the hyperparameter space enables a natural group-sparse recovery formulation which, when coupled with HyperBand (a multi-armed bandit strategy), leads to improvements over existing hyperparameter optimization methods. Experimental results on image datasets such as CIFAR-10 confirm the benefits of our approach. In the second part of this paper, we establish a connection between NAS and structured sparse recovery. Building upon "one-shot" approaches in NAS, we propose a novel algorithm, called CoNAS, that merges ideas from one-shot approaches with techniques for learning low-degree sparse Boolean polynomials. We provide a theoretical analysis of the number of validation-error measurements required. Finally, we validate our approach on several datasets and discover previously unreported architectures, achieving competitive (or better) results in both performance and search time compared to existing NAS approaches.
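The group-sparse recovery idea in the first part can be pictured concretely. Below is a minimal sketch, not the authors' released code: the ISTA solver, all names, and the toy data are illustrative assumptions. Each hyperparameter's discrete choices are one-hot encoded so the coefficients belonging to one hyperparameter form a group; a group-lasso regression is then fit on (configuration, validation loss) pairs, and hyperparameters whose groups have near-zero norm can be pruned before running HyperBand on the reduced space.

```python
import numpy as np

def group_lasso_ista(X, y, groups, lam, n_iter=1000):
    """Proximal gradient (ISTA) for 0.5*||y - Xw||^2 + lam * sum_g ||w_g||_2."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant of grad
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = w - step * (X.T @ (X @ w - y))      # gradient step on the LS term
        for g in groups:                        # block soft-threshold each group
            norm_g = np.linalg.norm(z[g])
            z[g] *= max(0.0, 1.0 - step * lam / norm_g) if norm_g > 0 else 0.0
        w = z
    return w

# Toy problem: 3 hyperparameters, each one-hot encoded over 4 choices (d = 12);
# only the first hyperparameter actually influences the validation loss.
rng = np.random.default_rng(0)
groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
X = np.concatenate([np.eye(4)[rng.integers(0, 4, 200)] for _ in groups], axis=1)
w_true = np.zeros(12)
w_true[:4] = [1.0, -0.5, 0.2, 0.0]
y = X @ w_true + 0.01 * rng.standard_normal(200)
X, y = X - X.mean(0), y - y.mean()              # center out the intercept
w_hat = group_lasso_ista(X, y, groups, lam=20.0)
for i, g in enumerate(groups):                  # near-zero norm => prune the hp
    print(f"hyperparameter {i}: group norm = {np.linalg.norm(w_hat[g]):.3f}")
```

HyperBand would then allocate its budget only over the hyperparameters whose groups survive the thresholding, which is the search-space reduction described above.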
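The measurement step behind CoNAS can be illustrated in the same spirit. The sketch below is a simplified stand-in, not the paper's implementation: the synthetic objective, the sampling scheme, and all names are assumptions. It treats the validation error of a sampled sub-architecture as a Boolean function of its n on/off bits, expands it in the parity (Fourier) basis up to a small degree, and recovers the few dominant coefficients from a modest number of measurements with a Lasso solve, i.e., compressive sensing.

```python
from itertools import combinations
import numpy as np
from sklearn.linear_model import Lasso

n, degree, m = 10, 2, 120          # n architecture bits, max degree, m samples
rng = np.random.default_rng(0)

subsets = [()] + [s for d in range(1, degree + 1)
                  for s in combinations(range(n), d)]

def parity_features(A):
    """Fourier basis chi_S(a) = (-1)^{sum_{i in S} a_i} for every subset S."""
    return np.stack([(-1.0) ** A[:, list(S)].sum(1) if S else np.ones(len(A))
                     for S in subsets], axis=1)

# Synthetic sparse polynomial playing the role of measured validation error.
true_coefs = {(): 0.5, (1,): 0.3, (2, 7): -0.4}
A = rng.integers(0, 2, size=(m, n))          # random sub-architecture masks
Phi = parity_features(A)
y = sum(c * Phi[:, subsets.index(S)] for S, c in true_coefs.items())
y += 0.01 * rng.standard_normal(m)           # measurement noise

lasso = Lasso(alpha=0.01, fit_intercept=False).fit(Phi, y)
support = [(subsets[j], round(w, 2))
           for j, w in enumerate(lasso.coef_) if abs(w) > 0.05]
print(support)  # recovered monomials: which bit interactions matter
```

Although there are 1 + 10 + 45 = 56 parity features and only 120 measurements, the support is recoverable because just three coefficients are nonzero; this few-measurements regime is what the theoretical analysis mentioned in the abstract concerns.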


Related research

04/24/2019
Reducing The Search Space For Hyperparameter Optimization Using Group Sparsity
We propose a new algorithm for hyperparameter selection in machine learn...

06/07/2019
One-Shot Neural Architecture Search via Compressive Sensing
Neural architecture search (NAS), or automated design of neural network ...

12/11/2020
AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment
Neural architecture search (NAS) is an approach for automatically design...

01/01/2021
Neural Architecture Search via Combinatorial Multi-Armed Bandit
Neural Architecture Search (NAS) has gained significant popularity as an...

07/11/2020
An Asymptotically Optimal Multi-Armed Bandit Algorithm and Hyperparameter Optimization
The evaluation of hyperparameters, neural architectures, or data augment...

04/08/2021
A Design Space Study for LISTA and Beyond
In recent years, great success has been witnessed in building problem-sp...

07/01/2019
Single-Path Mobile AutoML: Efficient ConvNet Design and NAS Hyperparameter Optimization
Can we reduce the search cost of Neural Architecture Search (NAS) from d...
