Learning to Rank Learning Curves

06/05/2020
by Martin Wistuba, et al.

Many automated machine learning methods, such as those for hyperparameter and neural architecture optimization, are computationally expensive because they involve training many different model configurations. In this work, we present a new method that saves computational budget by terminating poor configurations early in training. In contrast to existing methods, we treat this task as a ranking and transfer learning problem. We qualitatively show that by optimizing a pairwise ranking loss and leveraging learning curves from other datasets, our model is able to effectively rank learning curves without having to observe many or very long learning curves. We further demonstrate that our method can be used to accelerate a neural architecture search by a factor of up to 100 without significant performance degradation of the discovered architecture. In further experiments, we analyze the quality of the ranking, the influence of different model components, and the predictive behavior of the model.
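For intuition, below is a minimal PyTorch sketch of a pairwise (RankNet-style) ranking loss over partial learning curves, the general idea the abstract describes. The LSTM encoder, its dimensions, and all helper names are illustrative assumptions for this sketch, not the paper's actual model or transfer-learning setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CurveRanker(nn.Module):
    """Scores a partial learning curve; a higher score means a better
    expected final performance. The LSTM encoder is an illustrative
    stand-in, not the architecture used in the paper."""

    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.encoder = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, curves: torch.Tensor) -> torch.Tensor:
        # curves: (batch, observed_epochs, 1) validation-accuracy values
        _, (h_n, _) = self.encoder(curves)
        return self.head(h_n[-1]).squeeze(-1)  # (batch,) scores


def pairwise_ranking_loss(model, curves_a, curves_b, a_beats_b):
    """RankNet-style loss: model P(a ends better than b) = sigmoid(s_a - s_b)."""
    logits = model(curves_a) - model(curves_b)
    return F.binary_cross_entropy_with_logits(logits, a_beats_b)


# Toy usage: pairs of partial curves with 10 observed epochs each.
model = CurveRanker()
curves_a = torch.rand(4, 10, 1)
curves_b = torch.rand(4, 10, 1)
a_beats_b = torch.tensor([1.0, 0.0, 1.0, 0.0])  # 1 if curve a reached the better final value
loss = pairwise_ranking_loss(model, curves_a, curves_b, a_beats_b)
loss.backward()
```

At search time, a scorer like this would rank all partially trained configurations, and those ranked near the bottom could be terminated early to save budget.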


Related research

03/08/2019
Inductive Transfer for Neural Architecture Optimization
The recent advent of automated neural network architecture search led to...

10/07/2021
Conceptual Expansion Neural Architecture Search (CENAS)
Architecture search optimizes the structure of a neural network for some...

01/25/2023
Learning to Rank Normalized Entropy Curves with Differentiable Window Transformation
Recent automated machine learning systems often use learning curves rank...

03/27/2023
Deep Ranking Ensembles for Hyperparameter Optimization
Automatically optimizing the hyperparameters of Machine Learning algorit...

05/30/2017
Accelerating Neural Architecture Search using Performance Prediction
Methods for neural network hyperparameter optimization and meta-modeling...

09/23/2022
NasHD: Efficient ViT Architecture Performance Ranking using Hyperdimensional Computing
Neural Architecture Search (NAS) is an automated architecture engineerin...

03/14/2021
Use of static surrogates in hyperparameter optimization
Optimizing the hyperparameters and architecture of a neural network is a...
