Dynamic Surrogate Switching: Sample-Efficient Search for Factorization Machine Configurations in Online Recommendations

09/29/2022
by   Blaž Škrlj, et al.

Hyperparameter optimization is the process of identifying an appropriate hyperparameter configuration for a given machine learning model and learning task. For smaller data sets, an exhaustive search is possible; however, as data size and model complexity grow, the number of configuration evaluations becomes the main computational bottleneck. A promising paradigm for tackling this type of problem is surrogate-based optimization. Its main idea is to maintain an incrementally updated model of the relation between the hyperparameter space and the output (target) space; the data for this model are obtained by evaluating the main learning engine, which is, for example, a factorization machine-based model. By learning to approximate the hyperparameter-target relation, the surrogate (machine learning) model can cheaply score large numbers of hyperparameter configurations, exploring parts of the configuration space beyond the reach of direct learning-engine evaluation. Commonly, a surrogate is selected before optimization begins and remains fixed during the search. We investigated whether dynamically switching surrogates during the optimization itself is a sensible idea of practical relevance for selecting the most appropriate factorization machine-based models for large-scale online recommendation. We conducted benchmarks on data sets containing hundreds of millions of instances against established baselines such as Random Forest- and Gaussian process-based surrogates. The results indicate that surrogate switching can offer good performance while requiring fewer learning-engine evaluations.
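The loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: `engine_score` is a hypothetical stand-in for the expensive learning-engine evaluation (e.g. training a factorization machine and measuring a validation metric), the two-dimensional configuration space and the cross-validation-based switching rule are assumptions for the sketch, and scikit-learn's `RandomForestRegressor` and `GaussianProcessRegressor` play the role of the candidate surrogates.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def engine_score(config):
    # Hypothetical stand-in for the expensive learning-engine evaluation;
    # higher is better, with an optimum at (0.3, 0.7).
    lr, reg = config
    return -(lr - 0.3) ** 2 - (reg - 0.7) ** 2

def sample_configs(n):
    # Draw n random configurations from an assumed 2-D unit-box space.
    return rng.uniform(0.0, 1.0, size=(n, 2))

# Seed the surrogate's training data with a few direct engine evaluations.
X = sample_configs(8)
y = np.array([engine_score(c) for c in X])

surrogates = {
    "rf": lambda: RandomForestRegressor(n_estimators=50, random_state=0),
    "gp": lambda: GaussianProcessRegressor(),
}

for step in range(10):
    # Dynamic switching: at each step, pick the surrogate that currently
    # explains the observed configuration-score data best (cross-validated R^2).
    best_name = max(
        surrogates,
        key=lambda name: cross_val_score(surrogates[name](), X, y, cv=3).mean(),
    )
    model = surrogates[best_name]().fit(X, y)

    # Score a large pool of candidate configurations cheaply with the
    # surrogate; only the single most promising one is evaluated for real.
    pool = sample_configs(1000)
    best = pool[np.argmax(model.predict(pool))]
    X = np.vstack([X, best])
    y = np.append(y, engine_score(best))

print("best config:", X[np.argmax(y)], "score:", y.max())
```

The key asymmetry the sketch shows is that the surrogate prediction over 1000 candidates is cheap, while each `engine_score` call models the costly evaluation; only 18 such calls are made in total.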


