SigOpt Mulch: An Intelligent System for AutoML of Gradient Boosted Trees

07/10/2023
by Aleksei Sorokin, et al.

Gradient boosted trees (GBTs) are ubiquitous models used by researchers, machine learning (ML) practitioners, and data scientists because of their robust performance, interpretable behavior, and ease of use. One critical challenge in training GBTs is the tuning of their hyperparameters. In practice, selecting these hyperparameters is often done manually. Recently, the ML community has advocated for tuning hyperparameters through black-box optimization and has developed state-of-the-art systems to do so. However, applying such systems to tune GBTs suffers from two drawbacks. First, these systems are not model-aware; rather, they are designed to apply to a generic model, which leaves significant optimization performance on the table. Second, using these systems requires domain knowledge, such as the choice of hyperparameter search space, which is antithetical to the automatic experimentation that black-box optimization aims to provide. In this paper, we present SigOpt Mulch, a model-aware hyperparameter tuning system specifically designed for automated tuning of GBTs that provides two improvements over existing systems. First, Mulch leverages powerful techniques in metalearning and multifidelity optimization to perform model-aware hyperparameter optimization. Second, it automates the process of learning performant hyperparameters by making intelligent decisions about the optimization search space, thus reducing the need for user domain knowledge. These innovations allow Mulch to identify good GBT hyperparameters far more efficiently, and in a more seamless and user-friendly way, than existing black-box hyperparameter tuning systems.
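To make concrete what "black-box hyperparameter tuning of a GBT" means in the abstract, the sketch below shows the generic baseline the paper contrasts against: random search over a hand-chosen search space, treating the validation loss as an opaque function of the hyperparameters. This is not SigOpt Mulch's API or algorithm; the search space bounds and the surrogate `validation_loss` function are illustrative assumptions (a real run would train and evaluate a GBT at each trial).

```python
import random

# Hypothetical stand-in for the expensive black-box objective: in practice
# this would train a GBT with the given hyperparameters and return its
# validation loss. Here we use a cheap synthetic function for illustration,
# minimized near learning_rate=0.1, max_depth=6, n_estimators=300.
def validation_loss(learning_rate, max_depth, n_estimators):
    return ((learning_rate - 0.1) ** 2
            + 0.01 * (max_depth - 6) ** 2
            + 1e-6 * (n_estimators - 300) ** 2)

def random_search(n_trials, seed=0):
    """Model-agnostic random search: the user must supply the search space,
    which is exactly the domain knowledge Mulch aims to automate away."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            # Log-uniform sample over [1e-3, 1], a common choice for rates.
            "learning_rate": 10 ** rng.uniform(-3, 0),
            "max_depth": rng.randint(2, 12),
            "n_estimators": rng.randrange(50, 1001, 50),
        }
        loss = validation_loss(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best_params, best_loss = random_search(200)
print(best_params, best_loss)
```

Note that every trial here costs a full model training, and the search has no notion of the model being tuned; the abstract's two claimed improvements, model-aware (metalearned, multifidelity) optimization and automatic search-space selection, target exactly these weaknesses.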

