PyHopper – Hyperparameter optimization

10/10/2022
by Mathias Lechner, et al.

Hyperparameter tuning is a fundamental aspect of machine learning research. Setting up the infrastructure for systematic hyperparameter optimization can take a significant amount of time. Here, we present PyHopper, a black-box optimization platform designed to streamline the hyperparameter tuning workflow of machine learning researchers. PyHopper's goal is to integrate into existing code with minimal effort and to run the optimization process with as little manual oversight as necessary. With simplicity as its primary theme, PyHopper is powered by a single robust Markov-chain Monte Carlo optimization algorithm that scales to millions of dimensions. Compared to existing tuning packages, this focus on a single algorithm frees users from having to choose among several algorithms and makes PyHopper easy to customize. PyHopper is publicly available under the Apache-2.0 license at https://github.com/PyHopper/PyHopper.
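To illustrate the minimal-effort integration the abstract describes, below is a small usage sketch modeled on the pattern in the PyHopper repository's README. The search space, the one-minute budget, and the toy objective are illustrative stand-ins for a real training-and-validation routine; consult the library's documentation for the authoritative API.

import pyhopper

def objective(params: dict) -> float:
    # Stand-in for the user's training code: build a model from the
    # sampled hyperparameters, train it, and return a validation score.
    # Here we simply score closeness to an arbitrary target so the
    # example runs without any ML dependencies.
    return -((params["lr"] - 1e-3) ** 2) - (params["dropout"] - 0.2) ** 2

search = pyhopper.Search(
    {
        "hidden_size": pyhopper.int(100, 500),
        "dropout": pyhopper.float(0, 0.4),
        "lr": pyhopper.float(1e-5, 1e-2, log=True),
    }
)

# Run the MCMC-based search for one minute, maximizing the returned score.
best_params = search.run(objective, "maximize", "1min")
print(best_params)

Because the optimizer only sees a dictionary of sampled parameters and a scalar return value, wrapping an existing training script typically amounts to defining the search space and replacing hard-coded hyperparameters with dictionary lookups.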

Related research

Sherpa: Robust Hyperparameter Optimization for Machine Learning (05/08/2020)
Sherpa is a hyperparameter optimization library for machine learning mod...

Optimizing Large-Scale Hyperparameters via Automated Learning Algorithm (02/17/2021)
Modern machine learning algorithms usually involve tuning multiple (from...

SigOpt Mulch: An Intelligent System for AutoML of Gradient Boosted Trees (07/10/2023)
Gradient boosted trees (GBTs) are ubiquitous models used by researchers,...

Auptimizer – an Extensible, Open-Source Framework for Hyperparameter Tuning (11/06/2019)
Tuning machine learning models at scale, especially finding the right hy...

A Scalable and Cloud-Native Hyperparameter Tuning System (06/03/2020)
In this paper, we introduce Katib: a scalable, cloud-native, and product...

DEEP-BO for Hyperparameter Optimization of Deep Networks (05/23/2019)
The performance of deep neural networks (DNN) is very sensitive to the p...

HMC with Normalizing Flows (12/02/2021)
We propose using Normalizing Flows as a trainable kernel within the mole...
