Auptimizer – an Extensible, Open-Source Framework for Hyperparameter Tuning

11/06/2019
by Jiayi Liu, et al.

Tuning machine learning models at scale, especially finding the right hyperparameter values, can be difficult and time-consuming. In addition to the computational effort required, this process also demands ancillary effort, including engineering tasks (e.g., job scheduling) and more mundane tasks (e.g., keeping track of the various parameters and associated results). We present Auptimizer, a general Hyperparameter Optimization (HPO) framework to help data scientists speed up model tuning and bookkeeping. With Auptimizer, users can draw on all available computing resources in distributed settings for model training. The user-friendly system design simplifies creating, controlling, and tracking a typical machine learning project. The design also allows researchers to integrate new HPO algorithms. To demonstrate its flexibility, we show how Auptimizer integrates a few major HPO techniques (from random search to neural architecture search). The code is available at https://github.com/LGE-ARC-AdvancedAI/auptimizer.
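To make the workflow concrete, the sketch below implements the simplest of the HPO techniques mentioned above, random search, as a plain Python loop. It is illustrative only and does not use the Auptimizer API; the search space, trial budget, and synthetic objective are hypothetical stand-ins for a real training job. A framework like Auptimizer takes over the scheduling of these trials across available compute resources and the bookkeeping of configurations and results.

```python
# Illustrative sketch only (not the Auptimizer API): a bare-bones random-search
# loop showing the trial scheduling and bookkeeping that an HPO framework such
# as Auptimizer automates and distributes across available compute resources.
import math
import random

# Hypothetical search space for two common hyperparameters.
SEARCH_SPACE = {
    "learning_rate": (1e-4, 1e-1),    # continuous range, sampled log-uniformly
    "batch_size": [16, 32, 64, 128],  # discrete choices, sampled uniformly
}

def sample_config():
    """Draw one hyperparameter configuration from the search space."""
    lo, hi = SEARCH_SPACE["learning_rate"]
    return {
        "learning_rate": 10 ** random.uniform(math.log10(lo), math.log10(hi)),
        "batch_size": random.choice(SEARCH_SPACE["batch_size"]),
    }

def train_and_evaluate(config):
    """Stand-in for the user's training job; returns a validation score.

    In a real project this would launch training (possibly on a remote
    worker) and report the resulting metric back to the optimizer.
    """
    # Synthetic objective so the sketch runs end to end.
    return -abs(math.log10(config["learning_rate"]) + 2.0) - 0.001 * config["batch_size"]

results = []
for trial in range(20):                       # fixed trial budget
    config = sample_config()
    score = train_and_evaluate(config)
    results.append((score, config))           # bookkeeping: record every trial

best_score, best_config = max(results, key=lambda r: r[0])
print(f"best score {best_score:.4f} with config {best_config}")
```

More sophisticated searchers (e.g., Hyperband or neural architecture search, as mentioned in the abstract) replace the sampling and budgeting logic above, while the surrounding scheduling and result tracking remain the same; that shared scaffolding is what the framework is designed to provide.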

