Far-HO: A Bilevel Programming Package for Hyperparameter Optimization and Meta-Learning

06/13/2018
by Luca Franceschi, et al.

In (Franceschi et al., 2018) we proposed a unified mathematical framework, grounded in bilevel programming, that encompasses gradient-based hyperparameter optimization (HO) and meta-learning (ML). We formulated an approximate version of the problem, in which the inner objective is solved iteratively, and gave sufficient conditions ensuring convergence to the exact problem. In this work we show how to optimize learning rates, automatically weight the losses of single examples, and learn hyper-representations with Far-HO, a software package based on the popular deep learning framework TensorFlow that makes it possible to seamlessly tackle both HO and ML problems.
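To make the approximate bilevel scheme concrete, the sketch below computes a reverse-mode hypergradient for the inner learning rate on a toy quadratic problem, in the spirit of the reverse hypergradient method underlying the package. This is a minimal NumPy illustration, not Far-HO code: the problem setup and all names (inner_grad, outer_grad, hypergrad_lr) are assumptions made for this example, while Far-HO builds analogous computations on the TensorFlow graph.

```python
# Minimal sketch of a reverse-mode hypergradient, assuming a toy quadratic
# inner problem. Not the Far-HO API; all names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d, T = 5, 50                              # parameter dimension, inner GD steps

A = np.diag(rng.uniform(0.5, 2.0, d))     # Hessian of the quadratic inner loss
b_tr = rng.normal(size=d)                 # "training" target
b_val = rng.normal(size=d)                # "validation" target

def inner_grad(w):
    # Gradient of the inner (training) objective L(w) = 0.5 (w-b_tr)^T A (w-b_tr).
    return A @ (w - b_tr)

def outer_grad(w):
    # Gradient of the outer (validation) objective E(w) = 0.5 ||w - b_val||^2.
    return w - b_val

def hypergrad_lr(eta, w0):
    """Reverse-mode hypergradient dE/d(eta) through T unrolled GD steps."""
    # Forward pass: run the inner dynamics w_{t+1} = w_t - eta * grad L(w_t),
    # storing every iterate for the backward sweep.
    ws = [w0]
    for _ in range(T):
        ws.append(ws[-1] - eta * inner_grad(ws[-1]))
    # Backward pass: alpha carries dE/dw_t. The Jacobians of one step are
    # dPhi/dw = I - eta * A   and   dPhi/d(eta) = -grad L(w_t).
    alpha = outer_grad(ws[-1])
    g = 0.0
    for t in reversed(range(T)):
        g += alpha @ (-inner_grad(ws[t]))     # accumulate dE/d(eta)
        alpha = alpha - eta * (A @ alpha)     # alpha <- (I - eta A) alpha
    return g, ws[-1]

# Hyperparameter optimization: gradient descent on the learning rate itself,
# projected onto an interval where the inner dynamics remain stable.
eta = 0.05
for _ in range(100):
    g, w_T = hypergrad_lr(eta, np.zeros(d))
    eta = float(np.clip(eta - 0.01 * g, 1e-3, 0.45))
print(f"learned inner learning rate: {eta:.4f}, "
      f"outer loss: {0.5 * np.sum((w_T - b_val) ** 2):.4f}")
```

Note that the backward sweep revisits every stored iterate, so memory grows linearly with the number of inner steps T; this cost is what motivates approximations such as truncating the backward pass.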



Related research

06/13/2018: Bilevel Programming for Hyperparameter Optimization and Meta-Learning
We introduce a framework based on bilevel programming that unifies gradi...

10/25/2018: Truncated Back-propagation for Bilevel Optimization
Bilevel optimization has been recently revisited for designing and analy...

12/18/2017: A Bridge Between Hyperparameter Optimization and Learning-to-learn
We consider a class of nested optimization problems involving inner an...

10/06/2021: Online Hyperparameter Meta-Learning with Hypergradient Distillation
Many gradient-based meta-learning methods assume a set of parameters tha...

07/31/2021: Bilevel Optimization for Machine Learning: Algorithm Design and Convergence Analysis
Bilevel optimization has become a powerful framework in various machine ...

10/12/2022: Evaluated CMI Bounds for Meta Learning: Tightness and Expressiveness
Recent work has established that the conditional mutual information (CMI...

12/13/2022: Multi-objective Tree-structured Parzen Estimator Meets Meta-learning
Hyperparameter optimization (HPO) is essential for the better performanc...
