META-SMGO-Δ: similarity as a prior in black-box optimization

04/30/2023
by Riccardo Busetto, et al.

When solving global optimization problems in practice, one often ends up repeatedly solving problems that are similar to each other. By providing a rigorous definition of similarity, in this work we propose to incorporate the META-learning rationale into SMGO-Δ, a global optimization approach recently proposed in the literature, so that priors obtained from similar past experience can be exploited to efficiently solve new (similar) problems. Through a benchmark numerical example we show the practical benefits of our META-extension of the baseline algorithm, while also providing theoretical bounds on its performance.
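The core idea — reusing evaluations from a similar, previously solved problem as a prior when starting a new black-box search — can be illustrated with a minimal sketch. This is a hypothetical, simplified warm-start loop for intuition only; it is not the SMGO-Δ algorithm, and all function and parameter names below are assumptions of this sketch:

```python
import numpy as np

def warm_started_minimize(f, bounds, n_iters=30, prior_points=None, seed=0):
    """Naive black-box minimization sketch: seed the search with sample
    points from a similar past problem (the "prior"), then refine by
    alternating local perturbations of the incumbent with global
    exploration of the box. Illustrative only, NOT SMGO-Delta."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]

    # Warm start: evaluate the prior points first, if any are available.
    X = [np.asarray(p, dtype=float) for p in (prior_points or [])]
    if not X:
        X = [rng.uniform(lo, hi)]
    y = [f(x) for x in X]

    for _ in range(n_iters):
        best = X[int(np.argmin(y))]
        if rng.random() < 0.7:
            # Exploitation: perturb the current best point.
            cand = np.clip(best + 0.1 * (hi - lo) * rng.standard_normal(len(lo)),
                           lo, hi)
        else:
            # Exploration: sample uniformly over the whole box.
            cand = rng.uniform(lo, hi)
        X.append(cand)
        y.append(f(cand))

    i = int(np.argmin(y))
    return X[i], y[i]
```

For example, if a past problem's optimum lay near the new problem's optimum, passing it via `prior_points` gives the search a good incumbent from iteration zero, which is the META rationale in miniature.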


Related research

09/28/2020 · BOML: A Modularized Bilevel Optimization Library in Python for Meta Learning
Meta-learning (a.k.a. learning to learn) has recently emerged as a promi...

06/06/2019 · Adaptive Gradient-Based Meta-Learning Methods
We build a theoretical framework for understanding practical meta-learni...

07/16/2019 · Meta-Learning for Black-box Optimization
Recently, neural networks trained as optimizers under the "learning to l...

03/05/2021 · Meta Learning Black-Box Population-Based Optimizers
The no free lunch theorem states that no model is better suited to every...

07/30/2020 · Bayesian Optimization for Developmental Robotics with Meta-Learning by Parameters Bounds Reduction
In robotics, methods and softwares usually require optimizations of hype...

10/04/2012 · Learning Heterogeneous Similarity Measures for Hybrid-Recommendations in Meta-Mining
The notion of meta-mining has appeared recently and extends the traditio...

06/17/2022 · Accelerating numerical methods by gradient-based meta-solving
In science and engineering applications, it is often required to solve s...
