
Active Bayesian Optimization: Minimizing Minimizer Entropy

02/09/2012
by Il Memming Park, et al.
The University of Texas at Austin

The ultimate goal of optimization is to find the minimizer of a target function. However, typical criteria for active optimization often ignore the uncertainty about the minimizer. We propose a novel criterion for global optimization and an associated sequential active learning strategy using Gaussian processes. Our criterion is the reduction of uncertainty in the posterior distribution of the function minimizer. It can also flexibly incorporate multiple global minimizers. We implement a tractable approximation of the criterion and demonstrate that it locates the global minimizer more accurately than conventional Bayesian optimization criteria.
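The criterion described in the abstract can be illustrated with a minimal Monte Carlo sketch (not the authors' implementation; the kernel, grid, fantasy-observation scheme, and toy objective below are all illustrative assumptions). A Gaussian process posterior is sampled on a grid to estimate the distribution of the minimizer's location; the next evaluation point is the candidate whose fantasized observation most reduces the entropy of that distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                       # toy objective (assumed), minimizer at x = 0.3
    return (x - 0.3) ** 2

def rbf(a, b, ell=0.2):         # squared-exponential kernel (assumed lengthscale)
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_posterior(Xs, X, y, noise=1e-4):
    # Standard GP regression posterior on test points Xs given data (X, y).
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf(Xs, X), rbf(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks @ Kinv @ y
    cov = Kss - Ks @ Kinv @ Ks.T
    return mu, cov

def pmin(mu, cov, n=500):
    # Monte Carlo estimate of p(x* = x_i): sample posterior paths on the
    # grid and record where each path attains its minimum.
    paths = rng.multivariate_normal(mu, cov + 1e-9 * np.eye(len(mu)),
                                    size=n, check_valid="ignore")
    return np.bincount(paths.argmin(axis=1), minlength=len(mu)) / n

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

grid = np.linspace(0.0, 1.0, 50)
X = np.array([0.0, 1.0])        # initial design: the two endpoints
y = f(X)

for _ in range(5):
    mu, cov = gp_posterior(grid, X, y)
    best, best_h = None, np.inf
    for j, xc in enumerate(grid):
        # Crude two-point fantasy over the unknown observation at xc
        # (mean +/- one posterior standard deviation), averaging the
        # resulting minimizer entropy. A fuller treatment would integrate
        # over the predictive distribution.
        sd = np.sqrt(max(cov[j, j], 0.0))
        h = 0.0
        for yf in (mu[j] - sd, mu[j] + sd):
            m2, c2 = gp_posterior(grid, np.append(X, xc), np.append(y, yf))
            h += 0.5 * entropy(pmin(m2, c2))
        if h < best_h:
            best, best_h = xc, h
    # Evaluate the true objective at the entropy-minimizing candidate.
    X, y = np.append(X, best), np.append(y, f(best))
```

The one-step lookahead over a fixed grid is the "tractable approximation" flavor of the idea: exact expected minimizer entropy is intractable, so both the minimizer distribution and the expectation over future observations are approximated by sampling.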

