Function Optimization with Posterior Gaussian Derivative Process

10/26/2020
by Sucharita Roy et al.

In this article, we propose and develop a novel Bayesian algorithm for the optimization of functions whose first and second partial derivatives are known. The basic premise is the Gaussian process representation of the function, which induces a first derivative process that is also Gaussian. Given data consisting of suitably chosen input points in the function domain and the corresponding function values, the Bayesian posterior solutions of the derivative process set equal to zero emulate the stationary points of the function; these can be fine-tuned by placing restrictions on the prior in terms of the first and second derivatives of the objective function. These observations motivate a general and effective function optimization algorithm that adaptively draws closer to the true optima through built-in iterative stages. We provide a theoretical foundation for this algorithm, proving almost sure convergence to the true optima as the number of iterative stages tends to infinity. The foundation hinges upon our proofs of almost sure uniform convergence of the posteriors associated with Gaussian and Gaussian derivative processes to the underlying function and its derivatives in appropriate fixed-domain infill asymptotics setups; rates of convergence are also available. We further provide a Bayesian characterization of the number of optima using information inherent in our optimization algorithm. We illustrate the algorithm with five different examples involving maxima, minima, saddle points, and even inconclusive cases, ranging from simple one-dimensional problems to challenging 50- and 100-dimensional problems.
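To make the premise concrete, the sketch below works through the key computation in one dimension: a Gaussian process fit to function evaluations induces a Gaussian posterior over the derivative, whose zeros emulate the stationary points. This is a minimal illustration and not the authors' implementation; the squared-exponential kernel, its hyperparameters, the test function, and the grid-plus-root-finding step are all assumptions made for the example.

```python
# Minimal 1-D sketch (not the paper's implementation): fit a GP to function
# values, then differentiate the posterior mean analytically. Because
# differentiation is linear, the derivative of a GP is again a GP, with
# covariances given by derivatives of the original kernel. The kernel,
# its hyperparameters, and the test function are illustrative assumptions.
import numpy as np
from scipy.optimize import brentq

def se_kernel(a, b, ell=0.3, sig2=1.0):
    """Squared-exponential kernel k(a, b) for 1-D inputs."""
    d = a[:, None] - b[None, :]
    return sig2 * np.exp(-0.5 * (d / ell) ** 2)

def d1_kernel(a, b, ell=0.3, sig2=1.0):
    """dk/da: cross-covariance between f'(a) and f(b)."""
    d = a[:, None] - b[None, :]
    return -(d / ell**2) * se_kernel(a, b, ell, sig2)

def d2_kernel(a, b, ell=0.3, sig2=1.0):
    """d^2 k / da^2: cross-covariance between f''(a) and f(b)."""
    d = a[:, None] - b[None, :]
    return (d**2 / ell**4 - 1.0 / ell**2) * se_kernel(a, b, ell, sig2)

# Objective: sin(3x) on [0, 2] has a maximum at pi/6 and a minimum at pi/2.
f = lambda x: np.sin(3.0 * x)
X = np.linspace(0.0, 2.0, 12)                # design points in the domain
y = f(X)

K = se_kernel(X, X) + 1e-6 * np.eye(len(X))  # small jitter for stability
alpha = np.linalg.solve(K, y)

def post_d1(x):
    """Posterior mean of f'(x)."""
    return (d1_kernel(np.atleast_1d(x), X) @ alpha).item()

def post_d2(x):
    """Posterior mean of f''(x)."""
    return (d2_kernel(np.atleast_1d(x), X) @ alpha).item()

# Zeros of the posterior derivative mean emulate stationary points of f;
# the sign of the posterior second derivative classifies each one.
grid = np.linspace(0.0, 2.0, 400)
d1 = d1_kernel(grid, X) @ alpha
for i in np.flatnonzero(d1[:-1] * d1[1:] < 0.0):
    root = brentq(post_d1, grid[i], grid[i + 1])
    kind = "maximum" if post_d2(root) < 0.0 else "minimum"
    print(f"posterior stationary point at x = {root:.4f} ({kind})")
```

The paper's algorithm goes well beyond this sketch: it operates in multiple dimensions, exploits the known first and second derivatives through restrictions on the prior, and refines the stationary-point estimates adaptively over iterative stages with almost sure convergence guarantees.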


