A Similarity Measure of Gaussian Process Predictive Distributions

01/20/2021
by   Lucia Asencio-Martín, et al.

Some scenarios require the computation of a predictive distribution of a new value evaluated on an objective function conditioned on previous observations. We are interested in using a model that makes valid assumptions about the objective function whose values we are trying to predict, such as smoothness or stationarity. Gaussian processes (GPs) are probabilistic models that can be interpreted as flexible distributions over functions. They encode these assumptions through covariance functions and, once fitted to previous observations, make hypotheses about new data through a predictive distribution. We may face the case where several GPs are used to model different objective functions. GPs are non-parametric models whose complexity is cubic in the number of observations. A measure of how similar one GP predictive distribution is to another would be useful for deciding when to stop using one of the GPs when both model functions defined on the same input space. In essence, we are inferring that the two objective functions are correlated, so a single GP is enough to model both of them, applying a transformation to its prediction when the correlation is inverse. We show empirical evidence, in a set of synthetic and benchmark experiments, that GP predictive distributions can be compared and that one of them is enough to predict two correlated functions on the same input space. This similarity metric could be extremely useful for discarding objectives in Bayesian many-objective optimization.
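As an illustration of the idea, the sketch below (Python with scikit-learn) fits two GPs to noisy samples of two inversely correlated objectives on the same input space and compares their predictive distributions with an averaged symmetrized KL divergence between the point-wise predictive Gaussians. This is not the similarity measure proposed in the paper, only a simple proxy under that assumption; the helper names symmetric_kl and gp_similarity are illustrative.

    # Sketch only: compare two GP predictive distributions point-wise.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def symmetric_kl(mu1, sd1, mu2, sd2):
        # Symmetrized KL divergence between univariate Gaussians.
        def kl(m_a, v_a, m_b, v_b):
            return 0.5 * (np.log(v_b / v_a) + (v_a + (m_a - m_b) ** 2) / v_b - 1.0)
        return kl(mu1, sd1 ** 2, mu2, sd2 ** 2) + kl(mu2, sd2 ** 2, mu1, sd1 ** 2)

    def gp_similarity(gp_a, gp_b, x_test):
        # Average point-wise divergence between the two predictive distributions.
        mu_a, sd_a = gp_a.predict(x_test, return_std=True)
        mu_b, sd_b = gp_b.predict(x_test, return_std=True)
        return np.mean(symmetric_kl(mu_a, sd_a, mu_b, sd_b))

    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, size=(25, 1))
    f1 = np.sin(x).ravel() + 0.05 * rng.standard_normal(25)    # objective 1
    f2 = -np.sin(x).ravel() + 0.05 * rng.standard_normal(25)   # inversely correlated objective 2

    gp1 = GaussianProcessRegressor(kernel=RBF(), alpha=1e-3).fit(x, f1)
    gp2 = GaussianProcessRegressor(kernel=RBF(), alpha=1e-3).fit(x, f2)

    x_test = np.linspace(-3, 3, 100).reshape(-1, 1)
    print("divergence(f1, f2):", gp_similarity(gp1, gp2, x_test))
    # A small divergence between gp2 and the negated prediction of gp1 would
    # suggest a single GP (plus a sign flip) suffices for both objectives.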

