Deep Gaussian Covariance Network

10/17/2017
by Kevin Cremanns, et al.

The correlation length-scale and the noise variance are the most commonly used hyperparameters of Gaussian processes. Typically, stationary covariance functions are used, which depend only on the distances between input points and are therefore invariant to translations in the input space. The hyperparameters are commonly optimized by maximizing the log marginal likelihood. This works well if the distances between input points are uniformly distributed. In the case of a locally adapted or even sparse input space, however, the prediction at a test point can degrade depending on its position. A possible solution is a non-stationary covariance function whose hyperparameters are computed by a deep neural network, so that the correlation length scales and possibly the noise variance depend on the test point. Furthermore, different types of covariance functions are trained simultaneously, so that the Gaussian process prediction is an additive overlay of different covariance matrices. The right combination of covariance functions and their hyperparameters is learned by the deep neural network. In addition, the Gaussian process can be trained in batches or online, and can therefore handle arbitrarily large data sets. We call this framework the Deep Gaussian Covariance Network (DGCP). Further extensions of this framework are possible, for example to sequentially dependent problems such as time series, or to a local mixture of experts. The basic framework and some possible extensions are presented in this work. Moreover, a comparison to some recent state-of-the-art surrogate model methods is performed, including a time-dependent problem.
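The core idea of input-dependent hyperparameters can be illustrated with a minimal sketch. The code below builds a non-stationary squared-exponential (Gibbs) covariance in 1D, where the length-scale at each input is produced by a small neural network. The tiny randomly initialized MLP and the choice of a single Gibbs kernel are illustrative assumptions for this sketch, not the authors' actual architecture, which combines several covariance types additively.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP mapping each input to a positive length-scale l(x).
# Weights are random here; in the DGCP framework they would be trained.
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def length_scale(x):
    """x: (n, 1) array -> (n,) positive length-scales via a softplus output."""
    h = np.tanh(x @ W1.T + b1)
    return np.log1p(np.exp(h @ W2.T + b2)).ravel() + 1e-3

def gibbs_kernel(x):
    """Non-stationary squared-exponential (Gibbs) covariance matrix in 1D.

    k(x, x') = sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2))
               * exp(-(x - x')^2 / (l(x)^2 + l(x')^2))
    """
    l = length_scale(x)
    li, lj = l[:, None], l[None, :]
    sq_dist = (x - x.T) ** 2
    denom = li**2 + lj**2
    return np.sqrt(2.0 * li * lj / denom) * np.exp(-sq_dist / denom)

x = np.linspace(-2.0, 2.0, 20).reshape(-1, 1)
K = gibbs_kernel(x)
```

Because the length-scale varies with position, nearby points in a densely sampled region can decorrelate quickly while points in a sparse region stay correlated over larger distances; a stationary kernel cannot express this. The Gibbs construction guarantees that the resulting matrix is still a valid (positive semi-definite) covariance.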
