On the optimization of hyperparameters in Gaussian process regression with the help of low-order high-dimensional model representation

11/30/2021
by Sergei Manzhos, et al.

When the data are sparse, optimization of the kernel hyperparameters in Gaussian process regression by the commonly used maximum likelihood estimation (MLE) criterion often leads to overfitting. We show that choosing the hyperparameters (here, the kernel length parameter and the regularization parameter) based on a criterion of the completeness of the basis in the corresponding linear regression problem is superior to MLE. We show that this is facilitated by the use of high-dimensional model representation (HDMR), whereby a low-order HDMR expansion can provide reliable reference functions and the large synthetic test data sets needed for basis parameter optimization even when the original data are few.
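As a rough illustration of the selection problem the abstract describes, the sketch below fits a squared-exponential GPR model on a small training set and picks the length and regularization parameters either by minimizing the negative log marginal likelihood (the MLE criterion) or by minimizing the error on a dense synthetic reference set. In the paper that reference would come from a low-order HDMR surrogate, and the actual selection criterion is basis completeness in the corresponding linear regression problem; here a hypothetical toy function f_ref and a plain reference-set RMSE stand in for both. The kernel choice, the grid, and f_ref are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def rbf_kernel(X1, X2, length):
    # Squared-exponential kernel with a single length parameter.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length ** 2)


def neg_log_marginal_likelihood(X, y, length, reg):
    # Negative log marginal likelihood: the MLE criterion discussed in the abstract.
    K = rbf_kernel(X, X, length) + reg * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(X) * np.log(2.0 * np.pi)


def gpr_mean(X_train, y_train, X_test, length, reg):
    # Posterior mean of the GPR model at the test points.
    K = rbf_kernel(X_train, X_train, length) + reg * np.eye(len(X_train))
    Ks = rbf_kernel(X_test, X_train, length)
    return Ks @ np.linalg.solve(K, y_train)


rng = np.random.default_rng(0)


def f_ref(X):
    # Hypothetical toy target; in the paper the reference would be a low-order HDMR surrogate.
    return np.sin(3.0 * X[:, 0]) + 0.5 * np.cos(2.0 * X[:, 1])


X_train = rng.uniform(0.0, 1.0, size=(15, 2))   # sparse training data
y_train = f_ref(X_train)
X_test = rng.uniform(0.0, 1.0, size=(2000, 2))  # dense synthetic reference set
y_test = f_ref(X_test)

lengths = np.geomspace(0.05, 2.0, 15)
regs = np.geomspace(1e-8, 1e-2, 7)

best_mle = None   # (criterion value, length, reg) under the MLE criterion
best_ref = None   # (criterion value, length, reg) under the reference-set criterion
for length in lengths:
    for reg in regs:
        try:
            nlml = neg_log_marginal_likelihood(X_train, y_train, length, reg)
        except np.linalg.LinAlgError:
            continue  # skip numerically non-positive-definite combinations
        err = gpr_mean(X_train, y_train, X_test, length, reg) - y_test
        rmse = np.sqrt(np.mean(err ** 2))
        if best_mle is None or nlml < best_mle[0]:
            best_mle = (nlml, length, reg)
        if best_ref is None or rmse < best_ref[0]:
            best_ref = (rmse, length, reg)

print("MLE-selected (length, reg):          ", best_mle[1:])
print("Reference-set-selected (length, reg):", best_ref[1:])
```

A grid search is used instead of gradient-based optimization purely for transparency; with sparse training data the two criteria can select noticeably different length and regularization parameters, which is the situation the abstract addresses.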
