Additive Gaussian Processes

12/19/2011
by David Duvenaud, et al.

We introduce a Gaussian process model of functions which are additive. An additive function is one that decomposes into a sum of low-dimensional functions, each depending on only a subset of the input variables. Additive GPs generalize both Generalized Additive Models and the standard GP models which use squared-exponential kernels. Hyperparameter learning in this model can be seen as Bayesian Hierarchical Kernel Learning (HKL). We introduce an expressive but tractable parameterization of the kernel function, which allows efficient evaluation of all input-interaction terms, whose number is exponential in the input dimension. The additional structure discoverable by this model yields increased interpretability, as well as state-of-the-art predictive power in regression tasks.
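To make the "efficient evaluation of all interaction terms" concrete, here is a minimal sketch (not the authors' code) of an additive kernel built from one-dimensional squared-exponential base kernels. The kernel for interaction order n is the n-th elementary symmetric polynomial of the base kernels, which the Newton–Girard recursion computes in polynomial time even though the number of interaction terms is exponential in the input dimension. The function name, lengthscale, and order-variance parameters below are illustrative assumptions.

```python
import numpy as np

def additive_kernel(X1, X2, lengthscales, order_variances):
    """Sketch: weighted sum over interaction orders of elementary symmetric
    polynomials of 1-D SE base kernels, via the Newton-Girard identities."""
    D = X1.shape[1]
    # One-dimensional squared-exponential base kernel per input dimension,
    # stacked into shape (D, n1, n2).
    base = np.stack([
        np.exp(-0.5 * (X1[:, None, d] - X2[None, :, d]) ** 2
               / lengthscales[d] ** 2)
        for d in range(D)
    ])

    # Power sums p_k = sum_d base_d^k (index 0 unused).
    p = [None] + [np.sum(base ** k, axis=0) for k in range(1, D + 1)]

    # Newton-Girard recursion for elementary symmetric polynomials:
    # e_0 = 1,  e_n = (1/n) * sum_{k=1}^{n} (-1)^{k-1} e_{n-k} p_k.
    e = [np.ones(base.shape[1:])]
    for n in range(1, D + 1):
        e.append(sum((-1) ** (k - 1) * e[n - k] * p[k]
                     for k in range(1, n + 1)) / n)

    # Each interaction order gets its own variance hyperparameter, so
    # hyperparameter learning can turn whole orders on or off.
    return sum(order_variances[n - 1] * e[n] for n in range(1, D + 1))
```

For D = 2 this reduces to `v1 * (k1 + k2) + v2 * (k1 * k2)`: the first-order part is a sum of one-dimensional functions (a GAM-like model), while the top-order product term recovers the standard multi-dimensional SE kernel.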

Related research:

- Additive Gaussian Processes Revisited (06/20/2022). Gaussian Process (GP) models are a class of flexible non-parametric mode...
- Randomly Projected Additive Gaussian Processes for Regression (12/30/2019). Gaussian processes (GPs) provide flexible distributions over functions, ...
- Additive Approximations in High Dimensional Nonparametric Regression via the SALSA (01/31/2016). High dimensional nonparametric regression is an inherently difficult pro...
- On Bayesian Generalized Additive Models (03/05/2023). Generalized additive models (GAMs) provide a way to blend parametric and...
- High-dimensional additive Gaussian processes under monotonicity constraints (05/17/2022). We introduce an additive Gaussian process framework accounting for monot...
- On Connecting Deep Trigonometric Networks with Deep Gaussian Processes: Covariance, Expressivity, and Neural Tangent Kernel (03/14/2022). Deep Gaussian Process as a Bayesian learning model is promising because ...
