Multiple Gaussian Process Models

10/24/2011
by Cedric Archambeau et al.

We consider a Gaussian process formulation of the multiple kernel learning problem. The goal is to select the convex combination of kernel matrices that best explains the data, thereby improving generalisation to unseen data. Sparsity in the kernel weights is obtained by adopting a hierarchical Bayesian approach: Gaussian process priors are imposed over the latent functions, and generalised inverse Gaussian priors over their associated weights. This construction is equivalent to imposing a product of heavy-tailed process priors over function space. A variational inference algorithm is derived for regression and binary classification.
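
The construction in the abstract marginalises to a single GP whose covariance is the weighted sum of the candidate kernel matrices: if each latent function f_m has covariance w_m K_m and the target is their sum plus noise, then y ~ N(0, sum_m w_m K_m + sigma^2 I). The minimal Python sketch below illustrates this generative view only; the toy data, RBF lengthscales, and GIG hyperparameters (p = -0.5, b = 1.0) are illustrative assumptions, and the paper's variational algorithm infers posteriors over the weights rather than sampling them as done here.

    import numpy as np
    from scipy.stats import geninvgauss

    rng = np.random.default_rng(0)

    # Toy 1-D regression data (illustration only).
    X = np.linspace(-3, 3, 50)[:, None]
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

    def rbf(X, lengthscale):
        # Squared-exponential Gram matrix.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale ** 2)

    # Candidate kernel matrices (the "multiple kernels").
    Ks = [rbf(X, l) for l in (0.5, 1.0, 2.0)]

    # Sample kernel weights from a generalised inverse Gaussian prior.
    # Hyperparameters are assumed for illustration, not taken from the paper.
    w = geninvgauss.rvs(p=-0.5, b=1.0, size=len(Ks), random_state=rng)
    w = w / w.sum()  # normalise to a convex combination

    # Combined covariance: marginalising the latent functions f_m gives
    # y ~ N(0, sum_m w_m K_m + noise * I).
    K = sum(wm * Km for wm, Km in zip(w, Ks))

    # Standard GP regression predictive mean at the training inputs.
    noise = 0.1 ** 2
    mean = K @ np.linalg.solve(K + noise * np.eye(len(X)), y)

Because the weights enter the covariance linearly, heavier-tailed GIG priors push some w_m towards zero, which is the source of the sparsity in the kernel combination that the abstract mentions.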


Related research

- 11/28/2017 · Variational Inference for Gaussian Process Models with Linear Complexity: Large-scale Gaussian process inference has long faced practical challeng...
- 08/28/2015 · Varying-coefficient models with isotropic Gaussian process priors: We study learning problems in which the conditional distribution of the ...
- 10/16/2018 · Multimodal Deep Gaussian Processes: We propose a novel Bayesian approach to modelling multimodal data genera...
- 02/13/2019 · Efficient Bayesian shape-restricted function estimation with constrained Gaussian process priors: This article revisits the problem of Bayesian shape-restricted inference...
- 11/14/2020 · Factorized Gaussian Process Variational Autoencoders: Variational autoencoders often assume isotropic Gaussian priors and mean...
- 03/09/2011 · A Kernel Approach to Tractable Bayesian Nonparametrics: Inference in popular nonparametric Bayesian models typically relies on s...
- 07/21/2021 · Online structural kernel selection for mobile health: Motivated by the need for efficient and personalized learning in mobile ...
