Dual Parameterization of Sparse Variational Gaussian Processes

11/05/2021
by   Vincent Adam, et al.

Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits. In this paper, we improve their computational efficiency by using a dual parameterization in which each data example is assigned dual parameters, analogous to the site parameters used in expectation propagation. The dual parameterization speeds up inference with natural gradient descent and yields a tighter evidence lower bound for hyperparameter learning. The approach has the same memory cost as current SVGP methods, but it is faster and more accurate.
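To make the idea of per-example dual (site) parameters concrete, here is a minimal NumPy sketch of how a pair of sites per data point can induce a Gaussian posterior over inducing variables. This is an illustrative toy, not the paper's implementation: the RBF kernel, the data, the inducing locations, and the site values are all made up, and the names lam1/lam2 are placeholders for the first- and second-order dual parameters.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = X1[:, None, :] - X2[None, :, :]
    return variance * np.exp(-0.5 * np.sum(d ** 2, axis=-1) / lengthscale ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(20, 1))   # toy training inputs
Z = np.linspace(-3.0, 3.0, 5)[:, None]     # inducing inputs

Kuu = rbf(Z, Z) + 1e-6 * np.eye(5)         # inducing covariance (with jitter)
Kuf = rbf(Z, X)                            # cross-covariance

# One pair of dual (site) parameters per data example, as in EP-style sites.
lam1 = rng.normal(size=20)                 # first-order sites (illustrative values)
lam2 = np.full(20, 0.5)                    # second-order sites, kept positive

# Posterior over inducing variables implied by the sites:
#   S = (Kuu^{-1} + Kuu^{-1} Kuf diag(lam2) Kfu Kuu^{-1})^{-1},  m = S Kuu^{-1} Kuf lam1
Kuu_inv = np.linalg.inv(Kuu)
A = Kuu_inv @ Kuf                          # projection of data onto inducing space
S = np.linalg.inv(Kuu_inv + A @ np.diag(lam2) @ A.T)
m = S @ (A @ lam1)
```

The point of the parameterization is that the variational distribution is stored through the 2N site values rather than a free mean and covariance, so natural gradient steps act on per-example quantities while the memory cost stays linear in N, matching the abstract's claim.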

Related research

06/03/2023 - Variational Gaussian Process Diffusion Processes
Diffusion processes are a class of stochastic differential equations (SD...

03/24/2018 - Natural Gradients in Practice: Non-Conjugate Variational Inference in Gaussian Process Models
The natural gradient method has been used effectively in conjugate Gauss...

11/11/2022 - Towards Improved Learning in Gaussian Processes: The Best of Two Worlds
Gaussian process training decomposes into inference of the (approximate)...

11/21/2016 - Variational Fourier features for Gaussian processes
This work brings together two powerful concepts in Gaussian processes: t...

11/02/2017 - Deep Recurrent Gaussian Process with Variational Sparse Spectrum Approximation
Modeling sequential data has become more and more important in practice....

11/22/2019 - A Fully Natural Gradient Scheme for Improving Inference of the Heterogeneous Multi-Output Gaussian Process Model
A recent novel extension of multi-output Gaussian processes handles hete...

10/29/2020 - Gaussian Process Bandit Optimization of the Thermodynamic Variational Objective
Achieving the full promise of the Thermodynamic Variational Objective (T...
