Nested Variational Compression in Deep Gaussian Processes

12/03/2014
by James Hensman, et al.

Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either supervised or unsupervised learning. For tractable inference, approximations to the marginal likelihood of the model must be made. The original approach to approximate inference in these models used variational compression to allow approximate variational marginalization of the hidden variables, leading to a lower bound on the marginal likelihood of the model [Damianou and Lawrence, 2013]. In this paper we extend this idea with a nested variational compression. The resulting lower bound on the likelihood can be easily parallelized or adapted for stochastic variational inference.
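The abstract does not include code, so the following is only a minimal numpy sketch of the single-layer building block the paper refers to: the collapsed variational bound of Titsias-style variational compression for sparse GP regression, log p(y|X) >= log N(y | 0, Knm Kmm^{-1} Kmn + noise*I) - (1/(2*noise)) tr(Knn - Knm Kmm^{-1} Kmn). The RBF kernel choice, the function names, and the inducing-input variable Z are illustrative assumptions, not the paper's notation; the paper's contribution is to nest this kind of compression across the layers of a deep GP.

    # Sketch only: collapsed variational lower bound for one sparse GP
    # regression layer (Titsias-style variational compression).
    # Kernel choice and names are assumptions, not the paper's notation.
    import numpy as np

    def rbf(A, B, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between the rows of A and B.
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return variance * np.exp(-0.5 * sq / lengthscale**2)

    def collapsed_bound(X, y, Z, noise_var=0.1):
        # Lower bound on log p(y | X) after variationally marginalizing the
        # layer's function values, given M inducing inputs Z (M << N).
        N, M = X.shape[0], Z.shape[0]
        Knn_diag = np.full(N, rbf(X[:1], X[:1])[0, 0])   # only the diagonal is needed
        Kmm = rbf(Z, Z) + 1e-6 * np.eye(M)               # jitter for stability
        Knm = rbf(X, Z)
        Qnn = Knm @ np.linalg.solve(Kmm, Knm.T)          # Nystroem approximation
        cov = Qnn + noise_var * np.eye(N)
        _, logdet = np.linalg.slogdet(cov)
        quad = y @ np.linalg.solve(cov, y)
        log_gauss = -0.5 * (N * np.log(2.0 * np.pi) + logdet + quad)
        trace_term = -0.5 / noise_var * np.sum(Knn_diag - np.diag(Qnn))
        return log_gauss + trace_term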


