Convolutional Normalizing Flows for Deep Gaussian Processes

04/17/2021
by   Haibin Yu, et al.

Deep Gaussian processes (DGPs), hierarchical compositions of GP models, successfully boost expressive power over their single-layer counterpart. However, exact inference in DGPs is intractable, which has motivated the recent development of variational inference-based methods. Unfortunately, these methods either yield a biased posterior belief or make convergence difficult to assess. This paper instead introduces a new approach for specifying flexible, arbitrarily complex, and scalable approximate posterior distributions. The posterior is constructed through a normalizing flow (NF), which transforms a simple initial probability distribution into a more complex one through a sequence of invertible transformations. Moreover, a novel convolutional normalizing flow (CNF) is developed to improve time efficiency and capture dependency between layers. Empirical evaluation demonstrates that the CNF DGP outperforms state-of-the-art approximation methods for DGPs.
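To make the normalizing-flow idea concrete, here is a minimal NumPy sketch of a planar flow (Rezende and Mohamed, 2015), one of the simplest invertible transformations used in this family. This is an illustration of the general NF mechanism the abstract refers to, not the paper's convolutional flow; all parameter values below are arbitrary choices for the demo.

```python
import numpy as np

def constrain_u(u, w):
    # Enforce w.u > -1 so the planar map stays invertible
    # (the standard reparameterization from Rezende & Mohamed, 2015).
    wu = w @ u
    m = -1.0 + np.log1p(np.exp(wu))
    return u + (m - wu) * w / (w @ w)

def planar_flow(z, u, w, b):
    """One planar flow step: f(z) = z + u * tanh(w.z + b).

    Returns the transformed samples and log|det df/dz|, which is
    needed to track the density of the transformed distribution.
    """
    u_hat = constrain_u(u, w)
    a = np.tanh(z @ w + b)               # (n,) activations
    f = z + np.outer(a, u_hat)           # (n, d) transformed samples
    psi = (1.0 - a**2)[:, None] * w      # (n, d) tanh gradient term
    log_det = np.log(np.abs(1.0 + psi @ u_hat))
    return f, log_det

# Start from a simple base distribution and push samples through
# a short sequence of invertible maps, updating the log-density
# via the change-of-variables formula at each step.
rng = np.random.default_rng(0)
d = 2
z = rng.standard_normal((500, d))        # base samples ~ N(0, I)
log_q = -0.5 * np.sum(z**2, axis=1) - 0.5 * d * np.log(2 * np.pi)

for _ in range(2):
    u = rng.standard_normal(d)
    w = rng.standard_normal(d)
    b = 0.1
    z, log_det = planar_flow(z, u, w, b)
    log_q -= log_det                     # density of the transformed samples
```

Each step warps the base Gaussian into a more complex distribution while keeping its density tractable; in the paper's setting, the final `z` plays the role of samples from the flexible variational posterior over the DGP's inducing variables.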


Related research:

- Implicit Posterior Variational Inference for Deep Gaussian Processes (10/26/2019)
- Variational Inference with Normalizing Flows (05/21/2015)
- Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo (06/14/2018)
- Compositional uncertainty in deep Gaussian processes (09/17/2019)
- Beyond the Mean-Field: Structured Deep Gaussian Processes Improve the Predictive Uncertainties (05/22/2020)
- Reducing the Amortization Gap in Variational Autoencoders: A Bayesian Random Function Approach (02/05/2021)
- Model-Based Learning of Turbulent Flows using Mobile Robots (12/10/2018)
