Robust Deep Gaussian Processes

04/04/2019
by Jeremias Knoblauch, et al.

This report provides an in-depth overview of the implications and novelty that Generalized Variational Inference (GVI) (Knoblauch et al., 2019) brings to Deep Gaussian Processes (DGPs) (Damianou & Lawrence, 2013). Specifically, robustness to model misspecification and principled alternatives for uncertainty quantification are motivated from an information-geometric viewpoint. The resulting modifications have clear interpretations and can be implemented in fewer than 100 lines of Python code. Most importantly, the corresponding empirical results show that DGPs can benefit substantially from the proposed enhancements.
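The report's code is not reproduced on this page. As a rough illustration of the kind of modification GVI makes, the sketch below replaces the usual negative log-likelihood inside a variational objective with a beta-divergence (density-power) loss for a Gaussian observation model, the robust-loss choice GVI builds on. The function names, the mean-field sampling scheme, and the plain KL regularizer are assumptions made for this sketch, not the paper's actual implementation.

```python
import numpy as np
from scipy.stats import norm

def beta_loss(x, mu, sigma, beta=0.5):
    """Density-power (beta-divergence) loss for a Gaussian likelihood.

    Replaces the negative log score -log p(x | mu, sigma) in the
    variational objective. As beta -> 0 the theta-dependent part
    recovers the log score; for beta > 0, observations with low
    density under the model are down-weighted, which is what gives
    robustness to outliers and model misspecification.
    """
    dens = norm.pdf(x, loc=mu, scale=sigma)
    data_term = -(1.0 / beta) * dens ** beta
    # Closed form of the integral term for a Gaussian:
    # int p(y)^{1+beta} dy = (2*pi*sigma^2)^{-beta/2} / sqrt(1 + beta)
    integral = (2.0 * np.pi * sigma ** 2) ** (-beta / 2.0) / np.sqrt(1.0 + beta)
    return data_term + integral / (1.0 + beta)

def gvi_objective(theta_samples, x, kl_q_to_prior, beta=0.5):
    """Monte-Carlo estimate of a GVI-style objective: E_q[loss] + D(q || prior).

    `theta_samples` are draws (mu, sigma) from the variational
    posterior q; `kl_q_to_prior` is the divergence regularizer
    (here a precomputed KL). Both names are illustrative placeholders.
    """
    expected_loss = np.mean(
        [beta_loss(x, mu, sigma, beta).sum() for mu, sigma in theta_samples]
    )
    return expected_loss + kl_q_to_prior
```

In the GVI framework the KL regularizer can likewise be swapped for another divergence (e.g. a Renyi alpha-divergence), which corresponds to the alternative uncertainty quantification the abstract alludes to.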


Related research

05/12/2022
Generalized Variational Inference in Function Spaces: Gaussian Measures meet Bayesian Deep Learning
We develop a framework for generalized variational inference in infinite...

10/14/2016
Random Feature Expansions for Deep Gaussian Processes
The composition of multiple Gaussian Processes as a Deep Gaussian Proces...

05/14/2019
Deep Gaussian Processes with Importance-Weighted Variational Inference
Deep Gaussian processes (DGPs) can model complex marginal densities as w...

09/17/2019
Compositional uncertainty in deep Gaussian processes
Gaussian processes (GPs) are nonparametric priors over functions, and fi...

11/30/2017
How Deep Are Deep Gaussian Processes?
Recent research has shown the potential utility of probability distribut...

03/06/2020
Scalable Uncertainty for Computer Vision with Functional Variational Inference
As Deep Learning continues to yield successful applications in Computer ...
