Vecchia Gaussian Process Ensembles on Internal Representations of Deep Neural Networks

05/26/2023
by Felix Jimenez et al.

For regression tasks, standard Gaussian processes (GPs) provide natural uncertainty quantification, while deep neural networks (DNNs) excel at representation learning. We propose to synergistically combine these two approaches in a hybrid method consisting of an ensemble of GPs built on the output of hidden layers of a DNN. GP scalability is achieved via Vecchia approximations that exploit nearest-neighbor conditional independence. The resulting deep Vecchia ensemble not only imbues the DNN with uncertainty quantification but can also provide more accurate and robust predictions. We demonstrate the utility of our model on several datasets and carry out experiments to understand the inner workings of the proposed method.
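To make the idea concrete, here is a minimal sketch of the two ingredients the abstract describes: a GP prediction that conditions only on a point's nearest neighbors (the Vecchia-style approximation), and an ensemble that combines per-layer GPs built on different representations of the same inputs. This is an illustrative reconstruction, not the paper's implementation: the RBF kernel, the neighbor count `m`, and the inverse-variance combination rule are all assumptions made for the example.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def nn_gp_predict(X, y, x_star, m=10, noise=1e-2):
    """Vecchia-style GP prediction: condition on the m nearest
    neighbors of x_star instead of the full training set."""
    d = ((X - x_star) ** 2).sum(1)
    idx = np.argsort(d)[:m]                    # nearest-neighbor conditioning set
    Xn, yn = X[idx], y[idx]
    K = rbf(Xn, Xn) + noise * np.eye(m)
    k = rbf(Xn, x_star[None, :])[:, 0]
    alpha = np.linalg.solve(K, yn)
    mean = k @ alpha
    var = rbf(x_star[None, :], x_star[None, :])[0, 0] + noise \
          - k @ np.linalg.solve(K, k)
    return mean, var

def ensemble_predict(layer_reps, y, layer_rep_stars, m=10):
    """Combine per-layer GP predictions; inverse-variance weighting is
    an assumption here, the paper's combination rule may differ."""
    means, vars_ = zip(*(nn_gp_predict(R, y, r, m)
                         for R, r in zip(layer_reps, layer_rep_stars)))
    w = 1.0 / np.array(vars_)
    mean = (w * np.array(means)).sum() / w.sum()
    var = 1.0 / w.sum()
    return mean, var

# Toy usage: pretend two hidden layers produce different representations
# of the same inputs (here, the raw inputs and a random linear map of them).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=100)
W = rng.normal(size=(2, 3))
reps = [X, X @ W]
x_star = X[0]
mean, var = ensemble_predict(reps, y, [x_star, x_star @ W], m=15)
```

Because each prediction solves only an m-by-m linear system, cost scales with the neighborhood size rather than the full training set, which is what makes the Vecchia approach scalable; the ensemble then pools the layer-wise means and variances into a single predictive distribution.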

Related research

11/03/2020 · Uncertainty Quantification of Darcy Flow through Porous Media using Deep Gaussian Process
A computational method based on the non-linear Gaussian process (GP), kn...

09/20/2021 · Trust Your Robots! Predictive Uncertainty Estimation of Neural Networks with Sparse Gaussian Processes
This paper presents a probabilistic framework to obtain both reliable an...

11/09/2022 · REDS: Random Ensemble Deep Spatial prediction
There has been a great deal of recent interest in the development of spa...

08/29/2021 · Uncertainty quantification for multiclass data description
In this manuscript, we propose a multiclass data description model based...

06/22/2018 · Neural-net-induced Gaussian process regression for function approximation and PDE solution
Neural-net-induced Gaussian process (NNGP) regression inherits both the ...

11/05/2019 · Scalable Variational Gaussian Processes for Crowdsourcing: Glitch Detection in LIGO
In the last years, crowdsourcing is transforming the way classification ...
