Calibrating Deep Convolutional Gaussian Processes

05/26/2018
by Gia-Lac Tran, et al.

The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental has drawn considerable attention to the ability of these models to accurately quantify the uncertainty in their predictions. Previous work on combining CNNs with Gaussian processes (GPs) has been developed under the assumption that the predictive probabilities of these models are well calibrated. In this paper we show that, in fact, current combinations of CNNs and GPs are miscalibrated. We propose a novel combination that considerably outperforms previous approaches in this respect, while achieving state-of-the-art performance on image classification tasks.
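
For context, calibration here refers to how closely a model's predictive probabilities match its empirical accuracy; a standard way to quantify this is the Expected Calibration Error (ECE). The sketch below is a minimal illustration of that metric, not code from the paper; the bin count, the variable names, and the randomly generated placeholder predictions are assumptions for demonstration only.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Weighted average gap between predicted confidence and empirical accuracy,
    computed over equal-width confidence bins (a common calibration metric)."""
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            acc = correct[in_bin].mean()        # empirical accuracy in this bin
            conf = confidences[in_bin].mean()   # mean predicted confidence in this bin
            ece += in_bin.mean() * abs(acc - conf)
    return ece

# Placeholder predictions and labels (assumed shapes: N x C class probabilities, N labels);
# in practice these would come from a model's predictive distribution on a held-out test set.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=1000)
labels = rng.integers(0, 10, size=1000)

confidences = probs.max(axis=1)
correct = (probs.argmax(axis=1) == labels).astype(float)
print(f"ECE: {expected_calibration_error(confidences, correct):.4f}")
```

A well-calibrated classifier yields an ECE close to zero: among predictions made with, say, 90% confidence, roughly 90% should be correct.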

Related research

- Deep convolutional Gaussian processes (10/06/2018)
- Translation Insensitivity for Deep Convolutional Gaussian Processes (02/15/2019)
- Deep Gaussian Processes with Decoupled Inducing Inputs (01/09/2018)
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes (10/11/2018)
- Robustness Quantification for Classification with Gaussian Processes (05/28/2019)
- Learning in the Wild with Incremental Skeptical Gaussian Processes (11/02/2020)
- Multimodal Neural Processes for Uncertainty Estimation (04/04/2023)
