Calibrating Deep Convolutional Gaussian Processes

by Gia-Lac Tran, et al.

The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental has brought a great deal of attention to the ability of these models to accurately quantify the uncertainty in their predictions. Previous work on combining CNNs with Gaussian processes (GPs) has been developed under the assumption that the predictive probabilities of these models are well-calibrated. In this paper we show that, in fact, current combinations of CNNs and GPs are miscalibrated. We propose a novel combination that considerably outperforms previous approaches in this respect, while achieving state-of-the-art performance on image classification tasks.
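Calibration is typically quantified with the Expected Calibration Error (ECE): predictions are binned by confidence, and the gap between each bin's accuracy and its mean confidence is averaged, weighted by bin size. The sketch below is a minimal, generic implementation of this standard metric, not the authors' code; the toy inputs are illustrative.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin predictions by confidence, then average the per-bin
    |accuracy - confidence| gap, weighted by the fraction of samples
    falling in each bin."""
    confidences = probs.max(axis=1)          # top-class probability
    predictions = probs.argmax(axis=1)       # predicted class
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(accuracies[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap         # weight by bin occupancy
    return ece

# Three correct predictions with confidences 0.9, 0.8, 0.7:
# per-bin gaps are 0.1, 0.2, 0.3, each weighted 1/3, so ECE = 0.2.
probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.7, 0.3]])
labels = np.array([0, 1, 0])
print(round(expected_calibration_error(probs, labels), 4))  # → 0.2
```

A perfectly calibrated model (confidence matching accuracy in every bin) attains an ECE of zero; the paper's claim of miscalibration corresponds to a large gap on this kind of metric.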


Deep convolutional Gaussian processes

We propose deep convolutional Gaussian processes, a deep Gaussian proces...

Translation Insensitivity for Deep Convolutional Gaussian Processes

Deep learning has been at the foundation of large improvements in image ...

Deep Gaussian Processes with Decoupled Inducing Inputs

Deep Gaussian Processes (DGP) are hierarchical generalizations of Gaussi...

Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes

There is a previously identified equivalence between wide fully connecte...

Robustness Quantification for Classification with Gaussian Processes

We consider Bayesian classification with Gaussian processes (GPs) and de...

Learning in the Wild with Incremental Skeptical Gaussian Processes

The ability to learn from human supervision is fundamental for personal ...

Multimodal Neural Processes for Uncertainty Estimation

Neural processes (NPs) have brought the representation power of parametr...