Well-calibrated Model Uncertainty with Temperature Scaling for Dropout Variational Inference

09/30/2019
by Max-Heinrich Laves, et al.

In this paper, well-calibrated model uncertainty is obtained by combining temperature scaling with Monte Carlo dropout as an approximation to Bayesian inference. The proposed approach can be derived directly from frequentist temperature scaling and yields well-calibrated model uncertainty as well as a well-calibrated softmax likelihood.
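The recipe the abstract describes — temperature-scale the logits, then average the softmax over several stochastic dropout forward passes — can be sketched in plain Python. This is a minimal illustration under assumptions, not the authors' implementation: the function names are made up, and a simple grid search over T stands in for whatever optimiser one would actually use to minimise validation NLL.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def mc_dropout_predict(logit_samples, temperature=1.0):
    """Average the temperature-scaled softmax over N stochastic
    forward passes (Monte Carlo dropout kept active at test time)."""
    n = len(logit_samples)
    k = len(logit_samples[0])
    mean = [0.0] * k
    for logits in logit_samples:
        p = softmax(logits, temperature)
        mean = [m + pi / n for m, pi in zip(mean, p)]
    return mean

def fit_temperature(val_logit_samples, val_labels, grid=None):
    """Pick T minimising the negative log-likelihood of the
    MC-averaged predictive distribution on a held-out set.
    Grid search is illustrative; gradient descent is typical."""
    if grid is None:
        grid = [0.5 + 0.1 * i for i in range(46)]  # 0.5 .. 5.0
    best_t, best_nll = 1.0, float("inf")
    for t in grid:
        nll = 0.0
        for samples, y in zip(val_logit_samples, val_labels):
            p = mc_dropout_predict(samples, t)
            nll -= math.log(max(p[y], 1e-12))
        if nll < best_nll:
            best_t, best_nll = t, nll
    return best_t
```

In use, one would collect dropout logit samples on a validation set, fit T once with `fit_temperature`, then apply `mc_dropout_predict` with that fixed T at test time; T > 1 flattens overconfident predictive distributions, improving calibration without changing the argmax class.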


Related research

06/20/2020 · Calibration of Model Uncertainty for Dropout Variational Inference
The model uncertainty obtained by variational Bayesian inference with Mo...

11/18/2022 · Layer-Stack Temperature Scaling
Recent works demonstrate that early layers in a neural network contain u...

04/27/2022 · Dropout Inference with Non-Uniform Weight Scaling
Dropout as regularization has been used extensively to prevent overfitti...

01/21/2019 · Calibration with Bias-Corrected Temperature Scaling Improves Domain Adaptation Under Label Shift in Modern Neural Networks
Label shift refers to the phenomenon where the marginal probability p(y)...

09/14/2022 · Distribution Calibration for Out-of-Domain Detection with Bayesian Approximation
Out-of-Domain (OOD) detection is a key component in a task-oriented dial...

04/30/2021 · Inference and model determination for Temperature-Driven non-linear Ecological Models
This paper is concerned with a contemporary Bayesian approach to the eff...

12/23/2020 · Testing whether a Learning Procedure is Calibrated
A learning procedure takes as input a dataset and performs inference for...
