Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions

05/10/2021
by Bertrand Charpentier, et al.

Uncertainty awareness is crucial for developing reliable machine learning models. In this work, we propose the Natural Posterior Network (NatPN) for fast and high-quality uncertainty estimation on any task where the target distribution belongs to the exponential family; NatPN therefore applies to classification as well as general regression settings. Unlike many previous approaches, NatPN does not require out-of-distribution (OOD) data at training time. Instead, it leverages Normalizing Flows to fit a single density on a learned low-dimensional, task-dependent latent space. For any input sample, NatPN uses the predicted likelihood to perform a Bayesian update over the target distribution. Theoretically, NatPN assigns high uncertainty far away from the training data. Empirically, our extensive experiments on calibration and OOD detection show that NatPN delivers highly competitive performance on classification, regression, and count prediction tasks.
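The mechanism sketched in the abstract is: an encoder maps the input to a low-dimensional latent vector, a normalizing flow evaluates the density of that vector, and the density scales the evidence used in a conjugate Bayesian update of the target distribution's parameters. The following is a minimal sketch of such an input-dependent update for a Dirichlet (classification) target. It is an illustration under assumptions, not the authors' implementation: the encoder and flow are replaced by placeholder tensors, and names such as natpn_update and certainty_budget are hypothetical.

```python
import torch

def natpn_update(chi_prior, n_prior, chi_pred, log_density, certainty_budget):
    """Input-dependent Bayesian update of conjugate-prior parameters.

    Simplified sketch of the update described in the abstract: the evidence
    contributed by a sample is proportional to the latent density predicted
    by the normalizing flow, so inputs far from the training data contribute
    little evidence and the posterior stays close to the flat prior.
    """
    n_pred = certainty_budget * log_density.exp()                   # pseudo-count from density
    n_post = n_prior + n_pred                                       # total evidence
    chi_post = (n_prior * chi_prior + n_pred * chi_pred) / n_post   # evidence-weighted mix
    return chi_post, n_post

# Toy classification example with a Dirichlet target. The predicted
# probabilities, flow log-density and certainty budget are placeholder values.
num_classes = 3
chi_prior = torch.full((num_classes,), 1.0 / num_classes)   # uniform prior class probabilities
n_prior = torch.tensor(float(num_classes))                  # prior evidence (pseudo-count)

chi_pred = torch.softmax(torch.tensor([2.0, 0.1, -1.0]), dim=0)  # per-sample predicted probabilities
log_density = torch.tensor(-1.5)                                  # stand-in for the flow's log p(z)

chi_post, n_post = natpn_update(chi_prior, n_prior, chi_pred, log_density,
                                certainty_budget=100.0)
alpha_post = chi_post * n_post   # Dirichlet concentration parameters
print(alpha_post, n_post)
```

In this sketch, a low latent density makes the sample's pseudo-count vanish, so the posterior concentration parameters revert to the flat prior and predictive uncertainty stays high, matching the behaviour the abstract claims far from the training data.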


Related research

06/16/2020 · Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts
Accurate estimation of aleatoric and epistemic uncertainty is crucial to...

03/06/2020 · Dropout Strikes Back: Improved Uncertainty Estimation via Diversity Sampled Implicit Ensembles
Modern machine learning models usually do not extrapolate well, i.e., th...

05/26/2020 · Improving Regression Uncertainty Estimates with an Empirical Prior
While machine learning models capable of producing uncertainty estimates...

07/17/2020 · Learning Posterior and Prior for Uncertainty Modeling in Person Re-Identification
Data uncertainty in practical person reID is ubiquitous, hence it requir...

06/17/2020 · Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness
Bayesian neural networks (BNN) and deep ensembles are principled approac...

03/18/2019 · Approximating exponential family models (not single distributions) with a two-network architecture
Recently much attention has been paid to deep generative models, since t...

10/06/2020 · Fixing Asymptotic Uncertainty of Bayesian Neural Networks with Infinite ReLU Features
Approximate Bayesian methods can mitigate overconfidence in ReLU network...