Detecting Out-of-Distribution Inputs to Deep Generative Models Using a Test for Typicality

06/07/2019 · by Eric Nalisnick et al.

Recent work has shown that deep generative models can assign higher likelihood to out-of-distribution data sets than to their training data. We posit that this phenomenon is caused by a mismatch between the model's typical set and its areas of high probability density. In-distribution inputs should reside in the former but not necessarily in the latter, as previous work has presumed. To determine whether or not inputs reside in the typical set, we propose a statistically principled, easy-to-implement test using the empirical distribution of model likelihoods. The test is model agnostic and widely applicable, only requiring that the likelihood can be computed or closely approximated. We report experiments showing that our procedure can successfully detect the out-of-distribution sets in several of the challenging cases reported by Nalisnick et al. (2019).
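The test described above can be sketched with a toy model. The sketch below stands in a 1-D Gaussian for the deep generative model, estimates the model's entropy from held-out in-distribution likelihoods, and bootstraps a threshold for the distance between a batch's average negative log-likelihood and that entropy estimate. The batch size, significance level, and bootstrap count here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a deep generative model: a 1-D Gaussian fit to training data.
train = rng.normal(0.0, 1.0, size=10_000)
mu, sigma = train.mean(), train.std()

def nll(x):
    # Negative log-likelihood of x under the fitted Gaussian.
    return 0.5 * np.log(2 * np.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

# Entropy estimate: average NLL over held-out in-distribution data.
val = rng.normal(0.0, 1.0, size=10_000)
H_hat = nll(val).mean()

def typicality_stat(batch):
    # Distance between the batch's average NLL and the entropy estimate.
    return abs(nll(batch).mean() - H_hat)

# Bootstrap the statistic's null distribution from in-distribution batches.
M = 25  # batch size (an assumption; the paper evaluates several sizes)
boot = np.array([
    typicality_stat(rng.choice(val, size=M, replace=False))
    for _ in range(2_000)
])
threshold = np.quantile(boot, 0.99)  # reject typicality at alpha = 0.01

in_batch = rng.normal(0.0, 1.0, size=M)   # in-distribution batch
ood_batch = rng.normal(4.0, 1.0, size=M)  # shifted, out-of-distribution batch

print("in-dist stat:", typicality_stat(in_batch), "threshold:", threshold)
print("OOD stat:", typicality_stat(ood_batch), "threshold:", threshold)
```

A batch whose statistic exceeds the threshold is flagged as outside the model's typical set; note that the OOD batch here has a far larger statistic even though, for some models, such inputs can receive *higher* density than training data.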


Related research

- 11/15/2019 · Likelihood Assignment for Out-of-Distribution Inputs in Deep Generative Models is Sensitive to Prior Distribution Choice ("Recent work has shown that deep generative models assign higher likeliho...")
- 09/25/2019 · Input complexity and out-of-distribution detection with likelihood-based generative models ("Likelihood-based generative models are a promising resource to detect ou...")
- 10/22/2018 · Do Deep Generative Models Know What They Don't Know? ("A neural network deployed in the wild may be asked to make predictions f...")
- 11/12/2019 · Deep Generative Models Strike Back! Improving Understanding and Evaluation in Light of Unmet Expectations for OoD Data ("Advances in deep generative and density models have shown impressive cap...")
- 02/25/2023 · Data-Copying in Generative Models: A Formal Framework ("There has been some recent interest in detecting and addressing memoriza...")
- 02/16/2021 · Hierarchical VAEs Know What They Don't Know ("Deep generative models have shown themselves to be state-of-the-art dens...")
- 08/12/2021 · DOI: Divergence-based Out-of-Distribution Indicators via Deep Generative Models ("To ensure robust and reliable classification results, OoD (out-of-distri...")
