Likelihood Assignment for Out-of-Distribution Inputs in Deep Generative Models is Sensitive to Prior Distribution Choice

11/15/2019
by Ryo Kamoi, et al.

Recent work has shown that deep generative models assign higher likelihood to out-of-distribution inputs than to training data. We show that a factor underlying this phenomenon is a mismatch between the nature of the prior distribution and that of the data distribution, a problem found in widely used deep generative models such as VAEs and Glow. While a typical choice for a prior distribution is a standard Gaussian distribution, properties of distributions of real data sets may not be consistent with a unimodal prior distribution. This paper focuses on the relationship between the choice of a prior distribution and the likelihoods assigned to out-of-distribution inputs. We propose the use of a mixture distribution as a prior to make likelihoods assigned by deep generative models sensitive to out-of-distribution inputs. Furthermore, we explain the theoretical advantages of adopting a mixture distribution as the prior, and we present experimental results to support our claims. Finally, we demonstrate that a mixture prior lowers the out-of-distribution likelihood with respect to two pairs of real image data sets: Fashion-MNIST vs. MNIST and CIFAR10 vs. SVHN.
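The core idea above is that a unimodal standard Gaussian prior places high density between latent clusters, whereas a mixture prior assigns low density to latent codes that fall between its modes. The following is a minimal NumPy sketch (not the authors' implementation; the latent dimension, mode locations, and weights are illustrative assumptions) comparing the log density of a point between two modes under a standard Gaussian prior and under a two-component Gaussian mixture prior:

```python
import numpy as np

def log_gaussian(z, mu, sigma):
    # Log density of an isotropic Gaussian N(mu, sigma^2 I).
    d = z.shape[-1]
    return (-0.5 * d * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum((z - mu) ** 2, axis=-1) / sigma**2)

def log_mixture(z, mus, sigma, weights):
    # Log density of a Gaussian mixture, computed with log-sum-exp
    # for numerical stability.
    comps = np.stack([np.log(w) + log_gaussian(z, mu, sigma)
                      for w, mu in zip(weights, mus)], axis=0)
    m = comps.max(axis=0)
    return m + np.log(np.exp(comps - m).sum(axis=0))

# Two well-separated modes in a 2-D latent space (illustrative values).
mus = [np.array([-3.0, 0.0]), np.array([3.0, 0.0])]
weights = [0.5, 0.5]
between = np.array([[0.0, 0.0]])   # midpoint between the modes
on_mode = np.array([[3.0, 0.0]])   # directly on one mode

print(log_gaussian(between, np.zeros(2), 1.0))        # high: standard prior peaks here
print(log_mixture(between, mus, 1.0, weights))        # low: mixture has a density gap here
print(log_mixture(on_mode, mus, 1.0, weights))        # high: near a mixture mode
```

Under the standard Gaussian prior the midpoint sits at the density peak, while under the mixture prior the same point falls in a low-density gap between modes, which is the mechanism the paper exploits to lower likelihoods on out-of-distribution inputs.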


Related research

- Detecting Out-of-Distribution Inputs to Deep Generative Models Using a Test for Typicality (06/07/2019)
- Understanding Failures in Out-of-Distribution Detection with Deep Generative Models (07/14/2021)
- Deep Generative Models Strike Back! Improving Understanding and Evaluation in Light of Unmet Expectations for OoD Data (11/12/2019)
- Bayesian Autoencoders: Analysing and Fixing the Bernoulli likelihood for Out-of-Distribution Detection (07/28/2021)
- On the Generative Utility of Cyclic Conditionals (06/30/2021)
- A Factorial Mixture Prior for Compositional Deep Generative Models (12/18/2018)
- Entropic Issues in Likelihood-Based OOD Detection (09/22/2021)
