GILBO: One Metric to Measure Them All

02/13/2018
by Alexander A. Alemi, et al.

We propose a simple, tractable lower bound on the mutual information contained in the joint generative density of any latent variable generative model: the GILBO (Generative Information Lower BOund). It offers a data independent measure of the complexity of the learned latent variable description, giving the log of the effective description length. It is well-defined for both VAEs and GANs. We compute the GILBO for 800 GANs and VAEs trained on MNIST and discuss the results.
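To make the idea concrete, here is a minimal sketch of computing a GILBO-style bound on a toy linear-Gaussian generative model, where the true mutual information is known in closed form. The model (z ~ N(0,1), x = w·z + noise) and the Gaussian auxiliary encoder fitted by least squares are illustrative assumptions, not the paper's GAN/VAE setup; in practice the encoder would be a trained neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable generative model (assumed for illustration):
# z ~ N(0, 1), x = w * z + noise
w, noise_std = 2.0, 0.1
n = 100_000
z = rng.standard_normal(n)
x = w * z + noise_std * rng.standard_normal(n)

# Fit an auxiliary Gaussian "encoder" e(z | x) = N(a*x + b, s^2)
# on samples drawn from the generative joint p(z) p(x | z).
A = np.stack([x, np.ones(n)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, z, rcond=None)
resid = z - (a * x + b)
s2 = resid.var()

# GILBO = E_{p(z) p(x|z)}[ log e(z|x) - log p(z) ]   (in nats)
log_e = -0.5 * (np.log(2 * np.pi * s2) + resid**2 / s2)
log_p = -0.5 * (np.log(2 * np.pi) + z**2)
gilbo = (log_e - log_p).mean()

# For this linear-Gaussian model the true mutual information is
# available in closed form, so the bound can be checked directly.
true_mi = 0.5 * np.log(1 + (w / noise_std) ** 2)
print(f"GILBO  ~ {gilbo:.3f} nats")
print(f"I(X;Z) = {true_mi:.3f} nats")
```

Because the Gaussian encoder family contains the exact posterior of this toy model, the bound is essentially tight here (both values come out near 3.0 nats); with a misspecified or undertrained encoder the GILBO would fall strictly below the true mutual information.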
