Variational Autoencoders and Nonlinear ICA: A Unifying Framework

07/10/2019
by Ilyes Khemakhem, et al.

The framework of variational autoencoders allows us to efficiently learn deep latent-variable models, such that the model's marginal distribution over observed variables fits the data. Often, we're interested in going a step further, and want to approximate the true joint distribution over observed and latent variables, including the true prior and posterior distributions over latent variables. This is known to be generally impossible due to unidentifiability of the model. We address this issue by showing that for a broad family of deep latent-variable models, identification of the true joint distribution over observed and latent variables is actually possible up to a simple transformation, thus achieving a principled and powerful form of disentanglement. Our result requires a factorized prior distribution over the latent variables that is conditioned on an additionally observed variable, such as a class label or almost any other observation. We build on recent developments in nonlinear ICA, which we extend to the case with noisy, undercomplete or discrete observations, integrated in a maximum likelihood framework. The result also trivially contains identifiable flow-based generative models as a special case.
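The key modeling change described above is replacing the VAE's fixed standard-normal prior with a factorized prior p(z|u) whose parameters depend on an additionally observed variable u. The sketch below illustrates, under assumptions not taken from the paper, the resulting KL term of the ELBO for diagonal Gaussians: the hypothetical `conditional_prior_params` (a simple linear map from a scalar u) stands in for whatever conditioning network one actually uses.

```python
import math

def conditional_prior_params(u, w_mean=0.5, w_logvar=-0.3):
    # Hypothetical linear dependence of the prior's parameters on the observed
    # auxiliary variable u (e.g. a class label). The point is only that the
    # factorized prior p(z|u) is conditioned on u, as the abstract requires.
    return w_mean * u, w_logvar * u

def kl_diag_gaussians(m_q, logvar_q, m_p, logvar_p):
    # Closed-form KL(q || p) for one dimension of two diagonal Gaussians.
    var_q, var_p = math.exp(logvar_q), math.exp(logvar_p)
    return 0.5 * (logvar_p - logvar_q + (var_q + (m_q - m_p) ** 2) / var_p - 1.0)

def elbo_kl_term(q_means, q_logvars, u):
    # KL(q(z|x,u) || p(z|u)) summed over the factorized latent dimensions;
    # this replaces the standard-normal KL term of a vanilla VAE's ELBO.
    total = 0.0
    for m_q, lv_q in zip(q_means, q_logvars):
        m_p, lv_p = conditional_prior_params(u)
        total += kl_diag_gaussians(m_q, lv_q, m_p, lv_p)
    return total
```

In a full implementation the prior's parameters would come from a learned network of u, and the reconstruction term of the ELBO is unchanged; only this KL term differs from the unconditional case.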


