Note on the equivalence of hierarchical variational models and auxiliary deep generative models

03/08/2016
by Niko Brümmer, et al.

This note compares two recently published machine learning methods for constructing flexible but tractable families of variational hidden-variable posteriors. The first method, called "hierarchical variational models", enriches the inference model with an extra variable, while the second, called "auxiliary deep generative models", enriches the generative model instead. We conclude that the two methods are mathematically equivalent.
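
As a brief sketch of why the two constructions lead to the same objective (the notation below is generic and chosen here for illustration; it need not match the note's own symbols), compare the two evidence lower bounds for data x, latent variable z, and auxiliary variable a:

% Hierarchical variational models: enrich the variational posterior,
%   q(z|x) = \int q(z|a,x) q(a|x) da,
% and introduce an auxiliary inference model r(a|z,x) to obtain the bound
\begin{align*}
\mathcal{L}_{\mathrm{HVM}}
  &= \mathbb{E}_{q(a,z\mid x)}\!\left[
       \log \frac{p(x,z)\, r(a\mid z,x)}{q(a\mid x)\, q(z\mid a,x)}
     \right].
\end{align*}

% Auxiliary deep generative models: enrich the generative model,
%   p(x,z,a) = p(x,z)\, p(a|x,z),
% which leaves the marginal p(x,z) unchanged, and apply the standard ELBO
% with the joint variational posterior q(a,z|x) = q(a|x) q(z|a,x):
\begin{align*}
\mathcal{L}_{\mathrm{ADGM}}
  &= \mathbb{E}_{q(a,z\mid x)}\!\left[
       \log \frac{p(x,z)\, p(a\mid x,z)}{q(a\mid x)\, q(z\mid a,x)}
     \right].
\end{align*}

% Identifying r(a|z,x) with p(a|x,z) makes the two bounds coincide,
% which is the sense in which the two constructions are equivalent.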


