Variational Encoders and Autoencoders : Information-theoretic Inference and Closed-form Solutions

01/27/2021
by   Karthik Duraisamy, et al.

This work develops problem statements related to encoders and autoencoders with the goal of elucidating variational formulations and establishing clear connections to information-theoretic concepts. Specifically, four problems with varying levels of input are considered: a) the data, likelihood, and prior distributions are given; b) the data and likelihood are given; c) the data and prior are given; d) the data and the dimensionality of the parameters are specified. The first two problems seek encoders (i.e. the posterior), and the latter two seek autoencoders (i.e. the posterior and the likelihood). A variational Bayesian setting is pursued, and detailed derivations are provided for the resulting optimization problems. Following this, a linear Gaussian setting is adopted, and closed-form solutions are derived. Numerical experiments are also performed to verify expected behavior and assess convergence properties. Explicit connections are made to rate-distortion theory and information bottleneck theory, and the related concept of sufficient statistics is also explored. One motivation of this work is to present the theory and learning dynamics associated with variational inference and autoencoders, and to expose information-theoretic concepts from a computational science perspective.
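To make the linear Gaussian setting concrete, the sketch below computes the closed-form Gaussian posterior (the "encoder") for a hypothetical model x = W z + ε with prior z ~ N(0, I) and noise ε ~ N(0, σ²I); the dimensions, W, and σ² here are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical linear Gaussian model (illustrative, not the paper's exact setup):
#   likelihood: x = W z + eps,  eps ~ N(0, sigma2 * I)
#   prior:      z ~ N(0, I)
rng = np.random.default_rng(0)
d_x, d_z, sigma2 = 5, 2, 0.1
W = rng.standard_normal((d_x, d_z))
z_true = rng.standard_normal(d_z)
x = W @ z_true + np.sqrt(sigma2) * rng.standard_normal(d_x)

# Closed-form Gaussian posterior p(z | x), i.e. the optimal "encoder":
#   Sigma_post = (I + W^T W / sigma2)^{-1}
#   mu_post    = Sigma_post W^T x / sigma2
Sigma_post = np.linalg.inv(np.eye(d_z) + W.T @ W / sigma2)
mu_post = Sigma_post @ W.T @ x / sigma2
```

In this conjugate setting no iterative variational optimization is needed: the posterior mean and covariance follow directly from standard Gaussian conditioning, which is the kind of closed-form solution the abstract refers to.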



research
01/31/2020

On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views

This tutorial paper focuses on the variants of the bottleneck problem ta...
research
12/21/2019

Closed Form Variances for Variational Auto-Encoders

We propose a reformulation of Variational Auto-Encoders eliminating half...
research
10/23/2019

Variational Predictive Information Bottleneck

In classic papers, Zellner demonstrated that Bayesian inference could be...
research
09/30/2019

Tightening Bounds for Variational Inference by Revisiting Perturbation Theory

Variational inference has become one of the most widely used methods in ...
research
02/13/2018

The Birthday Problem and Zero-Error List Codes

As an attempt to bridge the gap between classical information theory and...
research
03/30/2018

Understanding Autoencoders with Information Theoretic Concepts

Despite their great success in practical applications, there is still a ...
research
05/15/2020

On the Information Plane of Autoencoders

The training dynamics of hidden layers in deep learning are poorly under...
