
High Mutual Information in Representation Learning with Symmetric Variational Inference

10/04/2019
by   Micha Livne, et al.
Google
University of Toronto

We introduce the Mutual Information Machine (MIM), a novel formulation of representation learning that uses a joint distribution over observations and latent state in an encoder/decoder framework. Our key principles are symmetry and mutual information: symmetry encourages the encoder and decoder to learn different factorizations of the same underlying distribution, while mutual information encourages the learning of representations that are useful for downstream tasks. Our starting point is the symmetric Jensen-Shannon divergence between the encoding and decoding joint distributions, plus a regularizer that encourages high mutual information. We show that this objective can be bounded by a tractable cross-entropy loss between the true model and a parameterized approximation, and we relate this bound to the maximum-likelihood framework. We also relate MIM to variational autoencoders (VAEs) and demonstrate that MIM is capable of learning symmetric factorizations with high mutual information, avoiding posterior collapse.
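As a concrete sketch of the objective described above, the following LaTeX reconstruction uses notation in the style of the companion MIM paper; the anchor distributions P(x) and P(z), the sample mixture M_S, the model mixture M_theta, and the exact form of the regularizer R_MI are assumptions made here for illustration and should be checked against the full text.

% Encoding and decoding joint distributions, anchored by the data
% distribution P(x) and a latent prior P(z) (assumed notation):
\[
  q_S(x,z) = \mathcal{P}(x)\, q_\theta(z \mid x), \qquad
  p_S(x,z) = \mathcal{P}(z)\, p_\theta(x \mid z), \qquad
  \mathcal{M}_S = \tfrac{1}{2}\,\big(q_S + p_S\big).
\]
% Starting objective: the symmetric Jensen-Shannon divergence plus an
% entropy regularizer assumed to encourage high mutual information.
% By the identity JSD(a,b) = H((a+b)/2) - (H(a)+H(b))/2, the sum equals
% the entropy of the sample mixture:
\[
  D_{\mathrm{JS}}\big(q_S \,\|\, p_S\big)
  + \underbrace{\tfrac{1}{2}\big(H(q_S) + H(p_S)\big)}_{R_{\mathrm{MI}}}
  \;=\; H(\mathcal{M}_S).
\]
% Tractable bound: the mixture entropy is bounded by the cross entropy
% against a parameterized approximation M_theta (with learned joints
% q_theta(x,z) and p_theta(x,z)), which in turn is bounded, by
% concavity of the logarithm, by a symmetric cross-entropy loss:
\[
  H(\mathcal{M}_S)
  \;\le\; \mathrm{CE}\big(\mathcal{M}_S,\, \mathcal{M}_\theta\big)
  \;\le\; -\tfrac{1}{2}\,\mathbb{E}_{\mathcal{M}_S}\!\big[\log q_\theta(x,z) + \log p_\theta(x,z)\big],
  \qquad
  \mathcal{M}_\theta = \tfrac{1}{2}\big(q_\theta(x,z) + p_\theta(x,z)\big).
\]

Minimizing the right-hand side over samples from M_S gives a tractable training loss; the two inequalities (Gibbs' inequality and Jensen's inequality) are standard, while the specific choice of mixtures and regularizer above is an assumed reading of the abstract rather than a statement of the authors' exact derivation.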

