MIM: Mutual Information Machine

10/08/2019
by Micha Livne, et al.

We introduce the Mutual Information Machine (MIM), an autoencoder model for learning joint distributions over observations and latent states. The model formulation reflects two key design principles: 1) symmetry, to encourage the encoder and decoder to learn consistent factorizations of the same underlying distribution; and 2) mutual information, to encourage the learning of useful representations for downstream tasks. The objective comprises the Jensen-Shannon divergence between the encoding and decoding joint distributions, plus a mutual information term. We show that this objective can be bounded by a tractable cross-entropy loss between the true model and a parameterized approximation, and relate this to maximum likelihood estimation and variational autoencoders. Experiments show that MIM is capable of learning a latent representation with high mutual information, and good unsupervised clustering, while providing a data log-likelihood comparable to that of a VAE (with a sufficiently expressive architecture).
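To make the objective concrete, below is a minimal PyTorch sketch of a MIM-style symmetric loss evaluated on the encoding sample path only (the paper's full algorithm also draws samples through the decoding path and treats the marginal distributions more carefully). The module names, architecture, and dimensions here are hypothetical illustrations rather than the paper's implementation, and the intractable log P(x) anchor term is dropped as a constant.

```python
import torch
import torch.nn as nn
from torch.distributions import Normal

class CondGaussian(nn.Module):
    """Small MLP mapping an input to a diagonal Gaussian (hypothetical architecture)."""
    def __init__(self, in_dim, out_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * out_dim),
        )

    def forward(self, v):
        mu, log_std = self.net(v).chunk(2, dim=-1)
        return Normal(mu, log_std.exp())

def mim_loss(encoder, decoder, prior, x):
    """One-sample estimate of a MIM-style symmetric loss on the encoding path:
    -1/2 E_{x~P(x), z~q(z|x)} [ log p(x|z) + log p(z) + log q(z|x) ],
    with the intractable log P(x) anchor omitted as a constant.
    """
    q_zx = encoder(x)                   # encoding distribution q(z|x)
    z = q_zx.rsample()                  # reparameterized latent sample
    log_dec = decoder(z).log_prob(x).sum(-1) + prior.log_prob(z).sum(-1)  # log p(x|z)p(z)
    log_enc = q_zx.log_prob(z).sum(-1)  # log q(z|x); rewarding this penalizes encoder entropy
    return -0.5 * (log_dec + log_enc).mean()

x_dim, z_dim = 20, 4
encoder = CondGaussian(x_dim, z_dim)
decoder = CondGaussian(z_dim, x_dim)
prior = Normal(torch.zeros(z_dim), torch.ones(z_dim))

x = torch.randn(64, x_dim)              # stand-in for a data minibatch
loss = mim_loss(encoder, decoder, prior, x)
loss.backward()
```

Note how, relative to the VAE's ELBO, the log q(z|x) term enters with the opposite sign: instead of rewarding encoder entropy, the loss penalizes it, which is what encourages high mutual information between observations and latents.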


Related research

10/04/2019
High Mutual Information in Representation Learning with Symmetric Variational Inference
We introduce the Mutual Information Machine (MIM), a novel formulation o...

05/28/2020
VMI-VAE: Variational Mutual Information Maximization Framework for VAE With Discrete and Continuous Priors
Variational Autoencoder is a scalable method for learning latent variabl...

04/08/2020
Learning Discrete Structured Representations by Adversarially Maximizing Mutual Information
We propose learning discrete structured representations from unlabeled d...

12/01/2018
Learning Speaker Representations with Mutual Information
Learning good representations is of crucial importance in deep learning....

05/23/2023
Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning
A new trend in deep learning, represented by Mutual Information Neural E...

07/06/2021
InfoNCE is a variational autoencoder
We show that a popular self-supervised learning method, InfoNCE, is a sp...

05/14/2022
MIND: Maximum Mutual Information Based Neural Decoder
We are witnessing a growing interest in the development of learning ar...
