Decomposed Mutual Information Estimation for Contrastive Representation Learning

06/25/2021
by Alessandro Sordoni et al.

Recent contrastive representation learning methods rely on estimating mutual information (MI) between multiple views of an underlying context. For example, we can derive multiple views of a given image by applying data augmentation, or we can split a sequence into views comprising the past and future of some step in the sequence. Contrastive lower bounds on MI are easy to optimize, but they have a strong underestimation bias when the MI to be estimated is large. We propose decomposing the full MI estimation problem into a sum of smaller estimation problems by splitting one of the views into progressively more informed subviews and applying the chain rule for MI to the decomposed views. The resulting expression is a sum of unconditional and conditional MI terms, each measuring a modest chunk of the total MI, which facilitates approximation via contrastive bounds. To maximize the sum, we formulate a contrastive lower bound on the conditional MI that can be approximated efficiently. We refer to our general approach as Decomposed Estimation of Mutual Information (DEMI). We show that DEMI can capture a larger amount of MI than standard non-decomposed contrastive bounds in a synthetic setting, and that it learns better representations in a vision domain and for dialogue generation.
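The chain-rule identity at the heart of this decomposition is compact enough to state directly. The following is an illustrative rendering in notation chosen here (the paper's own symbols may differ): splitting a view Y of context X into subviews Y^1, ..., Y^K gives

```latex
\[
  I(X; Y) \;=\; I\!\left(X; Y^{1}\right)
  \;+\; \sum_{k=2}^{K} I\!\left(X; Y^{k} \,\middle|\, Y^{1}, \dots, Y^{k-1}\right)
\]
```

Each term on the right measures only a fraction of the total MI and can therefore be approximated with a contrastive bound that is less prone to the underestimation bias described above. As a rough illustration of what such a bound looks like in code, here is a minimal PyTorch sketch of an InfoNCE-style lower bound; the [N, N] score matrix, the critic producing it, and the batch layout are assumptions made for this sketch, not the paper's exact estimator (in particular, DEMI's conditional bound uses a critic that also consumes the earlier subviews).

```python
import math

import torch
import torch.nn.functional as F

def infonce_bound(scores: torch.Tensor) -> torch.Tensor:
    """InfoNCE-style lower bound on MI from an [N, N] critic score matrix.

    scores[i, j] is a critic value for the pair (x_i, y_j): diagonal
    entries are positive pairs, off-diagonal entries act as negatives.
    Returns E[log N * softmax(scores_i)[i]], a lower bound on I(X; Y)
    that saturates at log N -- the motivation for decomposing large MI.
    """
    n = scores.size(0)
    labels = torch.arange(n, device=scores.device)
    # Cross-entropy with "target = own index" is the InfoNCE objective;
    # adding log N turns the classification loss into an MI lower bound.
    return math.log(n) - F.cross_entropy(scores, labels)

# Illustration only: a random [8, 8] score matrix stands in for a critic.
print(infonce_bound(torch.randn(8, 8)))
```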


