CLUB: A Contrastive Log-ratio Upper Bound of Mutual Information

06/22/2020
by Pengyu Cheng, et al.

Mutual information (MI) minimization has gained considerable interest in various machine learning tasks. However, estimating and minimizing MI in high-dimensional spaces remains a challenging problem, especially when only samples, rather than distribution forms, are accessible. Previous works mainly focus on MI lower bound approximation, which is not applicable to MI minimization problems. In this paper, we propose a novel Contrastive Log-ratio Upper Bound (CLUB) of mutual information. We provide a theoretical analysis of the properties of CLUB and its variational approximation. Based on this upper bound, we introduce an accelerated MI minimization training scheme, which bridges MI minimization with negative sampling. Simulation studies on Gaussian and Bernoulli distributions show the reliable estimation ability of CLUB. Real-world MI minimization experiments, including domain adaptation and information bottleneck, further demonstrate the effectiveness of the proposed method.
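To make the contrastive construction concrete, the sketch below computes a sample-based CLUB value, I_CLUB = E_p(x,y)[log q(y|x)] - E_p(x)p(y)[log q(y|x)], for a Gaussian variational distribution q(y|x). The function name `club_estimate` and the interface (a precomputed conditional mean `mu` and a shared `log_var`) are illustrative assumptions, not the paper's implementation; in the paper, q is typically a neural network trained by maximizing log-likelihood on the paired samples.

```python
import numpy as np

def club_estimate(y, mu, log_var):
    """Sample-based CLUB value for a Gaussian q(y|x) = N(mu(x), exp(log_var)).

    pos averages log q(y_i|x_i) over matched (positive) pairs; neg averages
    log q(y_j|x_i) over all (i, j) pairs, approximating the expectation under
    the product of marginals (i.e. negative samples). The Gaussian
    normalization constants cancel between the two terms, so they are dropped.
    """
    var = np.exp(log_var)
    pos = -((y - mu) ** 2) / (2.0 * var)                          # (n, d)
    neg = -((y[None, :, :] - mu[:, None, :]) ** 2) / (2.0 * var)  # (n, n, d)
    return pos.sum(axis=-1).mean() - neg.sum(axis=-1).mean()

# Toy check: y = x + noise should yield a clearly positive estimate (CLUB
# upper-bounds the true MI), while independent x and y should land near zero.
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 1))
y = x + 0.3 * rng.normal(size=(500, 1))
club_dep = club_estimate(y, mu=x, log_var=np.log(0.09))  # q matches p(y|x)
y_ind = rng.normal(size=(500, 1))
club_ind = club_estimate(y_ind, mu=np.zeros_like(x), log_var=0.0)
```

Because the negative term ranges over all sample pairs, the estimator only needs the batch itself as its negative-sample pool, which is what connects MI minimization to negative sampling in the accelerated training scheme.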


Related Research

12/04/2018 · A Tight Upper Bound on Mutual Information
We derive a tight lower bound on equivocation (conditional entropy), or ...

05/27/2021 · Rethinking InfoNCE: How Many Negative Samples Do You Need?
InfoNCE loss is a widely used loss function for contrastive model traini...

06/18/2020 · Joint Contrastive Learning for Unsupervised Domain Adaptation
Enhancing feature transferability by matching marginal distributions has...

08/17/2022 · Disentangled Speaker Representation Learning via Mutual Information Minimization
Domain mismatch problem caused by speaker-unrelated feature has been a m...

05/08/2019 · Data-Efficient Mutual Information Neural Estimator
Measuring Mutual Information (MI) between high-dimensional, continuous, ...

07/02/2018 · Gaussian Signalling for Covert Communications
In this work, we examine the optimality of Gaussian signalling for cover...

04/28/2023 · Recognizable Information Bottleneck
Information Bottlenecks (IBs) learn representations that generalize to u...

Code Repositories

CLUB

Code for ICML2020 paper - CLUB: A Contrastive Log-ratio Upper Bound of Mutual Information


