Generalized Information Bottleneck for Gaussian Variables

03/31/2023
by Vudtiwat Ngampruetikorn, et al.

The information bottleneck (IB) method offers an attractive framework for understanding representation learning; however, its applications are often limited by its computational intractability. Analytical characterization of the IB method is not only of practical interest, but it can also lead to new insights into learning phenomena. Here we consider a generalized IB problem, in which the mutual information in the original IB method is replaced by correlation measures based on Rényi and Jeffreys divergences. We derive an exact analytical IB solution for the case of Gaussian correlated variables. Our analysis reveals a series of structural transitions, similar to those previously observed in the original IB case. We find further that although solving the original, Rényi and Jeffreys IB problems yields different representations in general, the structural transitions occur at the same critical tradeoff parameters, and the Rényi and Jeffreys IB solutions perform well under the original IB objective. Our results suggest that formulating the IB method with alternative correlation measures could offer a strategy for obtaining an approximate solution to the original IB problem.
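For context, the objects the abstract refers to can be sketched as follows. The exact generalized objective is not stated in the abstract, so the forms below are only the standard definitions, with $\beta$ the tradeoff parameter. The original IB problem seeks a compressed representation $T$ of $X$ that retains information about $Y$:

\[
\min_{p(t\mid x)} \; I(X;T) - \beta\, I(T;Y),
\qquad
I(X;T) = D_{\mathrm{KL}}\!\big(p(x,t)\,\|\,p(x)\,p(t)\big).
\]

The Rényi and Jeffreys divergences that replace the Kullback-Leibler divergence in the generalized problem are, in their standard forms,

\[
D_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\,\log\!\int p(z)^{\alpha}\, q(z)^{1-\alpha}\, dz,
\qquad
D_J(p\,\|\,q) = D_{\mathrm{KL}}(p\,\|\,q) + D_{\mathrm{KL}}(q\,\|\,p),
\]

so the "correlation measures" mentioned above plausibly correspond to evaluating one of these divergences between the joint distribution and the product of its marginals, in place of the KL-based mutual information.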
