Information Bottleneck on General Alphabets

01/03/2018
by Georg Pichler, et al.

We prove a source coding theorem that can probably be considered folklore, a generalization to arbitrary alphabets of a problem motivated by the Information Bottleneck method. For general random variables (Y, X), we show essentially that for some n ∈ ℕ, a function f with rate limit log‖f‖ ≤ nR and I(Y^n; f(X^n)) ≥ nS exists if and only if there is a discrete random variable U such that the Markov chain Y − X − U holds, I(U; X) ≤ R and I(U; Y) ≥ S.
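The quantities in the theorem can be made concrete on a toy discrete example. The sketch below (illustrative only, not from the paper) fixes a joint source p(x, y) and an encoder channel p(u | x); the Markov chain Y − X − U then holds by construction, and the two mutual informations I(U; X) and I(U; Y) that are compared against the rate R and the relevance S can be evaluated directly. The specific distributions are arbitrary choices for illustration.

```python
# Illustrative sketch (toy example, not from the paper): evaluate
# I(U; X) and I(U; Y) for a discrete source p(x, y) and an encoder
# channel p(u | x), so that the Markov chain Y - X - U holds.
import numpy as np

def mutual_information(p_joint):
    """I(A; B) in bits for a joint distribution p_joint[a, b]."""
    p_a = p_joint.sum(axis=1, keepdims=True)   # marginal of A, shape (|A|, 1)
    p_b = p_joint.sum(axis=0, keepdims=True)   # marginal of B, shape (1, |B|)
    mask = p_joint > 0                         # skip zero-probability terms
    return float((p_joint[mask]
                  * np.log2(p_joint[mask] / (p_a @ p_b)[mask])).sum())

# Toy joint source p(x, y): binary X correlated with binary Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# Encoder channel p(u | x), rows indexed by x: a noisy binary
# symmetric quantization of X (crossover probability 0.1).
p_u_given_x = np.array([[0.9, 0.1],
                        [0.1, 0.9]])

p_x = p_xy.sum(axis=1)            # marginal of X
p_ux = p_u_given_x.T * p_x        # p(u, x) = p(u | x) p(x)
p_uy = p_u_given_x.T @ p_xy       # p(u, y) = sum_x p(u | x) p(x, y), using Y - X - U

I_ux = mutual_information(p_ux)   # compared against the rate R
I_uy = mutual_information(p_uy)   # compared against the relevance S
print(f"I(U;X) = {I_ux:.4f} bits, I(U;Y) = {I_uy:.4f} bits")
```

By the data processing inequality for the chain Y − X − U, the printed values satisfy I(U; Y) ≤ I(U; X), matching the intuition that the bottleneck variable U can carry at most as much information about Y as it carries about X.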


