Information Bottleneck on General Alphabets
We prove a source coding theorem that can probably be considered folklore, a generalization to arbitrary alphabets of a problem motivated by the Information Bottleneck method. For general random variables (Y, X), we show essentially that for some n ∈ ℕ, a function f with rate limit log‖f‖ ≤ nR and I(Y^n; f(X^n)) ≥ nS exists if and only if there is a discrete random variable U such that the Markov chain Y − X − U holds, I(U; X) ≤ R, and I(U; Y) ≥ S.
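As a small numerical illustration (not taken from the paper), the two quantities in the characterization, I(U; X) and I(U; Y), can be computed directly for a toy discrete Markov chain Y − X − U; the noise levels and binary-symmetric channels below are arbitrary choices. Any such U obeys the data processing inequality I(U; Y) ≤ I(U; X), which is why the rate bound R and the relevance bound S cannot be chosen independently:

```python
import numpy as np

def mutual_information(p_joint):
    """I between the row and column variables of a 2-D joint pmf, in bits."""
    p_row = p_joint.sum(axis=1, keepdims=True)
    p_col = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log2(p_joint[mask] / (p_row * p_col)[mask])).sum())

# Toy Markov chain Y - X - U (all variables binary, chosen for illustration):
# X ~ Uniform{0,1}; Y is X through a BSC(0.1); U is X through a BSC(0.2).
p_x = np.array([0.5, 0.5])
bsc = lambda eps: np.array([[1 - eps, eps], [eps, 1 - eps]])  # rows: input
p_y_given_x = bsc(0.1)
p_u_given_x = bsc(0.2)

# Joint pmfs induced by the chain.
p_xu = p_x[:, None] * p_u_given_x                         # p(x, u)
# Y and U are conditionally independent given X, so
# p(u, y) = sum_x p(x) p(y|x) p(u|x):
p_uy = np.einsum('x,xy,xu->uy', p_x, p_y_given_x, p_u_given_x)

I_UX = mutual_information(p_xu)
I_UY = mutual_information(p_uy)
print(f"I(U;X) = {I_UX:.4f} bits, I(U;Y) = {I_UY:.4f} bits")
# Data processing inequality along Y - X - U:
assert I_UY <= I_UX + 1e-12
```

For these channels, U sees Y through the concatenation of two binary symmetric channels (effective crossover 0.1·0.8 + 0.9·0.2 = 0.26), so I(U; Y) = 1 − H₂(0.26) is strictly smaller than I(U; X) = 1 − H₂(0.2), consistent with the inequality above.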