A Class of Nonbinary Symmetric Information Bottleneck Problems

10/03/2021, by Michael Dikshtein, et al.
We study two dual settings of information processing. Let 𝖸→𝖷→𝖢 be a Markov chain with a fixed joint probability mass function 𝖯_𝖷𝖸 and a mutual information constraint on the pair (𝖢,𝖷). In the first problem, known as the Information Bottleneck, we aim to maximize the mutual information between the random variables 𝖸 and 𝖢, while in the second, termed the Privacy Funnel, our goal is to minimize it. In particular, we analyze the scenario in which 𝖷 is the input and 𝖸 is the output of a modulo-additive noise channel. We provide an analytical characterization of the optimal information rates and the distributions that achieve them.
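As a minimal numerical sketch of the setup (not from the paper: the alphabet size, noise distribution, and candidate mapping below are illustrative assumptions), one can evaluate the two competing objectives for a modulo-additive channel 𝖸 = (𝖷 + 𝖭) mod q and a stochastic mapping from 𝖷 to a compressed variable playing the role of 𝖢:

```python
from math import log2

# Illustrative toy parameters (assumptions, not the paper's construction)
q = 4                                  # alphabet size of X and Y
p_x = [0.25] * q                       # uniform input distribution
p_n = [0.7, 0.1, 0.1, 0.1]             # additive noise pmf on Z_q

# Joint pmf P_XY induced by the modulo-additive channel Y = (X + N) mod q
p_xy = [[p_x[x] * p_n[(y - x) % q] for y in range(q)] for x in range(q)]

# A candidate stochastic mapping p(c|x) with a binary compressed variable C
p_c_given_x = [[0.9, 0.1],
               [0.9, 0.1],
               [0.1, 0.9],
               [0.1, 0.9]]

def mutual_information(p_ab):
    """I(A;B) in bits for a joint pmf given as a nested list."""
    p_a = [sum(row) for row in p_ab]
    p_b = [sum(col) for col in zip(*p_ab)]
    return sum(p * log2(p / (p_a[i] * p_b[j]))
               for i, row in enumerate(p_ab)
               for j, p in enumerate(row) if p > 0)

# Markov chain Y - X - C: p(x,c) = p(x) p(c|x), p(y,c) = sum_x p(x,y) p(c|x)
p_xc = [[p_x[x] * p_c_given_x[x][c] for c in range(2)] for x in range(q)]
p_yc = [[sum(p_xy[x][y] * p_c_given_x[x][c] for x in range(q))
         for c in range(2)] for y in range(q)]

i_xc = mutual_information(p_xc)   # the rate constraint I(C;X)
i_yc = mutual_information(p_yc)   # IB maximizes this; Privacy Funnel minimizes it
print(f"I(C;X) = {i_xc:.4f} bits, I(Y;C) = {i_yc:.4f} bits")
```

By the data processing inequality for the chain 𝖸→𝖷→𝖢, any such mapping satisfies I(𝖸;𝖢) ≤ I(𝖢;𝖷); the Information Bottleneck searches over mappings to push I(𝖸;𝖢) up against the rate constraint, while the Privacy Funnel pushes it down.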

