Chain rules for one-shot entropic quantities via operational methods

05/15/2023
by   Sayantan Chakraborty, et al.

We introduce a new operational technique for deriving chain rules for general information-theoretic quantities. This technique differs substantially from popular (and in some cases fairly involved) methods such as SDP formulations, operator algebra, and norm interpolation. Instead, our framework considers a simple information transmission task and derives lower and upper bounds for it. The lower bounds are obtained by leveraging a successive-cancellation encoding and decoding technique. Pitting the upper and lower bounds against each other yields the desired chain rule. As a demonstration of this technique, we derive chain rules for the smooth max mutual information and the smooth hypothesis-testing mutual information.
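For context, the asymptotic (von Neumann) mutual information obeys an exact chain rule, and the one-shot quantities treated here satisfy analogues of it only up to smoothing and error terms. The sketch below shows the exact asymptotic identity and, purely as an illustrative shape (not the paper's precise statement, whose smoothing parameters and correction terms are fixed by its theorems), the kind of approximate one-shot relation being derived:

```latex
% Exact chain rule for the (quantum) mutual information:
I(A : BC) \;=\; I(A : B) \;+\; I(A : C \mid B)

% Schematic one-shot analogue for a smooth quantity Q (illustrative only;
% the exact smoothing parameters and additive error are paper-specific):
Q^{\varepsilon}(A : BC) \;\lesssim\; Q^{\varepsilon'}(A : B)
  \;+\; Q^{\varepsilon'}(A : C \mid B)
  \;+\; O\!\left(\log \tfrac{1}{\varepsilon}\right)
```

The operational route establishes such inequalities by bounding the same transmission task two ways: an achievability (coding) argument gives one side, a converse gives the other, and comparing them produces the chain rule.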


