On the Quadratic Decaying Property of the Information Rate Function

08/27/2022
by   Michael X. Cao, et al.

The quadratic decaying property of the information rate function states that, given a fixed conditional distribution p_𝖸|𝖷, the mutual information between the random variables 𝖷 and 𝖸 decreases at least quadratically in the distance as p_𝖷 moves away from the set of capacity-achieving input distributions. It is a fundamental property of the information rate function that is particularly useful in the study of higher-order asymptotics and finite-blocklength information theory, where it was first used by Strassen [1] and later, more explicitly, by Polyanskiy, Poor, and Verdú [2], [3]. Recently, while applying this result in our work, we were not able to close apparent gaps in either of these proofs. This motivates us to provide an alternative proof in this note.
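As an illustration of the property being discussed (not the paper's proof), here is a minimal numeric sketch for the binary symmetric channel BSC(ε), where the gap between capacity and the information rate can be checked directly: moving the input distribution a distance d away from the capacity-achieving uniform input shrinks the mutual information by an amount on the order of d². The function names below are our own choices for this sketch.

```python
import math

def h(q):
    """Binary entropy in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def bsc_mi(p, eps):
    """Mutual information I(X;Y) for a BSC(eps) with input Bernoulli(p)."""
    py1 = p * (1 - eps) + (1 - p) * eps  # P(Y = 1)
    return h(py1) - h(eps)               # H(Y) - H(Y|X)

eps = 0.1
C = bsc_mi(0.5, eps)  # capacity, achieved at the uniform input p = 0.5
for d in (0.1, 0.05, 0.025):
    gap = C - bsc_mi(0.5 - d, eps)
    # The ratio gap / d^2 stays bounded away from zero as d -> 0,
    # consistent with an at-least-quadratic decay of I in the distance d.
    print(d, gap / d**2)
```

For small d the ratio approaches (1 − 2ε)²·2/ln 2, i.e. half the magnitude of the second derivative of I at p = 1/2, which is the quadratic constant this channel exhibits.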


Related research

- The Mutual Information In The Vicinity of Capacity-Achieving Input Distributions (04/27/2023)
- A short proof of the Gács–Körner theorem (06/26/2023)
- A Class of Nonbinary Symmetric Information Bottleneck Problems (10/03/2021)
- Analysis of KNN Information Estimators for Smooth Distributions (10/27/2018)
- Beyond the Channel Capacity of BPSK Input (08/23/2019)
- Optimal Closeness Testing of Discrete Distributions Made (Complex) Simple (04/27/2022)
