A Geometric Property of Relative Entropy and the Universal Threshold Phenomenon for Binary-Input Channels with Noisy State Information at the Encoder

by Shengtian Yang et al.

Tight lower and upper bounds on the ratio of relative entropies of two probability distributions with respect to a common third one are established, where the three distributions are collinear in the standard (n-1)-simplex. These bounds are leveraged to analyze the capacity of an arbitrary binary-input channel with noisy causal state information (provided by a side channel) at the encoder and perfect state information at the decoder, and in particular to determine the exact universal threshold on the noise measure of the side channel, above which the capacity is the same as that with no encoder side information.
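The geometric setup in the abstract can be illustrated numerically: take a reference distribution R and two distributions P and Q lying on a common line through R inside the probability simplex, and compute the ratio D(P‖R)/D(Q‖R). The following sketch is purely illustrative; the distributions, the direction vector, and the function names are assumptions for the example and are not taken from the paper.

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q) in nats; terms with p_i = 0 contribute 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Illustrative reference distribution r in the 2-simplex (not from the paper).
r = np.array([0.5, 0.3, 0.2])

# A direction whose components sum to zero, so r + t*d stays on the
# affine hull of the simplex; t is chosen small enough to keep all
# coordinates nonnegative.
d = np.array([0.3, -0.1, -0.2])

# p and q are collinear with r: both lie on the line x(t) = r + t*d.
p = r + 1.0 * d   # t = 1
q = r + 0.5 * d   # t = 0.5

# Ratio of relative entropies with respect to the common reference r.
ratio = kl(p, r) / kl(q, r)
print(ratio)
```

Since t ↦ D(x(t)‖R) is convex in t with its minimum value 0 at t = 0, the point farther along the line (here P, at t = 1) has the larger relative entropy, so the printed ratio exceeds 1; the paper's contribution is tight universal bounds on such ratios.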



