Directed Data-Processing Inequalities for Systems with Feedback

03/25/2021
by Milan S. Derpich, et al.

We present novel data-processing inequalities relating the mutual information and the directed information in systems with feedback. The internal blocks within such systems are required only to be causal mappings, but are allowed to be non-linear, stochastic and time-varying. These blocks can, for example, represent source encoders, decoders, or even communication channels. Moreover, the involved signals can be arbitrarily distributed. Our first main result relates mutual and directed information and can be interpreted as a law of conservation of information flow. Our second main result is a pair of data-processing inequalities (one the conditional version of the other) between nested pairs of random sequences entirely within the closed loop. Our third main result introduces and characterizes the notion of in-the-loop (ITL) transmission rate for channel coding scenarios in which the messages are internal to the loop. Interestingly, in this case the conventional notion of transmission rate, associated with the entropy of the messages, and the conventional notion of channel capacity, based on maximizing the mutual information between the messages and the output, turn out to be inadequate. Instead, as we show, the ITL transmission rate is the unique notion of rate for which a channel code attains zero error probability if and only if such ITL rate does not exceed the corresponding directed information rate from messages to decoded messages. We apply our data-processing inequalities to show that the supremum of achievable (in the usual channel coding sense) ITL transmission rates is upper bounded by the supremum of the directed information rate across the communication channel. Moreover, we present an example in which this upper bound is attained. Finally, ...
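The gap between mutual and directed information that motivates this line of work can be seen on a toy system. The following sketch (our own illustrative construction, not the paper's setup) enumerates the joint pmf of a two-step binary symmetric channel with unit-delay feedback, where the second input simply echoes the first output, and compares I(X^2; Y^2) with Massey's directed information I(X^2 → Y^2) = I(X1; Y1) + I(X1, X2; Y2 | Y1). With feedback present, the directed information is strictly smaller than the mutual information; their difference is the reverse information flow, consistent with a conservation-of-information-flow identity.

```python
import itertools
import math
from collections import defaultdict

p = 0.1  # assumed crossover probability of each BSC use (toy value)

# Joint pmf of (X1, X2, Y1, Y2) for a 2-step feedback system:
# X1 ~ Bern(1/2), Y1 = X1 xor Z1, X2 = Y1 (unit-delay feedback),
# Y2 = X2 xor Z2, with Z1, Z2 i.i.d. Bern(p).
pmf = defaultdict(float)
for x1, z1, z2 in itertools.product([0, 1], repeat=3):
    y1 = x1 ^ z1
    x2 = y1          # the encoder echoes the fed-back channel output
    y2 = x2 ^ z2
    prob = 0.5 * (p if z1 else 1 - p) * (p if z2 else 1 - p)
    pmf[(x1, x2, y1, y2)] += prob

def H(indices):
    """Joint entropy in bits of the variables selected by indices
    (0=X1, 1=X2, 2=Y1, 3=Y2)."""
    marg = defaultdict(float)
    for outcome, prob in pmf.items():
        marg[tuple(outcome[i] for i in indices)] += prob
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

# Mutual information I(X^2; Y^2) = H(X^2) + H(Y^2) - H(X^2, Y^2)
mi = H([0, 1]) + H([2, 3]) - H([0, 1, 2, 3])

# Directed information I(X^2 -> Y^2)
#   = I(X1; Y1) + I(X1, X2; Y2 | Y1)
di = (H([0]) + H([2]) - H([0, 2])) \
   + (H([0, 1, 2]) + H([2, 3]) - H([2]) - H([0, 1, 2, 3]))

print(f"I(X^2; Y^2)   = {mi:.4f} bits")   # -> 1.0000
print(f"I(X^2 -> Y^2) = {di:.4f} bits")   # -> 0.5310, strictly smaller
```

Here the difference mi - di equals H(p) ≈ 0.469 bits, which is exactly the information flowing backwards through the feedback link (the entropy of Y1 given X1), so the two directed flows sum to the mutual information.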

