The Structure of the Realizations of the Causal Information Rate-Distortion Function for Markovian Sources: Realizations with Densities

09/16/2018 · by Milan S. Derpich, et al.

The main purpose of this note is to show that in a realization (x_1^n, y_1^n) of the causal information rate-distortion function (IRDF) for a κ-th order Markovian source x_1^n, under a single-letter sum distortion constraint, the smallest integer ℓ for which the Markov chain y_k ↔ (y_1^{k-1}, x_{k-ℓ+1}^k) ↔ x_1^{k-ℓ} holds is ℓ = κ. This result is derived under the assumption that the sequences (x_1^n, y_1^n) have a joint probability density function (PDF).


I Introduction

Consider the causal information rate-distortion function (IRDF) for a random source x_1^n, defined as

R_c^it(D) = inf (1/n) I(x_1^n ; y_1^n),    (1)

where the minimization is over all conditional PDFs f(y_1^n | x_1^n) satisfying the distortion constraint

(1/n) ∑_{k=1}^n E[ρ(x_k, y_k)] ≤ D    (2)

and the causality Markov chains

x_{k+1}^n ↔ x_1^k ↔ y_1^k,  k = 1, …, n − 1.    (3)

If the infimum is achieved by some conditional distribution, the associated pair of sequences (x_1^n, y_1^n) is called a realization of R_c^it(D). Here we assume that such a distribution exists and that the corresponding realization has a joint PDF. This assumption is satisfied if, for example, x_1^n is Gaussian and ρ is the squared-error distortion.
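As a side illustration of the Gaussian/quadratic case (a toy example of ours, not part of the note): for a single Gaussian sample x ~ N(0, σ²) with squared-error distortion, the classical forward test channel y = α(x + w), with α = 1 − D/σ² and w ~ N(0, Dσ²/(σ² − D)), yields a jointly Gaussian realization with E[(x − y)²] = D and I(x; y) = (1/2) log(σ²/D). The Python sketch below checks both identities numerically; all variable names are ours.

    import numpy as np

    rng = np.random.default_rng(0)
    sigma2, D, T = 1.0, 0.25, 1_000_000  # source variance, distortion, sample size

    # Forward test channel: y = alpha * (x + w)
    alpha = 1.0 - D / sigma2
    N = D * sigma2 / (sigma2 - D)  # channel noise variance

    x = rng.normal(0.0, np.sqrt(sigma2), T)
    w = rng.normal(0.0, np.sqrt(N), T)
    y = alpha * (x + w)

    # Distortion check: should be close to D = 0.25
    print("empirical E[(x - y)^2]:", np.mean((x - y) ** 2))

    # Mutual information check: for jointly Gaussian (x, y),
    # I(x; y) = -0.5 * log(1 - rho^2) with rho the correlation coefficient,
    # which here equals 0.5 * log(sigma2 / D) nats.
    rho = np.corrcoef(x, y)[0, 1]
    print("empirical I(x; y):", -0.5 * np.log(1.0 - rho ** 2))
    print("0.5 * log(s2/D)  :", 0.5 * np.log(sigma2 / D))

For n = 1 there is no causality constraint to enforce, so this channel also realizes the causal IRDF in that trivial case.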

The first purpose of this note is to show that in a realization of the causal IRDF for a κ-th order Markovian source x_1^n, under the average distortion constraint (2), and supposing that in such a realization the sequences (x_1^n, y_1^n) have a joint PDF, it holds that

(4a)
where is the PDF of and
(4b)

The expressions given in (4) are a special case of those given in [1, equations (16), (17), (18)] for abstract spaces, where no derivation is provided. The value of our first result lies in the following:

  • We provide a proof of the validity of (4) (such a proof is absent from [1]).

  • In this proof, we pose the causal IRDF optimization problem with the full conditional PDF f(y_1^n | x_1^n) as the decision variable (instead of the collection {f(y_k | y_1^{k-1}, x_1^k)}_{k=1}^n, as would be the case in [1] for probability measures having an associated PDF). Accordingly, we impose an explicit causality constraint on f(y_1^n | x_1^n), instead of enforcing causality structurally by restricting f(y_1^n | x_1^n) to be the product of the conditional PDFs f(y_k | y_1^{k-1}, x_1^k), as done in [1, 2]; the sketch below makes the two parameterizations explicit.
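To make this concrete (in our notation, consistent with the reconstruction of (3) above): by the chain rule of conditional densities, every conditional PDF factors as

    f(y_1^n | x_1^n) = ∏_{k=1}^n f(y_k | y_1^{k-1}, x_1^n),

and the causality chains (3) hold if and only if each factor depends on the source only causally, i.e.,

    f(y_k | y_1^{k-1}, x_1^n) = f(y_k | y_1^{k-1}, x_1^k),  k = 1, …, n.

The approach of [1, 2] builds the right-hand factorization in from the start; the approach here keeps f(y_1^n | x_1^n) free and imposes the displayed equalities as constraints.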

The second (and main) goal of this document is to note that from (4a) it is clear that the Markov chain

y_k ↔ (y_1^{k-1}, x_{k-κ+1}^k) ↔ x_1^{k-κ}    (5)

holds, and that

y_k ↔ (y_1^{k-1}, x_k) ↔ x_1^{k-1}    (6)

does not hold, except for κ = 1. Crucially, (6) does not become true by supposing that the joint PDF of (x_1^n, y_1^n) is stationary, thus contradicting [2, Remark IV.5] and what is stated in the discussion paragraph at the end of [1, Section V].
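One way to probe such Markov chains numerically in the Gaussian case (a toy sketch of ours; the channel below is hypothetical and is not claimed to be an optimal realization): for jointly Gaussian vectors, y_k ↔ (y_1^{k-1}, x_{k-ℓ+1}^k) ↔ x_1^{k-ℓ} holds if and only if the conditional cross-covariances Cov(y_k, x_j | y_1^{k-1}, x_{k-ℓ+1}^k) vanish for all j ≤ k − ℓ. For a second-order (κ = 2) autoregressive source and a two-tap channel y_k = x_k + 0.5 x_{k-1} + v_k, the ℓ = 1 covariance is clearly nonzero while the ℓ = 2 one vanishes:

    import numpy as np

    rng = np.random.default_rng(1)
    T, n = 200_000, 4  # sample paths, time horizon

    # kappa = 2 Markovian (AR(2)) source: x_k = 0.9 x_{k-1} - 0.2 x_{k-2} + u_k
    x = np.zeros((T, n))
    for k in range(n):
        p1 = x[:, k - 1] if k >= 1 else 0.0
        p2 = x[:, k - 2] if k >= 2 else 0.0
        x[:, k] = 0.9 * p1 - 0.2 * p2 + rng.normal(size=T)

    # Hypothetical two-tap test channel (for illustration only)
    y = np.zeros((T, n))
    for k in range(n):
        p1 = x[:, k - 1] if k >= 1 else 0.0
        y[:, k] = x[:, k] + 0.5 * p1 + 0.3 * rng.normal(size=T)

    def cond_cov(a, b, C):
        """Cov(a, b | C) for jointly Gaussian samples: S_ab - S_aC S_CC^{-1} S_Cb."""
        S = np.cov(np.column_stack([a, b, C]), rowvar=False)
        return S[0, 1] - S[0, 2:] @ np.linalg.solve(S[2:, 2:], S[1, 2:])

    k = 3  # test the last time index (0-based)
    # ell = 1: is y_k independent of x_{k-1} given (y_1^{k-1}, x_k)?  ->  no
    print("ell=1:", cond_cov(y[:, k], x[:, k - 1],
                             np.column_stack([y[:, :k], x[:, [k]]])))
    # ell = 2: is y_k independent of x_{k-2} given (y_1^{k-1}, x_{k-1}^k)?  ->  yes
    print("ell=2:", cond_cov(y[:, k], x[:, k - 2],
                             np.column_stack([y[:, :k], x[:, k - 1:k + 1]])))

Here ℓ = 2 succeeds simply because y_k depends on (x_{k-1}, x_k) plus independent noise; whether ℓ = κ is also the smallest such integer for the optimal realization is exactly what this note proves.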

II Proof

The causal IRDF under the above conditions is given by the solution to the following optimization problem:

minimize: (7a)
subject to: (7b)
(7c)
(7d)

where the minimization is over the conditional PDF f(y_1^n | x_1^n). Notice that (7d) is an explicit causality constraint equivalent to (3).
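For concreteness, one plausible reading of (7) in the notation of (1)–(3) (our sketch; the exact correspondence with the labels (7a)–(7d) is left open) is:

    minimize    (1/n) I(x_1^n ; y_1^n)   over conditional PDFs f(y_1^n | x_1^n)
    subject to  (1/n) ∑_{k=1}^n E[ρ(x_k, y_k)] ≤ D
                ∫ f(y_1^n | x_1^n) dy_1^n = 1 for every x_1^n
                f(y_k | y_1^{k-1}, x_1^n) = f(y_k | y_1^{k-1}, x_1^k),  k = 1, …, n − 1,

with the last line being the density form of the causality chains (3).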

Let be any conditional PDF, and define

(8)
(9)
(10)
(11)

where .

Before writing the Lagrangian and taking its Gâteaux differential, let us obtain the Gâteaux differential of in the direction , given by

(12)
(13)

where

(14)
(15)
(16)
(17)
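For reference, the standard definition at work here (generic, not specific to this note): the Gâteaux differential of a functional J at f in a direction η is

    δJ(f; η) := lim_{ε→0} ( J(f + εη) − J(f) ) / ε = (d/dε) J(f + εη) |_{ε=0},

so (12)–(17) amount to differentiating each term of the objective along an admissible perturbation of the conditional PDF.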

On the other hand, for each k, the causality constraint (7d) appears in the Lagrangian as

(18)
(19)
(20)

It will be convenient to manipulate this expression so as to give it a structure similar to that of the other terms in the Lagrangian. For this purpose, notice that

(21)
(22)
(23)
(24)
(25)
(26)

where

(27)

Substituting this into (20) we obtain

(28)
(29)

We can now write the Lagrangian associated with optimization problem (7) as

(30)
(31)
(32)

From the theory of Lagrangian optimization on vector spaces [3], is a solution to Optimization Problem (7) only if

(33)
(34)

for every function as defined in (8), i.e., for every conditional PDF . This holds if and only if for every :

(35)
(36)
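In generic terms (a standard first-order condition from the vector-space Lagrangian theory cited as [3], stated in our notation): if f* solves the constrained problem with Lagrangian L, then the Gâteaux differential of L must vanish in every admissible direction,

    δL(f*; η) = 0   for all admissible directions η,

and (33)–(36) express precisely this condition for problem (7).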

The Lagrange multiplier function must enforce the constraint (7b). Hence,

(37)

where

(38)

Marginalizing over we obtain

(39)

Using Bayes’ rule we can write

(40)
(41)

where

(42)
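For reference, the density form of Bayes' rule used in such steps is the generic identity (our notation)

    f(a | b, c) = f(b | a, c) f(a | c) / f(b | c),

applied to the realization's joint PDF.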

These functions can be written recursively as

(43a)
(43b)

In order to attain causality in (41), the functions must depend only on and . Since for each , the function does not depend on terms with , the causality constraint is met if and only if we choose in (43b) such that, for each

(44)

for some function .

For , the causality constraint is satisfied automatically, since (see (43a)). (This reflects the fact that there is no need to enforce the causality constraint for , since there are no source samples for time .) Suppose now that (44) (i.e., causality) is satisfied for , for some . In that case, one can replace in (44) by and, defining

write (44) as

(45)

Multiplying both sides by and integrating over we obtain

(46)