Mutual Information of Wireless Channels and Block-Jacobi Ergodic Operators

11/12/2018
by Walid Hachem, et al.

Shannon's mutual information of a random multiple-antenna and multipath channel is studied in the general case where the channel impulse response is an ergodic and stationary process. From this viewpoint, the channel is represented by an ergodic self-adjoint block-Jacobi operator, which is close in many respects to a block version of a random Schrödinger operator. The mutual information is then related to the so-called density of states of this operator. In this paper, it is shown that, under minimal assumptions on the channel, the mutual information can be expressed in terms of a matrix-valued stochastic process coupled with the channel process. This enables numerical approximation of the mutual information in this general setting. Moreover, assuming further that the channel impulse response is a Markov process, a representation of the mutual information offset in the large Signal-to-Noise Ratio regime is obtained in terms of another related Markov process. This generalizes previous results by Levy et al. It is also shown how mutual information expressions close to those predicted by random matrix theory can be recovered in the large-dimensional regime.
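To fix ideas, the quantity being generalized is the ergodic MIMO mutual information E[log2 det(I + (ρ/t) H H*)] familiar from Telatar's i.i.d. Rayleigh setting. The sketch below is a Monte Carlo estimate of that baseline quantity only; it is illustrative and does not implement the block-Jacobi operator machinery of the paper, and the function name and parameters are chosen here for exposition.

```python
import numpy as np

def ergodic_mutual_information(n_rx, n_tx, snr, n_trials=2000, seed=0):
    """Monte Carlo estimate of E[log2 det(I + (snr/n_tx) H H^*)]
    for an i.i.d. Rayleigh-fading MIMO channel (Telatar's setting).
    Illustrative baseline only: the paper treats stationary ergodic
    block-Jacobi channel operators, not this memoryless i.i.d. case.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_trials):
        # Complex Gaussian channel matrix, entries CN(0, 1).
        h = (rng.standard_normal((n_rx, n_tx))
             + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
        # log det via slogdet for numerical stability; the matrix is
        # Hermitian positive definite, so the sign is always 1.
        _, logdet = np.linalg.slogdet(
            np.eye(n_rx) + (snr / n_tx) * h @ h.conj().T)
        total += logdet / np.log(2)  # convert nats to bits
    return total / n_trials
```

At high SNR this estimate grows like min(n_rx, n_tx) · log2(snr) plus a constant offset, which is the offset term whose Markovian representation the paper derives for the general ergodic channel.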


