On Finite-Time Mutual Information

04/24/2022
by Jieao Zhu, et al.

The Shannon-Hartley theorem accurately characterizes the channel capacity when the signal observation time is infinite. However, the mutual information attainable within a finite observation time, which remains unknown, is essential for guiding the design of practical communication systems. In this paper, we investigate the mutual information between two correlated Gaussian processes within a finite-time observation window. We first express the finite-time mutual information as a limit. We then numerically compute the mutual information within a single finite-time window, and reveal that the number of bits transmitted per second within that window can exceed the mutual information averaged over the entire time axis, a behavior we call the exceed-average phenomenon. Furthermore, we derive a finite-time mutual information formula for a typical signal autocorrelation model by utilizing the Mercer expansion of trace-class operators, revealing the connection between the finite-time mutual information problem and operator theory. Finally, we analytically prove the existence of the exceed-average phenomenon in this typical case, and demonstrate its compatibility with the Shannon capacity.
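The Mercer-expansion route described in the abstract admits a compact numerical illustration. The sketch below is a rough approximation under stated assumptions, not the paper's method: it assumes an additive white Gaussian noise channel whose Gaussian input has an exponential autocorrelation R(τ) = P·e^(−a|τ|) (an illustrative kernel of our choosing; the paper's "typical autocorrelation case" may differ), discretizes the covariance kernel on [0, T] via a simple Nyström grid, and sums ½·log(1 + λ_k/(N₀/2)) over the eigenvalues λ_k of the resulting integral operator. All parameter values are hypothetical.

```python
import numpy as np

def finite_time_mi(T, P=1.0, a=1.0, N0=0.1, n=2000):
    """Approximate the mutual information I(T) accumulated over an
    observation window [0, T] for an AWGN channel whose input is a
    stationary Gaussian process with the (assumed) autocorrelation
    R(tau) = P * exp(-a * |tau|).

    The covariance kernel is discretized on an n-point grid (Nystrom
    method); the eigenvalues lambda_k of the discretized integral
    operator give I(T) = 0.5 * sum_k log(1 + lambda_k / (N0/2)) in nats.
    """
    t = np.linspace(0.0, T, n)
    dt = T / (n - 1)
    # Covariance matrix of the input process sampled on the grid.
    K = P * np.exp(-a * np.abs(t[:, None] - t[None, :]))
    # Eigenvalues of the integral operator ~ eigenvalues of K scaled by dt.
    lam = np.linalg.eigvalsh(K) * dt
    lam = lam[lam > 0]  # guard against tiny negative numerical eigenvalues
    return 0.5 * np.sum(np.log1p(lam / (N0 / 2.0)))

if __name__ == "__main__":
    # Tabulating I(T)/T over several window lengths lets one look for
    # windows whose per-second rate exceeds the long-run average,
    # i.e., the exceed-average phenomenon.
    for T in (0.5, 1.0, 2.0, 5.0, 10.0):
        I = finite_time_mi(T)
        print(f"T = {T:5.1f}:  I(T) = {I:8.4f} nats,  I(T)/T = {I/T:8.4f} nats/s")
```

A uniform grid weight dt is used for simplicity; trapezoidal or Gauss-Legendre quadrature weights would sharpen the eigenvalue estimates at no extra conceptual cost.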


