
An Elementary Proof of a Classical Information-Theoretic Formula

by Xianming Liu, et al.
Huazhong University of Science & Technology
The University of Hong Kong

A renowned information-theoretic formula by Shannon expresses the mutual information rate of a white Gaussian channel with a stationary Gaussian input as an integral of a simple function of the power spectral density of the channel input. We give in this paper a rigorous yet elementary proof of this classical formula. As opposed to conventional approaches, which either rely on heavy mathematical machinery or have to resort to some "external" results, our proof, which hinges on a recently proven sampling theorem, is elementary and self-contained, using only some well-known facts from basic calculus and matrix theory.
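As a hedged illustration of the formula the abstract refers to, the sketch below numerically evaluates the spectral integral for one simple choice of input: a flat, band-limited power spectral density. All parameter values (P, W, N0) are assumptions chosen for the example, not taken from the paper. In this special case the integral collapses to the familiar closed form C = W log2(1 + P/(N0 W)), which the script uses as a sanity check.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not from the paper):
# evaluate I = (1/2) * ∫ log2(1 + S(f) / (N0/2)) df
# for a flat two-sided input PSD S(f) = P/(2W) on |f| <= W,
# against additive white Gaussian noise with one-sided PSD N0.

P = 1.0   # total input power (assumed)
W = 1.0   # one-sided bandwidth in Hz (assumed)
N0 = 0.5  # one-sided noise PSD (assumed)

f = np.linspace(-W, W, 200001)        # frequency grid over the band
S = np.full_like(f, P / (2 * W))      # flat two-sided input PSD, total power P
integrand = 0.5 * np.log2(1 + S / (N0 / 2))

# Trapezoidal rule by hand (exact here, since the integrand is constant)
I_numeric = np.sum((integrand[:-1] + integrand[1:]) / 2 * np.diff(f))

# Closed form for the flat band-limited case
C_closed = W * np.log2(1 + P / (N0 * W))

print(I_numeric, C_closed)  # the two values should agree closely
```

For a non-flat spectrum the closed form no longer applies, but the same numerical integral gives the mutual information rate directly from the PSD, which is exactly the quantity the paper's formula describes.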

