
An Elementary Proof of a Classical Information-Theoretic Formula

09/05/2018
by   Xianming Liu, et al.
Technion
Huazhong University of Science & Technology
The University of Hong Kong

A renowned information-theoretic formula by Shannon expresses the mutual information rate of a white Gaussian channel with a stationary Gaussian input as an integral of a simple function of the power spectral density of the channel input. In this paper we give a rigorous yet elementary proof of this classical formula. Unlike the conventional approaches, which either rely on heavy mathematical machinery or resort to "external" results, our proof, which hinges on a recently proven sampling theorem, is elementary and self-contained, using only well-known facts from basic calculus and matrix theory.
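For context, the formula in question is commonly stated as follows (the notation here is ours, under the standard conventions; the paper's exact statement may differ): for the channel Y(t) = X(t) + Z(t), where Z is white Gaussian noise with two-sided power spectral density N_0/2 and the input X is a stationary Gaussian process with power spectral density S_X(f), the mutual information rate is

```latex
% Shannon's formula for the white Gaussian channel, stated in a
% common textbook form (our notation; an assumption for context,
% not quoted from the paper).
% Y(t) = X(t) + Z(t):  Z white Gaussian noise, two-sided PSD N_0/2;
% X stationary Gaussian input with PSD S_X(f).
\[
  I(X; Y) \;=\; \frac{1}{2} \int_{-\infty}^{\infty}
    \log\!\left( 1 + \frac{2\, S_X(f)}{N_0} \right) \mathrm{d}f .
\]
```

By symmetry of the PSD, the same quantity can be written as an integral over positive frequencies only, without the factor 1/2.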
