Shannon entropy estimation for linear processes

09/08/2020
by Timothy Fortune, et al.

In this paper, we estimate the Shannon entropy S(f) = -E[log f(X)] of a one-sided linear process X with probability density function f(x). We employ the integral estimator S_n(f), which utilizes the standard kernel density estimator f_n(x) of f(x). We show that S_n(f) converges to S(f) almost surely and in L^2 under reasonable conditions.
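To illustrate the idea, here is a minimal sketch of a kernel plug-in entropy estimator of this general kind, not the authors' exact construction or conditions: a Gaussian-kernel density estimate f_n is formed from the sample and S_n(f) = -∫ f_n(x) log f_n(x) dx is computed by numerical integration on a grid. The i.i.d. standard-normal sample, the Silverman rule-of-thumb bandwidth, and the grid bounds are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: i.i.d. N(0, 1) draws. In the paper's setting this would
# be a one-sided linear process X_t = sum_{i>=0} a_i * eps_{t-i}.
x = rng.normal(size=2000)

n = x.size
# Silverman's rule-of-thumb bandwidth (an illustrative choice)
h = 1.06 * x.std() * n ** (-1 / 5)

# Evaluation grid extending a few bandwidths past the sample range
grid = np.linspace(x.min() - 3 * h, x.max() + 3 * h, 2000)

# Standard Gaussian-kernel density estimator f_n on the grid
f_n = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
f_n /= n * h * np.sqrt(2 * np.pi)
f_n = np.clip(f_n, 1e-300, None)  # guard against log(0) from underflow

# Integral (plug-in) entropy estimate via the trapezoidal rule
dx = grid[1] - grid[0]
S_n = -np.sum(f_n * np.log(f_n)) * dx

# For N(0, 1) the true Shannon entropy is 0.5 * log(2 * pi * e) ~ 1.419,
# so S_n should land close to that value.
true_S = 0.5 * np.log(2 * np.pi * np.e)
```

The same recipe applies to dependent (linear-process) data; what changes in the paper is the theory showing the estimator still converges under the dependence structure, not the computation itself.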


Related research:

02/25/2021 · New identities for the Shannon function and applications
We show how the Shannon entropy function H(p,q) is expressible as a linea...

02/19/2018 · On the computation of Shannon Entropy from Counting Bloom Filters
In this short note a method for computing the naive plugin estimator of ...

10/07/2022 · Kernel entropy estimation for long memory linear processes with infinite variance
Let X={X_n: n∈ℕ} be a long memory linear process with innovations in the...

12/01/2017 · Kernel entropy estimation for linear processes
Let {X_n: n∈N} be a linear process with bounded probability density func...

05/08/2021 · Understanding Neural Networks with Logarithm Determinant Entropy Estimator
Understanding the informative behaviour of deep neural networks is chall...

01/30/2022 · Polynomial functors and Shannon entropy
Past work shows that one can associate a notion of Shannon entropy to a ...

07/11/2021 · Theoretical Limit of Radar Parameter Estimation
In the field of radar parameter estimation, Cramer-Rao bound (CRB) is a ...
