Computing Entropy Rate Of Symbol Sources & A Distribution-free Limit Theorem

01/03/2014
by Ishanu Chattopadhyay et al.

Entropy rate of sequential data streams naturally quantifies the complexity of the generative process. Thus entropy rate fluctuations could be used as a tool to recognize dynamical perturbations in signal sources, and such detection could potentially be carried out without explicit background noise characterization. However, state-of-the-art algorithms to estimate the entropy rate have markedly slow convergence, making such entropic approaches non-viable in practice. We present here a fundamentally new approach to estimate entropy rates, which is demonstrated to converge significantly faster in terms of input data lengths, and is shown to be effective in diverse applications ranging from the estimation of the entropy rate of English texts to the estimation of the complexity of chaotic dynamical systems. Additionally, the convergence rate of entropy estimates does not follow from any standard limit theorem, and reported algorithms fail to provide any confidence bounds on the computed values. Exploiting a connection to the theory of probabilistic automata, we establish a convergence rate of O(log s / √s) as a function of the input length s, which then yields explicit uncertainty estimates, as well as the data lengths required to satisfy pre-specified confidence bounds.
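To make concrete what is being estimated, the sketch below computes a standard plug-in (block-entropy) estimate of the entropy rate of a symbol stream; this is the kind of slowly converging baseline the abstract contrasts against, not the probabilistic-automata estimator introduced in the paper. The helper `required_length` and its constant `c` are hypothetical: they simply invert the reported O(log s / √s) rate to find a data length meeting a target error, with `c` standing in for an unspecified problem-dependent constant.

```python
# A minimal sketch, NOT the paper's method: a plug-in (block-entropy)
# estimate of the entropy rate of a symbol stream, the kind of baseline
# whose slow convergence the abstract points to. Names such as
# `required_length` and its constant `c` are illustrative assumptions.
import random
from collections import Counter
from math import log, log2, sqrt

def block_entropy(s: str, k: int) -> float:
    """Empirical Shannon entropy (in bits) of the length-k blocks of s."""
    counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    n = sum(counts.values())
    return -sum((c / n) * log2(c / n) for c in counts.values())

def entropy_rate_estimate(s: str, k: int = 4) -> float:
    """Plug-in estimate h_k = H_k - H_(k-1) of the entropy rate, bits/symbol."""
    return block_entropy(s, k) - block_entropy(s, k - 1)

def required_length(eps: float, c: float = 1.0) -> int:
    """Smallest length (up to a factor of 2) with c*log(s)/sqrt(s) <= eps,
    inverting the stated O(log s / sqrt(s)) rate; c is a stand-in constant."""
    s = 2
    while c * log(s) / sqrt(s) > eps:
        s *= 2
    return s

if __name__ == "__main__":
    random.seed(0)
    stream = "".join(random.choice("01") for _ in range(100_000))
    # Fair coin flips have a true entropy rate of 1 bit/symbol.
    print(f"estimated entropy rate: {entropy_rate_estimate(stream):.4f} bits/symbol")
    print(f"length needed for eps=0.01 under the stated rate: {required_length(0.01)}")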
