A Note on the Shannon Entropy of Short Sequences

07/07/2018
by H. M. de Oliveira, et al.

For source sequences of L symbols, we propose a more realistic benchmark than the usual ratio of code letters per source letter. Our idea is based on a quantifier of the information fluctuation of a source, F(U), which corresponds to the second central moment of the random variable that measures the information content of a source symbol. This approach also provides an alternative interpretation of typical sequences.
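As a minimal sketch of the quantity described above, the following Python snippet computes the Shannon entropy H(U) and the information fluctuation F(U) of a memoryless source, assuming the standard reading of F(U) as the variance of the self-information -log2 p(X); the example distributions are illustrative, not taken from the paper.

```python
import math

def entropy_and_fluctuation(probs):
    """Return (H, F) in bits for a source with symbol probabilities `probs`.

    The self-information of a symbol with probability p is -log2(p);
    H(U) is its mean and F(U) its second central moment (variance).
    """
    infos = [-math.log2(p) for p in probs]
    h = sum(p * i for p, i in zip(probs, infos))          # Shannon entropy
    f = sum(p * (i - h) ** 2 for p, i in zip(probs, infos))  # fluctuation
    return h, f

# A uniform source: every symbol carries the same information, so F(U) = 0.
h_u, f_u = entropy_and_fluctuation([0.25] * 4)

# A skewed source over the same alphabet: same-style computation, nonzero F(U).
h_s, f_s = entropy_and_fluctuation([0.5, 0.25, 0.125, 0.125])
```

A zero fluctuation means every sequence of length L carries exactly L·H(U) bits; a large F(U) signals that the information content of short sequences spreads widely around that mean, which is what makes the usual per-letter benchmark unrealistic for small L.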


Related research

- Entropy of tropical holonomic sequences (03/11/2020)
- Optimal Guessing under Nonextensive Framework and associated Moment Bounds (05/19/2019)
- Similarity of symbol frequency distributions with heavy tails (10/01/2015)
- Towards Weak Source Coding (09/11/2022)
- Entropy estimation of symbol sequences (03/21/2002)
- Analysis of Nonlinear Fiber Interactions for Finite-Length Constant-Composition Sequences (06/12/2020)
- On the Typicality of Musical Sequences (11/23/2022)
