Range entropy: A bridge between signal complexity and self-similarity

09/18/2018
by Amir Omidvarnia, et al.

Sample entropy (SampEn) has been accepted as an alternative to, and sometimes a replacement for, approximate entropy (ApEn) for characterizing the temporal complexity of time series. However, it still suffers from issues such as inconsistency over short signal lengths and over its tolerance parameter r, susceptibility to signal amplitude changes, and insensitivity to the self-similarity of time series. We propose modifications to ApEn and SampEn that are defined for 0<r<1, are more robust to signal amplitude changes, and are sensitive to the self-similarity property of time series. We modified ApEn and SampEn by redefining the distance function used in their original definitions. We then evaluated the new entropy measures, called range entropies (RangeEn), using different random processes and nonlinear deterministic signals. We further applied the proposed entropies to normal and epileptic electroencephalographic (EEG) signals under different states. Our results suggest that, unlike ApEn and SampEn, the RangeEn measures are robust to stationary and nonstationary signal amplitude variations, and that their trajectories in the tolerance r-plane are constrained between 0 (maximum entropy) and 1 (minimum entropy). We also showed that RangeEn has a direct relationship with the Hurst exponent, suggesting that the new definitions are sensitive to the self-similarity structure of signals. RangeEn analysis of epileptic EEG data showed distinct behaviours in the r-domain for extracranial versus intracranial recordings, as well as for different states of epileptic EEG. The constrained trajectories of RangeEn in the r-plane make these measures good candidates for studying complex biological signals such as EEG during seizure and non-seizure states. The Python package used to generate the results shown in this paper is publicly available at: https://github.com/omidvarnia/RangeEn.
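To make the modification concrete, the sketch below shows one way a SampEn-style range entropy (RangeEn-B) can be computed: the Chebyshev distance used in SampEn's template matching is replaced by a range-based distance, (max - min)/(max + min) of the element-wise absolute differences between two templates, which is bounded in [0, 1] and therefore makes the tolerance r meaningful only for 0<r<1. This is a minimal Python sketch under those assumptions, not the authors' reference implementation (that is the RangeEn package linked above); the function name rangeen_b, its default parameters, and the simplified template counting are illustrative choices.

    import numpy as np

    def rangeen_b(x, m=2, r=0.5):
        # Minimal sketch of a SampEn-style range entropy (RangeEn-B).
        # SampEn's Chebyshev distance is replaced by a range-based distance
        # d = (max - min) / (max + min) of the element-wise absolute
        # differences between two templates, which lies in [0, 1].
        x = np.asarray(x, dtype=float)
        N = len(x)

        def match_count(mm):
            # All overlapping templates of length mm.
            templates = np.array([x[i:i + mm] for i in range(N - mm)])
            diffs = np.abs(templates[:, None, :] - templates[None, :, :])
            num = diffs.max(axis=2) - diffs.min(axis=2)
            den = diffs.max(axis=2) + diffs.min(axis=2)
            # Guard against zero denominators (identical constant templates).
            d = np.where(den > 0, num / np.where(den > 0, den, 1.0), 0.0)
            np.fill_diagonal(d, np.inf)  # exclude self-matches, as in SampEn
            return np.sum(d <= r)

        B = match_count(m)       # template matches of length m
        A = match_count(m + 1)   # template matches of length m + 1
        return np.nan if A == 0 or B == 0 else -np.log(A / B)

For example, rangeen_b(np.random.randn(1000), m=2, r=0.2) returns a single value, and sweeping r across (0, 1) traces the kind of r-plane trajectory discussed in the abstract.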


research  02/25/2022
Novel techniques for improvement the NNetEn entropy calculation for short and noisy time series
Entropy is a fundamental concept of information theory. It is widely use...

research  07/18/2021
A method for estimating the entropy of time series using artificial neural network
Measuring the predictability and complexity of time series is an essenti...

research  01/06/2022
Ecce Signum: An R Package for Multivariate Signal Extraction and Time Series Analysis
The package provides multivariate time series models for structural anal...

research  04/05/2017
Automated Diagnosis of Epilepsy Employing Multifractal Detrended Fluctuation Analysis Based Features
This contribution reports an application of MultiFractal Detrended Fluct...

research  06/13/2019
Comparison of Methods for the Assessment of Nonlinearity in Short-Term Heart Rate Variability under different Physiopathological States
Despite the widespread diffusion of nonlinear methods for heart rate var...

research  10/13/2018
A Geometric Analysis of Time Series Leading to Information Encoding and a New Entropy Measure
A time series is uniquely represented by its geometric shape, which also...
