Mixture-based estimation of entropy

10/08/2020
by Stéphane Robin, et al.

Entropy is a measure of uncertainty that plays a central role in information theory. When the distribution of the data is unknown, an estimate of the entropy must be obtained from the data sample itself. We propose a semi-parametric estimate based on a mixture model approximation of the distribution of interest. The estimate can rely on any type of mixture, but we focus on Gaussian mixture models to demonstrate its accuracy and versatility. The performance of the proposed approach is assessed through a series of simulation studies. We also illustrate its use on two real-life data examples.
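The core idea behind mixture-based entropy estimation can be sketched in a few lines: write the density as a mixture, then approximate the differential entropy H(f) = -E_f[log f(X)] by a Monte Carlo average of -log f over draws from the mixture. The sketch below (not the authors' estimator; a minimal illustration, with a fixed toy two-component Gaussian mixture standing in for a mixture fitted to data) shows the principle:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-component 1-D Gaussian mixture: weights, means, standard deviations.
# In the semi-parametric setting these would come from a mixture fit to data.
w = np.array([0.4, 0.6])
mu = np.array([-2.0, 1.5])
sd = np.array([0.8, 1.2])

def mixture_logpdf(x):
    """Log density of the Gaussian mixture, evaluated pointwise."""
    comp = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return np.log(comp.sum(axis=1))

# Draw Monte Carlo samples from the mixture itself:
# pick a component, then sample from that component's Gaussian.
n = 200_000
z = rng.choice(2, size=n, p=w)   # component labels
x = rng.normal(mu[z], sd[z])     # mixture draws

# Entropy H(f) = -E_f[log f(X)], approximated by the sample mean of -log f.
h_hat = -mixture_logpdf(x).mean()
print(f"Monte Carlo entropy estimate: {h_hat:.3f} nats")
```

The same average can be taken over the observed sample instead of fresh mixture draws, which gives a plug-in estimate once the mixture parameters have been fitted (e.g. by EM).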

research
09/02/2009

Scale-Based Gaussian Coverings: Combining Intra and Inter Mixture Models in Image Segmentation

By a "covering" we mean a Gaussian mixture model fit to observed data. A...
research
07/29/2021

Binomial Mixture Model With U-shape Constraint

In this article, we study the binomial mixture model under the regime th...
research
11/17/2020

Density Estimation using Entropy Maximization for Semi-continuous Data

Semi-continuous data comes from a distribution that is a mixture of the ...
research
06/01/2021

ClustRank: a Visual Quality Measure Trained on Perceptual Data for Sorting Scatterplots by Cluster Patterns

Visual quality measures (VQMs) are designed to support analysts by autom...
research
02/05/2022

Beyond Black Box Densities: Parameter Learning for the Deviated Components

As we collect additional samples from a data population for which a know...
research
02/25/2013

On learning parametric-output HMMs

We present a novel approach for learning an HMM whose outputs are distri...
research
10/23/2017

A semi-parametric estimation for max-mixture spatial processes

We proposed a semi-parametric estimation procedure in order to estimate ...
