On Tsallis Entropy Bias and Generalized Maximum Entropy Models

04/07/2010
by Yuexian Hou, et al.

In density estimation tasks, the maximum entropy model (Maxent) can effectively exploit reliable prior information via certain constraints, i.e., linear constraints without empirical parameters. However, reliable prior information is often insufficient, and selecting uncertain constraints becomes necessary but entails considerable implementation complexity: improperly set uncertain constraints can cause overfitting or underfitting. To address this problem, we propose a generalization of Maxent under the Tsallis entropy framework. The proposed method introduces a convex quadratic constraint that corrects the (expected) Tsallis entropy bias (TEB). Specifically, we demonstrate that the expected Tsallis entropy of a sampling distribution is smaller than the Tsallis entropy of the underlying real distribution. This expected entropy reduction is exactly the (expected) TEB, which admits a closed-form expression and acts as a consistent and unbiased correction. TEB indicates that the entropy of a specific sampling distribution should be increased accordingly, which entails a quantitative reinterpretation of the Maxent principle. By compensating for TEB while forcing the resulting distribution to stay close to the sampling distribution, the generalized TEBC Maxent can be expected to alleviate both overfitting and underfitting. We also establish a connection between TEB and the Lidstone estimator; as a result, a TEB-Lidstone estimator is developed by analytically identifying the rate of probability correction in Lidstone smoothing. Extensive empirical evaluation shows promising performance of both TEBC Maxent and TEB-Lidstone compared with various state-of-the-art density estimation methods.
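The entropy reduction the abstract describes is easy to observe numerically. The sketch below (an illustration, not the paper's method) uses the Tsallis entropy S_q(p) = (1 - Σ p_i^q)/(q - 1) with q = 2, for which the plug-in estimator's expectation works out to (1 - 1/n) S_2(p), i.e., a negative bias of -S_2(p)/n; the chosen distribution, sample size, and trial count are arbitrary assumptions for the demo.

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])   # illustrative "real" distribution
n, trials, q = 50, 20000, 2.0   # sample size and Monte Carlo repetitions

# Average the plug-in Tsallis entropy over many empirical (sampling)
# distributions drawn from p.
emp = np.empty(trials)
for t in range(trials):
    counts = rng.multinomial(n, p)
    emp[t] = tsallis_entropy(counts / n, q)

true_S = tsallis_entropy(p, q)
bias = emp.mean() - true_S
# For q = 2, E[S_2(p_hat)] = (1 - 1/n) * S_2(p), so bias ~ -S_2(p)/n < 0:
# the sampling distribution's expected entropy is smaller than the truth.
print(f"S_2(p) = {true_S:.4f}, empirical bias = {bias:.5f}, "
      f"predicted bias = {-true_S / n:.5f}")
```

The negative bias is what a TEB-style correction would add back: since the empirical distribution systematically understates the entropy of the real distribution, the estimated distribution's entropy should be raised by the (closed-form) expected reduction.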

Related research:

- Generalized Maximum Entropy for Supervised Classification (07/10/2020): The maximum entropy principle advocates to evaluate events' probabilitie...
- Notes on Generalizing the Maximum Entropy Principle to Uncertain Data (09/09/2021): The principle of maximum entropy is a broadly applicable technique for c...
- The Principle of Uncertain Maximum Entropy (05/17/2023): The principle of maximum entropy, as introduced by Jaynes in information...
- MESSY Estimation: Maximum-Entropy based Stochastic and Symbolic densitY Estimation (06/07/2023): We introduce MESSY estimation, a Maximum-Entropy based Stochastic and Sy...
- Maximum Multiscale Entropy and Neural Network Regularization (06/25/2020): A well-known result across information theory, machine learning, and sta...
- Density Estimation using Entropy Maximization for Semi-continuous Data (11/17/2020): Semi-continuous data comes from a distribution that is a mixture of the ...
- AR-DAE: Towards Unbiased Neural Entropy Gradient Estimation (06/09/2020): Entropy is ubiquitous in machine learning, but it is in general intracta...
