Outperforming Good-Turing: Preliminary Report

07/06/2018
by   Amichai Painsky, et al.

Estimating a large-alphabet probability distribution from a limited number of samples is a fundamental problem in machine learning and statistics. A variety of estimation schemes have been proposed over the years, mostly inspired by the early work of Laplace and the seminal contribution of Good and Turing. One basic assumption shared by most commonly used estimators is a unique correspondence between a symbol's sample frequency and its estimated probability. In this work we challenge this paradigmatic assumption: we claim that symbols with "similar" frequencies should be assigned the same estimated probability value. This regulates the number of parameters and improves generalization. In this preliminary report we show that applying an ensemble of such regulated estimators yields a dramatic improvement in estimation accuracy (typically up to 50%). An implementation of our suggested method is publicly available at the first author's web-page.
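To make the core idea concrete, here is a minimal sketch (not the authors' actual algorithm) of a frequency-binned estimator: symbols whose sample counts fall in the same bin are pooled and share a single estimated probability, so the number of free parameters is the number of bins rather than the alphabet size. The bin edges and the function name are illustrative assumptions.

```python
from collections import Counter

def binned_estimator(samples, bin_edges):
    """Toy illustration of frequency binning (hypothetical, not the
    paper's method): symbols whose counts land in the same bin are
    assigned the same probability."""
    n = len(samples)
    counts = Counter(samples)

    # Group observed symbols by the bin their frequency falls into.
    bins = {}
    for sym, c in counts.items():
        b = next(i for i, edge in enumerate(bin_edges) if c <= edge)
        bins.setdefault(b, []).append(sym)

    # Each symbol in a bin receives an equal share of the bin's
    # pooled empirical mass -- one shared probability value per bin.
    probs = {}
    for syms in bins.values():
        mass = sum(counts[s] for s in syms) / n
        for s in syms:
            probs[s] = mass / len(syms)
    return probs
```

For example, `binned_estimator(list("aaabbbccdde"), [1, 2, 4, float("inf")])` assigns the same probability to `a` and `b` (both seen 3 times) and to `c` and `d` (both seen twice); the probabilities over observed symbols sum to 1. An ensemble, as the abstract suggests, could average several such estimators built with different binnings.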
