Nonparametric Estimation of the Fisher Information and Its Applications

05/07/2020
by Wei Cao, et al.

This paper considers the problem of estimating the Fisher information for location from a random sample of size n. First, an estimator proposed by Bhattacharya is revisited, and improved convergence rates are derived. Second, a new estimator, termed the clipped estimator, is proposed. Superior upper bounds on the rates of convergence can be shown for the new estimator compared to the Bhattacharya estimator, albeit under different regularity conditions. Third, both estimators are evaluated for the practically relevant case of a random variable contaminated by Gaussian noise. Moreover, using Brown's identity, which relates the Fisher information and the minimum mean squared error (MMSE) in Gaussian noise, two corresponding consistent estimators of the MMSE are proposed. Simulation examples for the Bhattacharya estimator, the clipped estimator, and the MMSE estimators are presented. The examples demonstrate that, compared to the Bhattacharya estimator, the clipped estimator can significantly reduce the sample size required to guarantee a given confidence interval.
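The abstract does not spell out the estimator constructions, so the following is only a minimal sketch of the general idea under stated assumptions: a Gaussian-kernel plug-in estimate of the score f'/f whose sample second moment estimates the Fisher information for location, an optional clipping of the score to a bounded range (loosely analogous in spirit to the clipped estimator, not the paper's exact definition), and Brown's identity mmse(X|Y) = sigma^2 - sigma^4 * J(Y) to convert a Fisher-information estimate for Y = X + N into an MMSE estimate. All function names, the kernel choice, and the bandwidth rule are illustrative assumptions.

```python
import numpy as np

def fisher_information_location(samples, bandwidth=None, clip=None):
    """Kernel plug-in estimate of the Fisher information for location,
    I(f) = E[(f'(X)/f(X))^2], from an i.i.d. sample.

    Generic Gaussian-kernel sketch (not the construction analysed in the
    paper); `clip` bounds the estimated score to [-clip, clip], loosely
    mimicking the idea of a clipped estimator."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule-of-thumb bandwidth (an illustrative default).
        bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)
    # Pairwise differences (x_i - x_j), shape (n, n).
    diffs = x[:, None] - x[None, :]
    kernel = np.exp(-0.5 * (diffs / bandwidth) ** 2) / (np.sqrt(2 * np.pi) * bandwidth)
    f_hat = kernel.mean(axis=1)                                  # density estimate at each sample
    fprime_hat = (-diffs / bandwidth ** 2 * kernel).mean(axis=1) # derivative estimate
    score = fprime_hat / f_hat                                   # estimated score function
    if clip is not None:
        score = np.clip(score, -clip, clip)
    return np.mean(score ** 2)

def mmse_from_fisher(fisher_y, noise_var):
    """Brown's identity: for Y = X + N with N ~ N(0, sigma^2),
    mmse(X | Y) = sigma^2 - sigma^4 * J(Y)."""
    return noise_var - noise_var ** 2 * fisher_y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sigma2 = 1.0
    x = rng.normal(0.0, 1.0, size=2000)                          # signal X ~ N(0, 1)
    y = x + rng.normal(0.0, np.sqrt(sigma2), size=x.size)        # noisy observation Y = X + N
    j_hat = fisher_information_location(y, clip=10.0)
    print("J(Y) estimate:", j_hat, "(true value 0.5)")
    print("MMSE estimate:", mmse_from_fisher(j_hat, sigma2), "(true value 0.5)")
```

In this Gaussian sanity check, Y ~ N(0, 2), so J(Y) = 0.5 and mmse = 0.5, which the plug-in estimate should approach as the sample grows; clipping the score simply guards the second moment against the heavy-tailed ratio f'/f in low-density regions.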


