
Unifying Design-based Inference: A New Variance Estimation Principle

09/19/2021
by Joel A. Middleton, et al.

This paper presents two novel classes of variance estimators with superior properties, in the absence of parametric or semi-parametric assumptions. The first new class is the Oblozene Chlebizky (OC) variance estimators, a novel alternative to the generalized sandwich of Paper 1 of 4. That the OC concept is unlikely to arise from other, more standard frameworks is evident in light of the 40-year lacuna since White (1980). For any member of the generalized sandwich variance estimator class, there is an OC estimator with the same expected value; this alternative replaces the random matrix at the center of the sandwich with a nonrandom one. The second class is guaranteed to be conservative for the variance of the point estimator and is based on a similar principle of replacing a random matrix with its expectation.
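To fix ideas, here is a rough schematic in generic notation; the symbols below (A, \hat{M}, M_0, \hat{\theta}) are not taken from the paper and serve only to illustrate the stated principle. A sandwich-form estimator of the variance of a design-based point estimator \hat{\theta} can be written as

\hat{V}_{\mathrm{sw}} = A\,\hat{M}\,A^{\top},

where A is fixed by the design and \hat{M} is a random "meat" matrix at the center. The OC idea, as summarized above, keeps the outer structure but substitutes a nonrandom center M_0 chosen so the two estimators agree in expectation:

\hat{V}_{\mathrm{OC}} = A\,M_{0}\,A^{\top}, \qquad \mathbb{E}[\hat{V}_{\mathrm{OC}}] = \mathbb{E}[\hat{V}_{\mathrm{sw}}].

The conservative variant instead replaces a random matrix with its expectation in such a way that the result never underestimates the target, i.e. \mathbb{E}[\hat{V}_{\mathrm{cons}}] \ge \operatorname{Var}(\hat{\theta}).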

