Generalized maximum likelihood estimation of the mean of parameters of mixtures, with applications to sampling
Let f(y|θ), θ ∈ Ω, be a parametric family, η(θ) a given function, and G an unknown mixing distribution. It is desired to estimate E_G(η(θ)) ≡ η_G based on independent observations Y_1, ..., Y_n, where Y_i ∼ f(y|θ_i) and the θ_i ∼ G are i.i.d. We explore the Generalized Maximum Likelihood Estimator (GMLE) for this problem and establish some basic properties and representations of the estimator. In particular, we suggest a new perspective on the weak convergence result of Kiefer and Wolfowitz (1956), with implications for a corresponding setup in which θ_1, ..., θ_n are fixed parameters. We also relate the above problem, of estimating η_G, to non-parametric empirical Bayes estimation under squared loss. Applications of the GMLE to sampling problems are presented. The performance of the GMLE is demonstrated both in simulations and through a real data example.
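As an illustrative sketch of the setup, the GMLE of the mixing distribution G can be approximated by a nonparametric MLE supported on a fixed grid, with the mixing weights fitted by EM, after which η_G is estimated by the plug-in Σ_j ŵ_j η(θ_j). The Poisson kernel f(y|θ), the two-point choice of G, η(θ) = θ, and the grid/EM settings below are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate: theta_i ~ G (G puts mass 1/2 on each of 2 and 6, so E_G(theta) = 4),
# and Y_i ~ Poisson(theta_i). These choices are illustrative assumptions.
n = 2000
theta_true = rng.choice([2.0, 6.0], size=n)
y = rng.poisson(theta_true)

# Grid approximation to the NPMLE of G: candidate support points theta_j.
grid = np.linspace(0.1, 15.0, 150)

# Poisson log-likelihood matrix log f(y_i | theta_j), dropping the log(y_i!)
# term, which is constant per row and cancels when responsibilities normalize.
logL = y[:, None] * np.log(grid)[None, :] - grid[None, :]
logL -= logL.max(axis=1, keepdims=True)      # stabilize before exponentiating
L = np.exp(logL)

# EM iterations for the mixing weights w_j (uniform start).
w = np.full(len(grid), 1.0 / len(grid))
for _ in range(500):
    r = L * w                                # unnormalized responsibilities
    r /= r.sum(axis=1, keepdims=True)
    w = r.mean(axis=0)                       # M-step: updated mixing weights

# Plug-in estimate of eta_G = E_G(eta(theta)), here with eta(theta) = theta.
eta_hat = float(np.sum(w * grid))
print(eta_hat)                               # expected to be near the true value 4.0
```

With a reasonably fine grid and enough EM iterations, the plug-in estimate of E_G(θ) tracks the sample mean of the Y_i closely in this Poisson example, since the marginal mean of a Poisson mixture equals E_G(θ).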