On Predictive Density Estimation under α-divergence Loss

06/07/2018
by Aziz L'Moudden, et al.

Based on X ∼ N_d(θ, σ^2_X I_d), we study the efficiency of predictive densities under α-divergence loss L_α for estimating the density of Y ∼ N_d(θ, σ^2_Y I_d). We identify a large number of cases where improvements on a plug-in density are obtainable by expanding its variance, thus extending earlier findings applicable to Kullback-Leibler loss. The results and proofs are unified with respect to the dimension d, the variances σ^2_X and σ^2_Y, and the choice of loss L_α with α∈(-1,1). The findings also apply to a large class of plug-in densities, as well as to restricted parameter spaces with θ∈Θ⊂R^d. The theoretical findings are accompanied by various observations, illustrations, and implications concerning, for instance, robustness with respect to the model variances and simultaneous dominance with respect to the loss.
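As a concrete illustration (not taken from the paper), the following minimal Python sketch compares a plug-in density N_d(X, σ^2_Y I_d) with a variance-expanded alternative by Monte Carlo, under one common α-divergence convention, D_α(p, q) = (4/(1-α^2)) (1 - ∫ p(y)^{(1-α)/2} q(y)^{(1+α)/2} dy) for α∈(-1,1). The expanded variance σ^2_X + σ^2_Y used below matches the uniform-prior Bayes predictive density known from the Kullback-Leibler case; it is an assumption chosen for illustration, not the paper's general dominance condition.

import numpy as np
from scipy.stats import multivariate_normal

def alpha_divergence_mc(p, q, alpha, n=200_000, rng=None):
    # Monte Carlo estimate of D_alpha(p, q) for alpha in (-1, 1), using
    # D_alpha = 4/(1-alpha^2) * (1 - E_p[(q(Y)/p(Y))^{(1+alpha)/2}]).
    # This convention is an assumption; other parametrizations exist.
    rng = np.random.default_rng(rng)
    y = p.rvs(size=n, random_state=rng)
    ratio = np.exp(q.logpdf(y) - p.logpdf(y))
    return 4.0 / (1.0 - alpha**2) * (1.0 - np.mean(ratio ** ((1.0 + alpha) / 2.0)))

# Setup: observe X ~ N_d(theta, sX2 I_d); predict the density of Y ~ N_d(theta, sY2 I_d).
d = 3
theta = np.zeros(d)
sX2, sY2 = 1.0, 1.0
alpha = 0.0  # alpha = 0 gives twice the squared Hellinger distance
rng = np.random.default_rng(0)
x = rng.multivariate_normal(theta, sX2 * np.eye(d))

true_density = multivariate_normal(theta, sY2 * np.eye(d))
plug_in = multivariate_normal(x, sY2 * np.eye(d))           # plug-in N_d(X, sY2 I_d)
expanded = multivariate_normal(x, (sX2 + sY2) * np.eye(d))  # variance-expanded candidate (illustrative choice)

print("plug-in loss :", alpha_divergence_mc(true_density, plug_in, alpha, rng=1))
print("expanded loss:", alpha_divergence_mc(true_density, expanded, alpha, rng=1))

A single draw of X gives only one realization of the loss; averaging over many draws of X would approximate the frequentist risks whose comparison drives the dominance results described in the abstract.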
