A proof of consistency and model-selection optimality on the empirical Bayes method

05/26/2022
by   Dye SK Sato, et al.

We study the consistency and optimality of the maximum marginal likelihood estimate (MMLE) in hyperparameter inference for large-degree-of-freedom models. The main analyses are performed within the exponential family, where the natural parameters serve as hyperparameters. First, we prove the consistency of the MMLE for general linear models when estimating the scales of variance in the likelihood and the prior. The proof is independent of the ratio of the number of data to the number of model parameters, and it holds even when the associated regularized least-squares model-parameter estimate is ill-posed; that estimate is shown to be asymptotically unbiased. Second, we generalize the proof to other models with a finite number of hyperparameters. We find that the extensive properties of the cost functions in the exponential family generally yield the consistency of the MMLE for the likelihood hyperparameters. Furthermore, under a hypothetical asymptotic normality of the predictive distributions that also applies to non-exponential model families, we show that the MMLE asymptotically almost surely minimizes the Kullback–Leibler divergence between the prior and the true predictive distributions, even when the true data distribution lies outside the model space. Our proof validates the empirical Bayes method based on the hyperparameter MMLE in the asymptotics of many model parameters, and the same qualification extends to empirical-cross-entropy cross-validation.
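To make the setting concrete, the sketch below illustrates the first result in its simplest instance: a Gaussian general linear model whose two hyperparameters are the variance scales of the likelihood and the prior, estimated by maximizing the closed-form marginal likelihood (evidence). This is a generic evidence-maximization illustration, not the paper's construction; the names (make_problem, neg_log_evidence) and the choice of optimizer are ours, and the p > n setting is picked to echo the claim that consistency does not depend on the data-to-parameter ratio.

```python
# Minimal empirical-Bayes sketch (illustrative, not the paper's method).
# Model: y = A x + e with noise e ~ N(0, sigma2 * I) and prior x ~ N(0, tau2 * I),
# so the marginal law of the data is y ~ N(0, sigma2 * I + tau2 * A @ A.T).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def make_problem(n=200, p=400, sigma2=0.5, tau2=2.0):
    """Simulate a wide (p > n, hence ill-posed in x) linear problem;
    the two variance-scale hyperparameters remain identifiable."""
    A = rng.standard_normal((n, p)) / np.sqrt(p)
    x = np.sqrt(tau2) * rng.standard_normal(p)
    y = A @ x + np.sqrt(sigma2) * rng.standard_normal(n)
    return A, y

def neg_log_evidence(log_theta, A, y):
    """Negative log marginal likelihood of (sigma2, tau2), on the log scale
    so the optimizer can search unconstrained."""
    sigma2, tau2 = np.exp(log_theta)
    n = len(y)
    C = sigma2 * np.eye(n) + tau2 * (A @ A.T)   # marginal covariance of y
    L = np.linalg.cholesky(C)                   # C = L @ L.T
    alpha = np.linalg.solve(L, y)               # so that alpha @ alpha = y' C^{-1} y
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * (n * np.log(2 * np.pi) + logdet + alpha @ alpha)

A, y = make_problem()
res = minimize(neg_log_evidence, x0=np.log([1.0, 1.0]), args=(A, y), method="Nelder-Mead")
print("MMLE (sigma2, tau2):", np.exp(res.x))    # should approach the true (0.5, 2.0)
```

Rerunning this with growing n and p should show the recovered (sigma2, tau2) concentrating around their true values, which is the paper's consistency statement in miniature for this toy instance.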
