A Robust Consistent Information Criterion for Model Selection based on Empirical Likelihood

06/23/2020 ∙ by Chixiang Chen, et al.

Conventional likelihood-based information criteria for model selection rely on distributional assumptions about the data. For the complex data increasingly available in many scientific fields, however, specifying the underlying distribution is challenging, and existing criteria may be too limited to handle a variety of model selection problems. Here, we propose a robust and consistent model selection criterion based on the data-driven empirical likelihood function. In particular, this framework adopts plug-in estimators that can be obtained by solving external estimating equations, not limited to the empirical likelihood itself, which avoids potential computational convergence issues and allows versatile applications such as generalized linear models, generalized estimating equations, and penalized regressions. The formulation of the proposed criterion is initially derived from the asymptotic expansion of the marginal likelihood under a variable selection framework, but, more importantly, its consistent model selection property is established in a general context. Extensive simulation studies confirm that the proposal outperforms traditional model selection criteria. Finally, an application to the Atherosclerosis Risk in Communities Study illustrates the practical value of the proposed framework.
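The abstract pairs a plug-in estimator (obtained from external estimating equations) with an empirical-likelihood evaluation of each candidate model. The sketch below is a hypothetical illustration of that general idea, not the paper's actual criterion: for each candidate covariate subset it fits OLS as the plug-in estimator, evaluates the full set of score-type estimating equations at that fit, computes the empirical log-likelihood ratio via the standard dual Newton iteration, and adds a BIC-style size penalty. The function names, the penalty form, and the simulated data are all assumptions made for the example.

```python
import numpy as np

def el_log_ratio(g, n_iter=50):
    """Empirical log-likelihood ratio, ell = -sum_i log(1 + lam' g_i),
    where lam solves the EL dual equation sum_i g_i / (1 + lam' g_i) = 0.
    The dual objective is concave, so damped Newton ascent is used."""
    n, p = g.shape
    lam = np.zeros(p)
    for _ in range(n_iter):
        w = 1.0 + g @ lam
        grad = (g / w[:, None]).sum(axis=0)      # dual gradient
        hess = -(g.T * (1.0 / w**2)) @ g         # dual Hessian (neg. definite)
        if np.linalg.norm(grad) < 1e-10:
            break
        step = np.linalg.solve(hess, -grad)
        t = 1.0                                  # damp so all weights stay > 0
        while np.any(1.0 + g @ (lam + t * step) <= 1e-8):
            t *= 0.5
        lam = lam + t * step
    return -np.sum(np.log(1.0 + g @ lam))

def el_criterion(X, y, subset):
    """Hypothetical EL-based criterion: -2 * EL log-ratio + |S| * log(n).
    The plug-in estimator is OLS on the candidate subset; the EL part
    evaluates how well the submodel satisfies ALL estimating equations."""
    n = len(y)
    Xs = X[:, subset]
    beta = np.linalg.lstsq(Xs, y, rcond=None)[0]   # external plug-in estimator
    resid = y - Xs @ beta
    g = X * resid[:, None]                          # full score-type equations
    return -2.0 * el_log_ratio(g) + len(subset) * np.log(n)

# Simulated example: true model uses covariates 0 and 1 only.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))
y = 1.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(size=n)

crit_true = el_criterion(X, y, [0, 1])     # data-generating subset
crit_under = el_criterion(X, y, [0])       # omits an important covariate
crit_full = el_criterion(X, y, [0, 1, 2])  # includes a spurious covariate
print(crit_true, crit_under, crit_full)
```

Under this sketch, the underfitted model violates the omitted covariate's estimating equation, inflating the EL statistic, while the overfitted model pays only the size penalty; the true subset should typically score lowest.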


