Takeuchi's Information Criteria as a form of Regularization

03/13/2018 ∙ by Matthew Dixon, et al.

Takeuchi's Information Criteria (TIC) is a linearization of maximum likelihood estimator bias which shrinks the model parameters towards the maximum entropy distribution, even when the model is mis-specified. In statistical machine learning, L_2 regularization (a.k.a. ridge regression) also introduces a parameterized bias term with the goal of minimizing out-of-sample entropy, but it generally requires a numerical solver to find the regularization parameter. This paper presents a novel regularization approach based on TIC; the approach does not assume a data generation process and results in a higher entropy distribution through more efficient suppression of sample noise. The resulting objective function can be directly minimized to estimate and select the best model, without the need to tune a regularization parameter as in ridge regression. Numerical results on a synthetic high-dimensional dataset generated from a logistic regression model demonstrate superior model performance when using the TIC-based regularization over L_1 and L_2 penalty terms.
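To make the TIC penalty concrete, the sketch below computes the standard TIC for an unregularized logistic regression fit by Newton's method: TIC = -2·loglik + 2·tr(J⁻¹K), where J is the observed information (negative Hessian of the log-likelihood) and K is the empirical covariance of the per-sample score vectors. This illustrates only the classical criterion the paper builds on, not the paper's regularized objective; the data-generating parameters and sample sizes here are illustrative assumptions.

```python
import numpy as np

def fit_logistic(X, y, n_iter=50):
    """Fit a plain (unregularized) logistic regression by Newton's method."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (y - p)                       # score of the log-likelihood
        H = -(X * (p * (1 - p))[:, None]).T @ X    # Hessian of the log-likelihood
        w = w - np.linalg.solve(H, grad)           # Newton step
    return w

def tic(X, y, w):
    """Takeuchi's Information Criterion: -2*loglik + 2*tr(J^{-1} K)."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    eps = 1e-12
    loglik = np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    J = (X * (p * (1 - p))[:, None]).T @ X         # observed information
    S = X * (y - p)[:, None]                       # per-sample score vectors
    K = S.T @ S                                    # empirical score covariance
    penalty = np.trace(np.linalg.solve(J, K))      # tr(J^{-1} K); ~ #params if well-specified
    return -2.0 * loglik + 2.0 * penalty

# Synthetic logistic data (illustrative choice of dimensions and coefficients)
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
w_true = np.array([0.5, 1.0, -1.0])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

w_hat = fit_logistic(X, y)
score = tic(X, y, w_hat)
```

When the model is well specified, tr(J⁻¹K) is close to the number of parameters, so TIC reduces approximately to AIC; under mis-specification the trace term corrects the bias that a fixed 2·p penalty would miss, which is the property the paper exploits for regularization.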


