On the complexity of logistic regression models

03/01/2019
by Nicola Bulso, et al.

We investigate the complexity of logistic regression models, defined by counting the number of distinguishable distributions that the model can represent (Balasubramanian, 1997). We find that the complexity of logistic models with binary inputs depends not only on the number of parameters but also on the distribution of inputs, in a non-trivial way that standard treatments of complexity do not address. In particular, we observe that correlations among inputs induce effective dependencies among parameters, thus constraining the model and, consequently, reducing its complexity. We derive simple relations for the upper and lower bounds of the complexity. Furthermore, we show analytically that defining the model parameters on a finite support, rather than on the entire real axis, decreases the complexity in a manner that critically depends on the size of the domain. Based on our findings, we propose a novel model selection criterion that takes into account the entropy of the input distribution. We test our proposal on the problem of selecting the input variables of a logistic regression model in a Bayesian Model Selection framework. In our numerical tests, we find that, while the reconstruction errors of standard model selection approaches (AIC, BIC, ℓ_1 regularization) strongly depend on the sparsity of the ground truth, the reconstruction error of our method remains close to the minimum across all tested conditions of sparsity, data size, and strength of input correlations. Finally, in a simple and mathematically tractable case, we observe that with categorical rather than binary inputs, the contribution of the alphabet size to the complexity is very small compared to that of the parameter-space dimension. We further explore the issue by analysing the dataset of the "13 keys to the White House", a method for forecasting the outcomes of US presidential elections.
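The input-selection task described above can be illustrated with a minimal, self-contained sketch. This is not the paper's proposed criterion; it shows only the standard BIC baseline against which such methods are compared: an exhaustive search over input subsets of a logistic regression on synthetic binary inputs, scoring each subset by BIC = k·ln(n) − 2·ln(L̂). All variable names and the toy data-generating weights are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

def fit_logistic(X, y, iters=2000, lr=0.1):
    """Fit a logistic regression by gradient ascent on the log-likelihood;
    return the weights (intercept first) and the maximized log-likelihood."""
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)   # average log-likelihood gradient
    p = np.clip(1.0 / (1.0 + np.exp(-Xb @ w)), 1e-12, 1 - 1e-12)
    ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return w, ll

# Synthetic ground truth: of d binary inputs, only inputs 0 and 1 matter.
n, d = 500, 4
X = rng.integers(0, 2, size=(n, d)).astype(float)
true_w = np.array([2.0, -1.5, 0.0, 0.0])
p_true = 1.0 / (1.0 + np.exp(-(X @ true_w - 0.3)))
y = (rng.random(n) < p_true).astype(float)

# Score every non-empty input subset with BIC = k*ln(n) - 2*ln(L-hat).
best = None
for r in range(1, d + 1):
    for subset in combinations(range(d), r):
        _, ll = fit_logistic(X[:, subset], y)
        k = len(subset) + 1  # one weight per input, plus the intercept
        bic = k * np.log(n) - 2 * ll
        if best is None or bic < best[0]:
            best = (bic, subset)

print("BIC-selected inputs:", best[1])
```

With strong true weights and n = 500, the ln(n) penalty reliably rejects the two irrelevant inputs; the exhaustive search is only feasible here because d is tiny (2^d − 1 subsets).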

Related research

08/05/2014  Volumes of logistic regression models with applications to model selection
Logistic regression models with n observations and q linearly-independen...

11/09/2022  Single Parameter Inference of Non-sparse Logistic Regression Models
This paper infers a single parameter in non-sparse logistic regression m...

07/26/2022  Minimum Sample Size for Developing a Multivariable Prediction Model using Multinomial Logistic Regression
Multinomial logistic regression models allow one to predict the risk of ...

03/20/2012  Selection of tuning parameters in bridge regression models via Bayesian information criterion
We consider the bridge linear regression modeling, which can produce a s...

05/13/2015  Bootstrapped Adaptive Threshold Selection for Statistical Model Selection and Estimation
A central goal of neuroscience is to understand how activity in the nerv...

11/07/2018  Interpreting the Ising Model: The Input Matters
The Ising model is a widely used model for multivariate binary data. It ...

04/17/2019  Correlated Logistic Model With Elastic Net Regularization for Multilabel Image Classification
In this paper, we present correlated logistic (CorrLog) model for multil...
