SLOE: A Faster Method for Statistical Inference in High-Dimensional Logistic Regression

03/23/2021
by Steve Yadlowsky et al.

Logistic regression remains one of the most widely used tools in applied statistics, machine learning, and data science. However, in moderately high-dimensional problems, where the number of features d is a non-negligible fraction of the sample size n, the logistic regression maximum likelihood estimator (MLE), and statistical procedures based on the large-sample approximation of its distribution, behave poorly. Recently, Sur and Candès (2019) showed that these issues can be corrected by applying a new approximation of the MLE's sampling distribution in this high-dimensional regime. Unfortunately, these corrections are difficult to implement in practice, because they require an estimate of the signal strength, which is a function of the underlying parameters β of the logistic regression. To address this issue, we propose SLOE, a fast and straightforward approach to estimate the signal strength in logistic regression. The key insight of SLOE is that the Sur and Candès (2019) correction can be reparameterized in terms of the corrupted signal strength, which is a function only of the estimated parameters β̂. We propose an estimator for this quantity, prove that it is consistent in the relevant high-dimensional regime, and show that dimensionality correction using SLOE is accurate in finite samples. Compared to the existing ProbeFrontier heuristic, SLOE is conceptually simpler and orders of magnitude faster, making it suitable for routine use. We demonstrate the importance of routine dimensionality correction on the Heart Disease dataset from the UCI repository and in a genomics application using data from the UK Biobank. We provide an open source package for this method, available at <https://github.com/google-research/sloe-logistic>.
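To make the reparameterization concrete, the sketch below illustrates what a naive plug-in estimate of the corrupted signal strength might look like: fit the (near-)unregularized logistic MLE and take the empirical variance of the linear predictor x_iᵀβ̂, which depends only on the estimated parameters β̂. This is an assumption-laden illustration, not the estimator proposed in the paper and not the interface of the sloe-logistic package; the function name and the scikit-learn usage are ours.

```python
# Illustrative sketch only: a naive plug-in estimate of the "corrupted
# signal strength", i.e. the variance of the linear predictor x_i' beta_hat
# under the fitted (near-)unregularized logistic MLE. The SLOE paper proposes
# a consistent estimator of this quantity; this snippet is not that estimator
# and is not the sloe-logistic package API.
import numpy as np
from sklearn.linear_model import LogisticRegression

def corrupted_signal_strength_plugin(X, y):
    """Return a naive plug-in estimate of Var(x' beta_hat)."""
    # A very large C approximates the unpenalized MLE across sklearn versions.
    mle = LogisticRegression(penalty="l2", C=1e8, fit_intercept=False,
                             solver="lbfgs", max_iter=10_000)
    mle.fit(X, y)
    beta_hat = mle.coef_.ravel()
    eta = X @ beta_hat            # corrupted linear predictor x_i' beta_hat
    return float(np.var(eta))     # plug-in estimate of the corrupted signal strength

# Toy example in a moderately high-dimensional regime, d/n = 0.2.
rng = np.random.default_rng(0)
n, d = 1000, 200
X = rng.normal(size=(n, d)) / np.sqrt(d)   # scaled so x' beta is O(1)
beta = 2.0 * rng.normal(size=d)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))
print(corrupted_signal_strength_plugin(X, y))
```

In this regime the MLE overestimates the magnitude of β, so the plug-in variance of x_iᵀβ̂ exceeds the true signal strength; the point of SLOE is to estimate the corrupted quantity consistently and use it in the Sur and Candès (2019) correction.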

Related research

01/26/2018 · A note on "MLE in logistic regression with a diverging dimension"
This short note is to point the reader to notice that the proof of high ...

06/10/2019 · The Impact of Regularization on High-dimensional Logistic Regression
Logistic regression is commonly used for modeling dichotomous outcomes. ...

03/19/2018 · A modern maximum-likelihood theory for high-dimensional logistic regression
Every student in statistics or data science learns early on that when th...

05/29/2022 · A Conditional Randomization Test for Sparse Logistic Regression in High-Dimension
Identifying the relevant variables for a classification model with corre...

02/11/2020 · A Non-Intrusive Correction Algorithm for Classification Problems with Corrupted Data
A novel correction algorithm is proposed for multi-class classification ...

08/01/2020 · Two-step penalised logistic regression for multi-omic data with an application to cardiometabolic syndrome
Building classification models that predict a binary class label on the ...

06/05/2017 · The Likelihood Ratio Test in High-Dimensional Logistic Regression Is Asymptotically a Rescaled Chi-Square
Logistic regression is used thousands of times a day to fit data, predic...