Efficient and robust high-dimensional sparse logistic regression via nonlinear primal-dual hybrid gradient algorithms

11/30/2021
by Jérôme Darbon et al.

Logistic regression is a widely used statistical model for describing the relationship between a binary response variable and predictor variables in data sets. It is often used in machine learning to identify important predictor variables. This task, variable selection, typically amounts to fitting a logistic regression model regularized by a convex combination of ℓ_1 and ℓ_2^2 penalties. Since modern big data sets can contain hundreds of thousands to billions of predictor variables, variable selection methods depend on efficient and robust optimization algorithms to perform well. State-of-the-art algorithms for variable selection, however, were not traditionally designed to handle big data sets; they either scale poorly with the size of the data or are prone to producing unreliable numerical results. It therefore remains challenging to perform variable selection on big data sets without access to adequate and costly computational resources. In this paper, we propose a nonlinear primal-dual algorithm that addresses these shortcomings. Specifically, we propose an iterative algorithm that provably computes a solution to a logistic regression problem regularized by an elastic net penalty in O(T(m,n) log(1/ϵ)) operations, where ϵ ∈ (0,1) denotes the tolerance and T(m,n) denotes the number of arithmetic operations required to perform matrix-vector multiplication on a data set with m samples, each comprising n features. This result improves on the known complexity bound of O(min(m^2 n, m n^2) log(1/ϵ)) for first-order optimization methods such as the classic primal-dual hybrid gradient or forward-backward splitting methods.
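
The abstract names forward-backward splitting as one of the classic first-order baselines whose O(min(m^2 n, m n^2) log(1/ϵ)) complexity the proposed nonlinear primal-dual method improves upon. For reference, below is a minimal sketch of that baseline applied to the elastic-net-regularized logistic regression problem described above. It is not the paper's algorithm; the function name and parameters (lam for the overall penalty weight, alpha for the ℓ_1/ℓ_2^2 mix) are illustrative assumptions rather than notation from the paper.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid


def fbs_elastic_net_logreg(X, y, lam=0.1, alpha=0.5, max_iter=5000, tol=1e-8):
    """Forward-backward splitting for elastic-net logistic regression.

    Minimizes (1/m) * sum_i log(1 + exp(-y_i * <x_i, w>))
              + lam * (alpha * ||w||_1 + 0.5 * (1 - alpha) * ||w||_2^2),
    with labels y in {-1, +1}. Each iteration costs one matrix-vector
    product with X and one with X^T, i.e. T(m, n) operations.
    """
    m, n = X.shape
    w = np.zeros(n)
    # Step size 1/L, where L = ||X||_2^2 / (4m) bounds the Lipschitz
    # constant of the logistic-loss gradient.
    step = 4.0 * m / np.linalg.norm(X, 2) ** 2
    for _ in range(max_iter):
        # Forward (gradient) step on the smooth logistic loss.
        grad = -(X.T @ (y * expit(-y * (X @ w)))) / m
        v = w - step * grad
        # Backward (proximal) step on the elastic-net penalty:
        # soft-thresholding for the ell_1 term, then shrinkage
        # for the ell_2^2 term (both in closed form).
        w_next = np.sign(v) * np.maximum(np.abs(v) - step * lam * alpha, 0.0)
        w_next /= 1.0 + step * lam * (1.0 - alpha)
        if np.linalg.norm(w_next - w) <= tol * max(1.0, np.linalg.norm(w)):
            return w_next
        w = w_next
    return w
```

On a synthetic problem with, say, X of shape (200, 1000) and a 5-sparse ground-truth coefficient vector, calling fbs_elastic_net_logreg(X, y, lam=0.05, alpha=0.9) recovers a sparse estimate; the ℓ_1 term drives coefficients of irrelevant predictors to exactly zero, which is what makes this formulation a variable selection method.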
