A Greedy Homotopy Method for Regression with Nonconvex Constraints

10/27/2014
by Fabian L. Wauthier, et al.

Constrained least squares regression is an essential tool for high-dimensional data analysis. Given a partition G of input variables, this paper considers a particular class of nonconvex constraint functions that encourage the linear model to select a small number of variables from a small number of groups in G. Such constraints are relevant in many practical applications, such as Genome-Wide Association Studies (GWAS). Motivated by the efficiency of the Lasso homotopy method, we present RepLasso, a greedy homotopy algorithm that tries to solve the induced sequence of nonconvex problems by solving a sequence of suitably adapted convex surrogate problems. We prove that in some situations RepLasso recovers the global minima of the nonconvex problem. Moreover, even if it does not recover global minima, we prove that in relevant cases it will still do no worse than the Lasso in terms of support and signed support recovery, while in practice outperforming it. We show empirically that the strategy can also be used to improve over other Lasso-style algorithms. Finally, a GWAS of ankylosing spondylitis highlights our method's practical utility.
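The paper provides no code here, but since RepLasso builds on the Lasso homotopy (LARS-Lasso) path, the short Python sketch below illustrates that convex baseline: it traces the Lasso path on synthetic grouped data and, at each breakpoint, reports how many variables and how many groups of the partition G are active, the two quantities the nonconvex constraints are designed to keep small. The synthetic data, the five-groups-of-four partition, and all names are illustrative assumptions; this is not the authors' RepLasso algorithm.

# Minimal sketch (assumed setup, not the authors' RepLasso): trace the
# Lasso homotopy path that RepLasso's convex surrogates are modeled on.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
n, p = 100, 20
groups = np.repeat(np.arange(5), 4)    # hypothetical partition G: 5 groups of 4 variables
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]       # few nonzeros, all within group 0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Lasso homotopy path: coefficients as a piecewise-linear function of the
# regularization level alpha.
alphas, active, coefs = lars_path(X, y, method="lasso")

# At each breakpoint, count selected variables and selected groups --
# the quantities the paper's nonconvex constraints aim to keep small.
for alpha, beta in zip(alphas, coefs.T):
    support = np.flatnonzero(np.abs(beta) > 1e-10)
    n_groups = len(set(groups[support]))
    print(f"alpha={alpha:.3f}  |support|={len(support)}  groups used={n_groups}")

In this setup one can compare, along the path, the number of groups the plain Lasso touches against the handful that actually carry signal, which is the gap a group-aware nonconvex constraint is meant to close.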
