The CCP Selector: Scalable Algorithms for Sparse Ridge Regression from Chance-Constrained Programming
Sparse regression and variable selection for large-scale data have developed rapidly over the past decades. This work focuses on sparse ridge regression, which uses the exact L_0 norm to enforce sparsity. We lay a theoretical foundation for understanding why many existing approaches may not work well for this problem, particularly on large-scale datasets. By reformulating the problem as a chance-constrained program, we derive a novel mixed-integer second-order conic (MISOC) reformulation and prove that its continuous relaxation is equivalent to that of the convex integer formulation proposed in a recent work. Based on these two formulations, we develop two new scalable algorithms for sparse ridge regression, a greedy algorithm and a randomized algorithm, with desirable theoretical properties: both are proven to yield near-optimal solutions under mild conditions. For problems of much larger dimension, we propose integrating the two, so that the greedy algorithm searches for features within the nonzero support identified by the continuous relaxation of the MISOC formulation. The merits of the proposed methods are demonstrated through a set of numerical examples in comparison with several existing approaches.
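To make the setting concrete, the following is a minimal illustrative sketch (not the paper's algorithm) of a forward greedy heuristic for L_0-constrained ridge regression, min ||y - Xβ||² + λ||β||² subject to ||β||_0 ≤ k: at each step it adds the feature that most reduces the restricted ridge objective. All function names and the choice of λ are assumptions for illustration.

```python
import numpy as np

def ridge_objective(X, y, support, lam):
    """Solve ridge regression restricted to the given support and
    return (objective value, restricted coefficients)."""
    Xs = X[:, support]
    # Closed-form ridge solution on the support: (Xs'Xs + lam*I)^{-1} Xs'y
    beta = np.linalg.solve(Xs.T @ Xs + lam * np.eye(len(support)), Xs.T @ y)
    resid = y - Xs @ beta
    return resid @ resid + lam * beta @ beta, beta

def greedy_sparse_ridge(X, y, k, lam=1.0):
    """Illustrative forward greedy selection for sparse ridge regression:
    grow the support one feature at a time, each time picking the feature
    whose addition yields the smallest ridge objective."""
    n, p = X.shape
    support = []
    for _ in range(k):
        best_obj, best_j = np.inf, None
        for j in range(p):
            if j in support:
                continue
            obj, _ = ridge_objective(X, y, support + [j], lam)
            if obj < best_obj:
                best_obj, best_j = obj, j
        support.append(best_j)
    # Refit on the final support and embed into a length-p vector.
    _, beta_s = ridge_objective(X, y, support, lam)
    beta = np.zeros(p)
    beta[support] = beta_s
    return beta
```

Each greedy step costs one restricted ridge solve per candidate feature, which is what motivates the paper's strategy of first shrinking the candidate set via the continuous relaxation of the MISOC formulation before running the greedy search.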