Robust subset selection
The best subset selection (or "best subsets") estimator is a classic tool for sparse regression, and developments in mathematical optimization over the past decade have made it more computationally tractable than ever. Notwithstanding its desirable statistical properties, the best subsets estimator is susceptible to outliers and can break down in the presence of a single contaminated data point. To address this issue, we propose a robust adaptation of best subsets that is highly resistant to contamination in both the response and the predictors. Our estimator generalizes the notion of subset selection to both predictors and observations, thereby achieving robustness in addition to sparsity. This procedure, which we call "robust subset selection" (or "robust subsets"), is defined by a combinatorial optimization problem for which we apply modern discrete optimization methods. We formally establish the robustness of our estimator in terms of the finite-sample breakdown point of its objective value. In support of this result, we report experiments on both synthetic and real data that demonstrate the superiority of robust subsets over best subsets in the presence of contamination. Importantly, robust subsets fares competitively across several metrics compared with popular robust adaptations of the Lasso.
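The idea of selecting subsets of both predictors and observations can be illustrated with a small, self-contained sketch. This is not the paper's algorithm (which relies on modern discrete optimization solvers); it is a hedged toy version that enumerates predictor subsets exhaustively and, for each, trims observations via concentration steps in the style of least trimmed squares. The function name, signature, and parameters (`k` predictors kept, `h` observations kept) are illustrative assumptions, not part of the paper.

```python
import itertools
import numpy as np

def robust_subset_selection(X, y, k, h, n_steps=20):
    """Toy sketch (NOT the paper's method): minimize the least-squares
    loss over subsets of exactly k predictors and h observations, the
    remaining n - h observations being trimmed as potential outliers.

    Predictor subsets are enumerated by brute force; for each subset,
    concentration steps alternate between an OLS fit on the current h
    observations and re-selecting the h smallest squared residuals.
    """
    n, p = X.shape
    best = (np.inf, None, None)  # (objective, predictor subset, observation subset)
    for S in itertools.combinations(range(p), k):
        Xs = X[:, S]
        obs = np.arange(h)  # arbitrary initial observation subset
        for _ in range(n_steps):
            beta, *_ = np.linalg.lstsq(Xs[obs], y[obs], rcond=None)
            r2 = (y - Xs @ beta) ** 2
            new_obs = np.argsort(r2)[:h]
            if set(new_obs) == set(obs):
                break  # concentration steps have converged
            obs = new_obs
        obj = np.sort((y - Xs @ beta) ** 2)[:h].sum()
        if obj < best[0]:
            best = (obj, S, np.sort(obs))
    return best

# Toy usage: two true predictors plus one grossly contaminated response.
rng = np.random.default_rng(0)
n, p = 20, 4
X = rng.standard_normal((n, p))
y = X @ np.array([3.0, -2.0, 0.0, 0.0]) + 0.1 * rng.standard_normal(n)
y[0] += 50.0  # contaminate a single observation
obj, S, obs = robust_subset_selection(X, y, k=2, h=n - 1)
print(S, 0 in obs)  # selected predictors; whether the outlier survived trimming
```

In this example the trimmed fit both recovers the relevant predictors and discards the contaminated observation, whereas ordinary best subsets (the special case `h = n`) would be pulled toward the outlier. The exhaustive enumeration is exponential in `p`, which is precisely why the paper turns to discrete optimization methods for realistic problem sizes.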