Robust supervised classification and feature selection using a primal-dual method

02/05/2019
by   Michel Barlaud, et al.

This paper deals with supervised classification and feature selection in high-dimensional spaces. A classical approach is to project the data onto a low-dimensional space and classify by minimizing an appropriate quadratic cost. Strict control of sparsity is additionally obtained by adding an ℓ_1 constraint, here on the matrix of weights used to project the data; tuning the sparsity bound selects the features relevant to supervised classification. It is well known that a quadratic cost is not robust to outliers. We address this by using an ℓ_1 norm both for the constraint and for the loss function, in which case the criterion is convex but no longer gradient-Lipschitz. A second issue is that we optimize the projection matrix and the centers used for classification simultaneously. In this paper, we provide a novel constrained primal-dual method tailored to jointly compute the selected features and the classifiers. Extending our primal-dual method to other criteria is easy, provided that efficient projection (onto the dual ball, for the data-loss term) and prox (for the regularization term) algorithms are available; we illustrate such an extension with a Frobenius norm for the loss term. We provide a convergence proof of our primal-dual method and demonstrate its effectiveness on three datasets (one synthetic, two from biological data), on which we compare the ℓ_1 and ℓ_2 costs.
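The two operators the abstract singles out are standard and cheap to implement: the prox of an ℓ_1 constraint is Euclidean projection onto the ℓ_1 ball (computable by a sort-and-threshold scheme), and for an ℓ_1 loss the dual ball is the ℓ_∞ ball, whose projection is element-wise clipping. As a minimal illustration of these building blocks (not the authors' implementation; function names are ours), they can be sketched as:

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    """Euclidean projection of v onto the l1 ball of the given radius
    (sort-and-threshold scheme); serves as the prox of the l1 constraint."""
    if np.abs(v).sum() <= radius:
        return v.copy()  # already feasible
    u = np.sort(np.abs(v))[::-1]            # magnitudes, descending
    css = np.cumsum(u)
    # largest index k with u_k * (k+1) > css_k - radius
    k = np.nonzero(u * np.arange(1, v.size + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1)     # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def project_linf_ball(v, radius=1.0):
    """Projection onto the l_inf ball (dual ball of the l1 loss):
    simple element-wise clipping."""
    return np.clip(v, -radius, radius)
```

In a primal-dual iteration of the kind described, the dual variable of the ℓ_1 data term would be clipped with `project_linf_ball`, while the weight matrix would be projected row- or column-wise with `project_l1_ball` to enforce the sparsity bound.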

