High-Dimensional Penalized Bernstein Support Vector Machines

03/16/2023
by Rachid Kharoubi, et al.

The support vector machine (SVM) is a powerful classifier for binary classification with high prediction accuracy. However, the non-differentiability of the SVM hinge loss can lead to computational difficulties in high-dimensional settings. To overcome this problem, we rely on Bernstein polynomials and propose a new smoothed version of the SVM hinge loss, called the Bernstein support vector machine (BernSVM), which is suitable for the high-dimensional regime p >> n. Since the BernSVM objective loss function is of class C^2, we propose two efficient algorithms for computing the solution of the penalized BernSVM. The first is a coordinate descent algorithm based on the majorization-minimization (MM) principle, and the second is an IRLS-type (iteratively reweighted least squares) algorithm. Under standard assumptions, we derive a cone condition and a restricted strong convexity property to establish an upper bound for the weighted Lasso BernSVM estimator. Using a local linear approximation, we extend this result to penalized BernSVM with the nonconvex penalties SCAD and MCP. Our bound holds with high probability and achieves a rate of order √(s log(p)/n), where s is the number of active features. Simulation studies compare the prediction accuracy of BernSVM with that of its competitors, and the performance of the two algorithms in terms of computational time and estimation error. The use of the proposed method is illustrated through the analysis of three large-scale real data examples.
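To make the algorithmic idea concrete, here is a minimal, hypothetical Python sketch of an MM/IRLS-style update for a penalized smoothed-hinge SVM. The abstract does not give the Bernstein surrogate in closed form, so the sketch substitutes a generic quadratically smoothed (Huberized) hinge for the Bernstein construction, and a ridge penalty in place of the Lasso/SCAD/MCP penalties so that each step reduces to a single weighted least-squares solve. The names smoothed_hinge, mm_irls_step, delta, and lam are introduced here purely for illustration and are not the paper's implementation.

```python
import numpy as np

def smoothed_hinge(z, delta=0.5):
    """Quadratically smoothed (Huberized) hinge, a generic smooth surrogate of
    max(0, 1 - z). This stands in for the Bernstein-polynomial surrogate, whose
    closed form is not given in the abstract."""
    return np.where(z >= 1.0, 0.0,
                    np.where(z <= 1.0 - delta,
                             1.0 - z - delta / 2.0,
                             (1.0 - z) ** 2 / (2.0 * delta)))

def mm_irls_step(X, y, beta, lam, delta=0.5):
    """One MM/IRLS-style update for sum_i phi(y_i x_i' beta) + (lam/2)||beta||^2.
    The derivative of phi is (1/delta)-Lipschitz, so a quadratic with uniform
    curvature 1/delta majorizes the loss; minimizing it is a weighted LS solve."""
    z = y * (X @ beta)
    # phi'(z) for the surrogate above
    grad = np.where(z >= 1.0, 0.0,
                    np.where(z <= 1.0 - delta, -1.0, (z - 1.0) / delta))
    w = 1.0 / delta                      # uniform majorizer curvature
    r = X @ beta - (y * grad) / w        # working response
    A = w * (X.T @ X) + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, w * (X.T @ r))

# Toy usage on synthetic data with p > n; the penalized objective is
# non-increasing across iterations because each step minimizes a majorizer.
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:5] = 1.0
y = np.where(X @ beta_true + 0.1 * rng.standard_normal(n) > 0, 1.0, -1.0)
beta, lam = np.zeros(p), 1.0
for _ in range(30):
    beta = mm_irls_step(X, y, beta, lam)
    obj = smoothed_hinge(y * (X @ beta)).sum() + 0.5 * lam * beta @ beta
```

The uniform weight 1/delta is the global curvature bound of the surrogate, so each linear solve minimizes a true quadratic majorizer of the penalized objective; the paper's coordinate-descent MM and IRLS algorithms for Lasso, SCAD, and MCP penalties are more refined than this sketch.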

Related research

11/02/2011  Approximate Stochastic Subgradient Estimation Training for Support Vector Machines
Subgradient algorithms for training support vector machines have been qu...

01/06/2019  Solving large-scale L1-regularized SVMs and cousins: the surprising effectiveness of column and constraint generation
The linear Support Vector Machine (SVM) is one of the most popular binar...

09/28/2018  Learning Confidence Sets using Support Vector Machines
The goal of confidence-set learning in the binary classification setting...

07/15/2022  Support Vector Machines with the Hard-Margin Loss: Optimal Training via Combinatorial Benders' Cuts
The classical hinge-loss support vector machines (SVMs) model is sensiti...

02/09/2020  ℓ_0-Regularized High-dimensional Accelerated Failure Time Model
We develop a constructive approach for ℓ_0-penalized estimation in the s...

11/29/2018  Distributed Inference for Linear Support Vector Machine
The growing size of modern data brings many new challenges to existing s...

01/24/2015  Sparse Distance Weighted Discrimination
Distance weighted discrimination (DWD) was originally proposed to handle...
