Support Vector Machines for Additive Models: Consistency and Robustness

07/23/2010
by Andreas Christmann, et al.

Support vector machines (SVMs) are special kernel-based methods and have been among the most successful learning methods for more than a decade. SVMs can informally be described as regularized M-estimators for functions and have demonstrated their usefulness in many complicated real-life problems. In recent years, much of the statistical research on SVMs has concentrated on how to design SVMs such that they are universally consistent and statistically robust for nonparametric classification or nonparametric regression purposes. In many applications, some qualitative prior knowledge of the distribution P or of the unknown function f to be estimated is available, or a prediction function with good interpretability is desired, so that a semiparametric model or an additive model is of interest. In this paper we mainly address the question of how to design SVMs, by choosing the reproducing kernel Hilbert space (RKHS) or its corresponding kernel, to obtain consistent and statistically robust estimators in additive models. We give an explicit construction of kernels, and thus of their RKHSs, which, in combination with a Lipschitz continuous loss function, leads to consistent and statistically robust SVMs for additive models. Examples are quantile regression based on the pinball loss function, regression based on the epsilon-insensitive loss function, and classification based on the hinge loss function.
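The kernel construction described above can be illustrated with a short sketch: summing one-dimensional kernels, one per input coordinate, yields a kernel whose RKHS consists of additive functions. The sketch below is a minimal illustration under assumptions of my own choosing (Gaussian component kernels, hyperparameters `gamma` and `lam`, and a regularized least-squares fit standing in for an SVM with a Lipschitz loss); it is not the paper's exact construction.

```python
import numpy as np

def additive_gaussian_kernel(X, Z, gamma=1.0):
    """Gram matrix K[i, j] = sum_d exp(-gamma * (X[i, d] - Z[j, d])**2).

    Summing 1-D Gaussian kernels over coordinates gives an RKHS of
    additive functions f(x) = f_1(x_1) + ... + f_d(x_d).
    """
    diffs = X[:, None, :] - Z[None, :, :]          # shape (n, m, d)
    return np.exp(-gamma * diffs**2).sum(axis=-1)  # sum over coordinates

def fit_additive_krr(X, y, gamma=1.0, lam=1e-3):
    """Regularized kernel estimator in the additive RKHS.

    Uses a squared loss (kernel ridge regression) as an illustrative
    stand-in for the Lipschitz losses treated in the paper.
    """
    n = X.shape[0]
    K = additive_gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return lambda Xnew: additive_gaussian_kernel(Xnew, X, gamma) @ alpha

# Usage: synthetic data whose regression function is truly additive.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1]**2 + 0.1 * rng.normal(size=200)

predict = fit_additive_krr(X, y, gamma=2.0, lam=1e-3)
Xtest = rng.uniform(-1, 1, size=(50, 2))
mse = np.mean((predict(Xtest) - (np.sin(3 * Xtest[:, 0]) + Xtest[:, 1]**2))**2)
```

Because the estimate lives in the additive RKHS, each coordinate's contribution can be inspected separately, which is the interpretability advantage the abstract alludes to.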
