Fast rates for support vector machines using Gaussian kernels

08/14/2007
by Ingo Steinwart, et al.

For binary classification, we establish learning rates up to the order of n^-1 for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates hold under two assumptions on the underlying distribution: Tsybakov's noise assumption, which yields a small estimation error, and a new geometric noise condition, which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise condition does not involve any smoothness assumption.
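The first of these conditions has a standard form in the literature. Tsybakov's noise assumption with noise exponent q >= 0 requires that, for some constant C > 0,

    P_X\big(\{\, x : |2\eta(x) - 1| \le t \,\}\big) \le C\, t^q \quad \text{for all } t > 0,

where \eta(x) = P(y = 1 \mid x) is the posterior probability of the positive class; a larger q means the posterior rarely hovers near the threshold 1/2, which is what drives the estimation error down. The geometric noise condition instead controls how the mass of |2\eta - 1| concentrates near the decision boundary; the exact definition is given in the paper.

The learning setup the abstract studies can also be sketched in code. The following is a hedged illustration, not the paper's experiments: it trains a soft-margin SVM (which corresponds to the hinge loss) with a Gaussian RBF kernel via scikit-learn, where the dataset and parameter grid are illustrative assumptions. The theory chooses the kernel width and the regularization parameter as explicit functions of the sample size n and the noise exponents; in practice they are usually picked by cross-validation, as here.

    # Minimal sketch (illustrative, not from the paper): hinge-loss SVM with a
    # Gaussian RBF kernel. In scikit-learn, SVC solves the soft-margin (hinge
    # loss) problem, and gamma = 1/(2*sigma^2) sets the kernel width sigma.
    from sklearn.datasets import make_moons
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=500, noise=0.2, random_state=0)  # toy data (assumption)
    grid = {"C": [0.1, 1.0, 10.0, 100.0], "gamma": [0.1, 1.0, 10.0]}
    clf = GridSearchCV(SVC(kernel="rbf"), grid, cv=5).fit(X, y)
    print(clf.best_params_, clf.best_score_)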

Related research

09/14/2013  Local Support Vector Machines: Formulation and Analysis
We provide a formulation for Local Support Vector Machines (LSVMs) that ...

05/04/2019  Improved Classification Rates for Localized SVMs
One of the main characteristics of localized support vector machines tha...

08/08/2022  Towards Weak Information Theory: Weak-Joint Typicality Decoding Using Support Vector Machines May Lead to Improved Error Exponents
In this paper, the authors report a way to use concepts from statistical...

01/29/2022  Error Rates for Kernel Classification under Source and Capacity Conditions
In this manuscript, we consider the problem of kernel classification und...

04/14/2019  Probabilistic Kernel Support Vector Machines
We propose a probabilistic enhancement of standard kernel Support Vecto...

09/28/2018  Learning Confidence Sets using Support Vector Machines
The goal of confidence-set learning in the binary classification setting...

05/20/2005  Upgrading Pulse Detection with Time Shift Properties Using Wavelets and Support Vector Machines
Current approaches in pulse detection use domain transformations so as t...
