Near-Tight Margin-Based Generalization Bounds for Support Vector Machines

06/03/2020
by Allan Grønlund, et al.

Support Vector Machines (SVMs) are among the most fundamental tools for binary classification. In its simplest formulation, an SVM computes the hyperplane that separates two classes of data with the largest possible margin. The focus on maximizing the margin is well motivated by numerous generalization bounds. In this paper, we revisit and improve the classic margin-based generalization bounds. We further complement our new generalization bound with a nearly matching lower bound, thus almost settling the generalization performance of SVMs in terms of margins.
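For context, the optimization problem behind the simplest (hard-margin) SVM can be written as follows. This is standard textbook material rather than notation taken from the paper, and the bound sketched in the comments is only the classic flavor of margin bound that this line of work refines, with constants and logarithmic factors omitted.

```latex
% Hard-margin SVM: given training data (x_1, y_1), ..., (x_n, y_n)
% with labels y_i in {-1, +1}, find the maximum-margin hyperplane (w, b).
\[
\begin{aligned}
\min_{w,\,b}\quad & \tfrac{1}{2}\lVert w \rVert^2 \\
\text{s.t.}\quad  & y_i\bigl(\langle w, x_i \rangle + b\bigr) \ge 1,
                    \qquad i = 1, \dots, n.
\end{aligned}
\]
% The geometric margin of the solution is \gamma = 1 / \lVert w \rVert.
% Classic margin-based bounds (the kind this paper sharpens) state that,
% for data of radius R, with high probability and up to logarithmic factors,
\[
L_{\mathcal{D}}(h) \;\le\; \widehat{L}_{\gamma}(h)
  \;+\; \tilde{O}\!\left(\sqrt{\frac{R^2}{\gamma^2 n}}\right),
\]
% where \widehat{L}_{\gamma}(h) is the fraction of training points
% classified with margin below \gamma.
```

Per the abstract, the paper's contribution is to tighten this trade-off between the margin γ and the sample size n, and to give a nearly matching lower bound.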


Related research

Local Support Vector Machines: Formulation and Analysis (09/14/2013)
We provide a formulation for Local Support Vector Machines (LSVMs) that ...

A Unified Framework for Multiclass and Multilabel Support Vector Machines (03/25/2020)
We propose a novel integrated formulation for multiclass and multilabel ...

Generalization-Memorization Machines (07/08/2022)
Classifying the training data correctly without over-fitting is one of t...

Improving Generalization Bounds for VC Classes Using the Hypergeometric Tail Inversion (10/29/2021)
We significantly improve the generalization bounds for VC classes by usi...

Random Projections for Linear Support Vector Machines (11/26/2012)
Let X be a data matrix of rank ρ, whose rows represent n points in d-dim...

Consistent Structured Prediction with Max-Min Margin Markov Networks (07/02/2020)
Max-margin methods for binary classification such as the support vector ...

PAC-Bayesian Supervised Classification: The Thermodynamics of Statistical Learning (12/03/2007)
This monograph deals with adaptive supervised classification, using tool...
