Comparison theorems on large-margin learning

08/13/2019
by Jun Fan, et al.

This paper studies the binary classification problem associated with a family of loss functions called large-margin unified machines (LUMs), which offer a natural bridge between distribution-based likelihood approaches and margin-based approaches. LUM losses can also overcome the so-called data piling issue of the support vector machine in the high-dimension, low-sample-size setting. We establish new comparison theorems for all LUM loss functions, which play a key role in the further error analysis of large-margin learning algorithms.
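For context, a LUM loss with index a > 0 and parameter c >= 0 is usually written as V(u) = 1 - u for u < c/(1+c) and V(u) = (1/(1+c)) * (a / ((1+c)u - c + a))^a otherwise, where u = y f(x) is the functional margin. The sketch below only illustrates this standard parameterization; the function name lum_loss and its default arguments are illustrative and not taken from the paper.

    import numpy as np

    def lum_loss(u, a=1.0, c=1.0):
        """Large-margin unified machine (LUM) loss at margins u = y * f(x).

        Illustrative helper assuming the standard LUM parameterization
        with index a > 0 and c >= 0; not code from the paper.
        """
        u = np.asarray(u, dtype=float)
        threshold = c / (1.0 + c)
        # Hinge-like linear part for small margins.
        linear = 1.0 - u
        # Smoothly decaying tail for large margins; clip the argument so the
        # branch that is not selected does not produce invalid values.
        denom = np.maximum((1.0 + c) * u - c + a, 1e-12)
        tail = (1.0 / (1.0 + c)) * (a / denom) ** a
        return np.where(u < threshold, linear, tail)

Under this parameterization, a = 1 and c = 1 recover the distance-weighted discrimination (DWD) loss, while letting c tend to infinity recovers the SVM hinge loss, which is the sense in which LUMs bridge distribution-based and margin-based approaches.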
