Randomized Kernel Methods for Least-Squares Support Vector Machines

03/22/2017
by M. Andrecut, et al.

The least-squares support vector machine (LS-SVM) is a frequently used kernel method for non-linear regression and classification tasks. Here we discuss several approximation algorithms for the LS-SVM classifier. The proposed methods are based on randomized block kernel matrices, and we show that they provide good accuracy and reliable scaling for multi-class classification problems with relatively large data sets. We also present several numerical experiments that illustrate the practical applicability of the proposed methods.
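To make the idea concrete, here is a hedged sketch of the general technique the abstract describes: approximating the kernel matrix from a randomly sampled block of its columns (a Nyström-style construction) and then solving the LS-SVM training problem as a regularized least-squares system in the resulting low-dimensional feature space. This is an illustrative reconstruction under standard assumptions, not the paper's exact algorithm; the landmark count `m`, regularization `lam`, and RBF width `gamma` are placeholder choices.

```python
# Sketch: LS-SVM-style classification with a randomized (Nystrom) kernel block.
# Illustrative only; not the authors' exact method.
import numpy as np

def rbf(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, Z, gamma):
    # Approximate feature map phi(x) = k(x, Z) W^{-1/2}, with W = k(Z, Z),
    # so that phi(X) phi(X)^T ~= the full kernel matrix K.
    W = rbf(Z, Z, gamma)
    s, U = np.linalg.eigh(W)
    s = np.maximum(s, 1e-10)  # guard against tiny/negative eigenvalues
    W_inv_half = U @ np.diag(1.0 / np.sqrt(s)) @ U.T
    return rbf(X, Z, gamma) @ W_inv_half

# Toy two-class problem: labels +/-1 by which half-plane a point lies in.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

m, lam, gamma = 20, 1e-3, 1.0
# Random block of m kernel columns, via m randomly chosen landmark points.
Z = X[rng.choice(len(X), size=m, replace=False)]
Phi = nystrom_features(X, Z, gamma)
# LS-SVM training reduces to a regularized least-squares solve in this
# m-dimensional approximate feature space.
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)
pred = np.sign(Phi @ w)
acc = (pred == y).mean()
```

The randomized block enters through the landmark sample `Z`: only an n-by-m slab of the kernel matrix is ever formed, so the linear solve is m-by-m instead of n-by-n, which is what gives the scaling behavior the abstract refers to.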

