Stability of the Stochastic Gradient Method for an Approximated Large Scale Kernel Machine

04/21/2018
by   Aven Samareh, et al.

In this paper we measured the stability of the stochastic gradient method (SGM) for learning an approximated Fourier primal support vector machine. The stability of an algorithm is assessed by measuring the generalization error, defined as the absolute difference between the test and the training error. Our problem is to learn an approximated kernel function using random Fourier features for a binary classification problem in an online convex optimization setting. For a convex, Lipschitz-continuous, and smooth loss function, the stochastic gradient method is stable given a reasonable number of iterations. We showed that, with high probability, SGM generalizes well for an approximated kernel under the given assumptions. We empirically verified the theoretical findings for different parameters using several data sets.
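The setup described in the abstract can be sketched in code: map the inputs through random Fourier features so that inner products approximate an RBF kernel, then run SGM on a regularized primal SVM objective in the feature space. The sketch below is an illustration under assumed parameters (bandwidth `gamma`, feature count `D`, step-size schedule), not the paper's exact experimental configuration; a squared hinge loss is used as one example of a convex, Lipschitz, smooth loss satisfying the stated assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data: two Gaussian blobs, labels in {-1, +1}.
n, d = 400, 5
X = np.vstack([rng.normal(-1, 1, (n // 2, d)), rng.normal(1, 1, (n // 2, d))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

# Random Fourier features approximating the RBF kernel exp(-gamma * ||x - x'||^2):
# frequencies are drawn from the kernel's spectral density N(0, 2*gamma*I), so that
# z(x) . z(x') ~= k(x, x') (Rahimi-Recht construction).
D, gamma = 300, 0.1
W = rng.normal(0.0, np.sqrt(2 * gamma), (d, D))
b = rng.uniform(0, 2 * np.pi, D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# SGM on the L2-regularized squared hinge loss (a smooth primal SVM surrogate):
# at each step sample one example and take a gradient step with a decaying rate.
lam, T = 0.01, 5000
w = np.zeros(D)
for t in range(1, T + 1):
    i = rng.integers(n)
    eta = 0.1 / np.sqrt(t)                   # decaying step size (assumed schedule)
    margin = y[i] * (Z[i] @ w)
    slack = max(0.0, 1.0 - margin)
    grad = lam * w - 2.0 * slack * y[i] * Z[i]
    w -= eta * grad

train_acc = np.mean(np.sign(Z @ w) == y)
print(f"training accuracy: {train_acc:.3f}")
```

Comparing this training accuracy with the accuracy on a held-out split gives the empirical generalization gap whose behavior the paper studies.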


Related research:

- 09/03/2015: Train faster, generalize better: Stability of stochastic gradient descent. We show that parametric models trained by a stochastic gradient method (...
- 12/13/2018: Tight Analyses for Non-Smooth Stochastic Gradient Descent. Consider the problem of minimizing functions that are Lipschitz and stro...
- 06/29/2017: Feature uncertainty bounding schemes for large robust nonlinear SVM classifiers. We consider the binary classification problem when data are large and su...
- 07/26/2019: Scalable Semi-Supervised SVM via Triply Stochastic Gradients. Semi-supervised learning (SSL) plays an increasingly important role in t...
- 05/01/2022: Ridgeless Regression with Random Features. Recent theoretical studies illustrated that kernel ridgeless regression ...
- 06/26/2018: Speeding Up Budgeted Stochastic Gradient Descent SVM Training with Precomputed Golden Section Search. Limiting the model size of a kernel support vector machine to a pre-defi...
- 03/16/2022: A Multi-parameter Updating Fourier Online Gradient Descent Algorithm for Large-scale Nonlinear Classification. Large scale nonlinear classification is a challenging task in the field ...
