Multi-Merge Budget Maintenance for Stochastic Gradient Descent SVM Training

06/26/2018
by Sahar Qaadan, et al.

Budgeted Stochastic Gradient Descent (BSGD) is a state-of-the-art technique for training large-scale kernelized support vector machines. The budget constraint is maintained incrementally by merging two points whenever the pre-defined budget is exceeded. The process of finding suitable merge partners is costly; it can account for up to 45% of the training time. In this paper we investigate computationally more efficient schemes that merge more than two points at once. We obtain significant speed-ups without sacrificing accuracy.
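
To make the idea concrete, the sketch below maintains a fixed-size pool of support vectors and, whenever the budget is exceeded, merges several of them at once into a single weighted point. This is a minimal, hypothetical Python sketch: the class name Budget, the parameter merge_size, and the merge rule used here (a coefficient-weighted mean of the m support vectors nearest to the newest one, with the coefficients summed) are illustrative assumptions, not the paper's exact multi-merge scheme.

import numpy as np


class Budget:
    """Fixed-size pool of support vectors with multi-merge maintenance
    (illustrative sketch, not the paper's exact algorithm)."""

    def __init__(self, budget_size, merge_size=3):
        self.budget_size = budget_size  # maximum number of support vectors
        self.merge_size = merge_size    # number of points merged at once (m > 2)
        self.vectors = []               # support vectors
        self.alphas = []                # their coefficients

    def add(self, x, alpha):
        """Insert a new support vector; trigger a merge if the budget is exceeded."""
        self.vectors.append(np.asarray(x, dtype=float))
        self.alphas.append(float(alpha))
        if len(self.vectors) > self.budget_size:
            self._multi_merge()

    def _multi_merge(self):
        """Merge the m support vectors closest to the newest one into a single
        coefficient-weighted point (a crude stand-in for the real merge rule)."""
        X = np.stack(self.vectors)
        a = np.asarray(self.alphas)
        # distances from the newest vector to all older ones
        d = np.linalg.norm(X[:-1] - X[-1], axis=1)
        nearest = np.argsort(d)[: self.merge_size - 1]
        group = np.concatenate([nearest, [len(self.vectors) - 1]])
        w = np.abs(a[group]) + 1e-12              # weights for the merged point
        merged_x = np.average(X[group], axis=0, weights=w)
        merged_alpha = a[group].sum()             # preserve total coefficient mass
        keep = [i for i in range(len(self.vectors)) if i not in set(group.tolist())]
        self.vectors = [X[i] for i in keep] + [merged_x]
        self.alphas = [float(a[i]) for i in keep] + [float(merged_alpha)]

In a BSGD-style training loop, budget.add(x, alpha) would be called whenever an update creates a new support vector; merging m points instead of two reduces how often the costly search for merge partners has to run.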


Related research

06/26/2018
Speeding Up Budgeted Stochastic Gradient Descent SVM Training with Precomputed Golden Section Search
Limiting the model size of a kernel support vector machine to a pre-defi...

05/03/2019
Performance Optimization on Model Synchronization in Parallel Stochastic Gradient Descent Based SVM
Understanding the bottlenecks in implementing stochastic gradient descen...

06/20/2023
Sampling from Gaussian Process Posteriors using Stochastic Gradient Descent
Gaussian processes are a powerful framework for quantifying uncertainty ...

12/03/2020
SSGD: A safe and efficient method of gradient descent
With the vigorous development of artificial intelligence technology, var...

07/03/2018
On the Computational Power of Online Gradient Descent
We prove that the evolution of weight vectors in online gradient descent...

03/30/2017
Diving into the shallows: a computational perspective on large-scale shallow learning
In this paper we first identify a basic limitation in gradient descent-b...

01/26/2020
LiteMORT: A memory efficient gradient boosting tree system on adaptive compact distributions
Gradient boosted decision trees (GBDT) is the leading algorithm for many...
