Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors

06/14/2018
by Atsushi Nitanda, et al.

We consider stochastic gradient descent for binary classification problems in a reproducing kernel Hilbert space. In the conventional analysis, the expected classification error is known to converge more slowly than the expected risk, even under a low-noise condition on the conditional label probabilities, so the resulting rate is at best sublinear. It is therefore natural to ask whether a much faster convergence rate of the expected classification error is achievable. Recent work established an exponential convergence rate for stochastic gradient descent under a strong low-noise condition, but the analysis was limited to the squared loss function, which is somewhat inadequate for binary classification tasks. In this paper, we show an exponential convergence rate of the expected classification error in the final phase of learning for a wide class of differentiable convex loss functions under similar assumptions.
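As a rough illustration of the setting, the following Python sketch runs stochastic gradient descent on the logistic loss (one example of the differentiable convex losses the paper covers) in the RKHS induced by a Gaussian kernel, keeping the iterate as a kernel expansion. The kernel choice, step-size schedule, and toy data below are illustrative assumptions, not the paper's exact algorithm, constants, or experimental setup.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel inducing the RKHS in this sketch."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def kernel_sgd(X, y, T=1000, eta=0.5, sigma=1.0):
    """Run T steps of SGD on the logistic loss in the RKHS.

    The iterate f_t is stored as a kernel expansion
    f_t(x) = sum_i alpha_i * k(x_i, x) over the sampled points.
    Labels y are assumed to be in {-1, +1}.
    """
    rng = np.random.default_rng(0)
    support, alphas = [], []
    for t in range(1, T + 1):
        i = rng.integers(len(X))                      # draw one sample
        x_t, y_t = X[i], y[i]
        f_xt = sum(a * gaussian_kernel(s, x_t, sigma)
                   for a, s in zip(alphas, support))  # current prediction f_t(x_t)
        # Derivative of the logistic loss l(v) = log(1 + exp(-y v)) w.r.t. v = f(x_t)
        grad = -y_t / (1.0 + np.exp(y_t * f_xt))
        # Functional gradient step: f <- f - eta_t * grad * k(x_t, .)
        eta_t = eta / np.sqrt(t)                      # illustrative step-size schedule
        support.append(x_t)
        alphas.append(-eta_t * grad)
    return support, alphas

def predict(support, alphas, x, sigma=1.0):
    """Classify by the sign of the learned function."""
    score = sum(a * gaussian_kernel(s, x, sigma) for a, s in zip(alphas, support))
    return 1 if score >= 0 else -1

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
support, alphas = kernel_sgd(X, y, T=500)
print(predict(support, alphas, np.array([1.0, 1.0])))  # likely +1
```

Under the paper's strong low-noise condition, the interesting quantity is how fast the expected classification error of sign(f_t) decays in the final phase of such a run, rather than the decay of the surrogate risk itself.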

