Optimal Rates for Learning with Nyström Stochastic Gradient Methods

by Junhong Lin et al.
Istituto Italiano di Tecnologia

In the setting of nonparametric regression, we propose and study a combination of stochastic gradient methods with Nyström subsampling, allowing multiple passes over the data and mini-batches. We provide generalization error bounds for the studied algorithm. In particular, we derive optimal learning rates for different choices of the step size, the mini-batch size, the number of iterations/passes, and the subsampling level. Compared with state-of-the-art algorithms such as classic stochastic gradient methods and kernel ridge regression with Nyström subsampling, the studied algorithm has computational advantages while achieving the same optimal learning rates. Moreover, our results indicate that using mini-batches can reduce the total computational cost while preserving the same optimal statistical guarantees.
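To make the setup concrete, the following is a minimal sketch (not the paper's implementation) of the kind of algorithm the abstract describes: the regression function is represented over a small set of Nyström-subsampled centers, and its coefficients are fit by mini-batch SGD with multiple passes over the data. All names, the Gaussian kernel choice, and the parameter values (`m`, `step`, `batch`, `passes`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, Y, sigma=0.5):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Synthetic 1-D regression data (assumed for illustration only).
n = 500
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Nyström subsampling: keep m << n uniformly sampled centers,
# so the hypothesis is f(x) = sum_j alpha_j k(x, c_j).
m = 50
centers = X[rng.choice(n, size=m, replace=False)]
K = gaussian_kernel(X, centers)  # n x m feature matrix, computed once

# Mini-batch SGD with multiple passes over the data.
alpha = np.zeros(m)
step, batch, passes = 0.05, 25, 20
for _ in range(passes):
    for idx in np.array_split(rng.permutation(n), n // batch):
        residual = K[idx] @ alpha - y[idx]
        alpha -= step * (K[idx].T @ residual) / len(idx)

mse = np.mean((K @ alpha - y) ** 2)
```

The computational point of the abstract shows up directly here: each update touches only a `batch x m` slice of the kernel matrix, so the per-iteration cost scales with the subsampling level `m` rather than with `n`.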




