One-pass Stochastic Gradient Descent in Overparametrized Two-layer Neural Networks

05/01/2021
by Jiaming Xu et al.

There has been a recent surge of interest in understanding the convergence of gradient descent (GD) and stochastic gradient descent (SGD) in overparameterized neural networks. Most previous works assume that the training data is provided a priori as a batch, while less attention has been paid to the important setting where the training data arrives in a stream. In this paper, we study the streaming-data setup and show that, with overparameterization and random initialization, the prediction error of two-layer neural networks under one-pass SGD converges in expectation. The convergence rate depends on the eigen-decomposition of the integral operator associated with the so-called neural tangent kernel (NTK). A key step in our analysis is to show that a random kernel function converges to the NTK with high probability, using the VC dimension and McDiarmid's inequality.
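To make the setting concrete, below is a minimal sketch of the training procedure the abstract describes: one-pass SGD on an overparameterized two-layer ReLU network with random initialization and NTK-style 1/sqrt(m) output scaling, where each streaming sample is used for exactly one gradient step. The width, step size, target function, and unit-sphere input distribution are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m = 10, 4096      # input dimension; hidden width m >> d (overparameterized)
eta = 0.1            # constant step size (illustrative)
T = 10_000           # length of the data stream; each sample is used exactly once

# Random initialization: Gaussian first-layer weights, fixed +/-1 output weights.
W = rng.standard_normal((m, d))
a = rng.choice([-1.0, 1.0], size=m)

def predict(W, x):
    """Two-layer ReLU network with 1/sqrt(m) output scaling (NTK regime)."""
    return a @ np.maximum(W @ x, 0.0) / np.sqrt(m)

def target(x):
    """Placeholder ground-truth function generating the stream (illustrative)."""
    return np.sin(x[0]) + 0.5 * x[1]

for t in range(T):
    # Streaming setup: draw a fresh sample and take one SGD step on it.
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)                # inputs normalized to the unit sphere
    y = target(x)

    err = predict(W, x) - y               # residual for the squared loss (err^2)/2
    active = (W @ x > 0.0).astype(float)  # ReLU derivative per hidden unit
    # Gradient step on the first-layer weights; output weights stay fixed.
    W -= eta * err * ((a * active)[:, None] * x[None, :]) / np.sqrt(m)
```

Because each sample is consumed once and discarded, the iteration count coincides with the number of observed data points, which is what distinguishes this one-pass regime from batch GD/SGD with repeated passes.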

Related research

06/22/2020  Optimal Rates for Averaged Stochastic Gradient Descent under Neural Tangent Kernel Regime
    We analyze the convergence of the averaged stochastic gradient descent f...

07/28/2022  One-Pass Learning via Bridging Orthogonal Gradient Descent and Recursive Least-Squares
    While deep neural networks are capable of achieving state-of-the-art per...

06/01/2021  The Gaussian equivalence of generative models for learning with shallow neural networks
    Understanding the impact of data structure on the computational tractabi...

03/26/2018  A Provably Correct Algorithm for Deep Learning that Actually Works
    We describe a layer-by-layer algorithm for training deep convolutional n...

12/13/2014  The Statistics of Streaming Sparse Regression
    We present a sparse analogue to stochastic gradient descent that is guar...

05/01/2022  Ridgeless Regression with Random Features
    Recent theoretical studies illustrated that kernel ridgeless regression ...
