Learning One-hidden-layer ReLU Networks via Gradient Descent

06/20/2018
by Xiao Zhang, et al.

We study the problem of learning one-hidden-layer neural networks with the Rectified Linear Unit (ReLU) activation function, where the inputs are sampled from the standard Gaussian distribution and the outputs are generated by a noisy teacher network. We analyze the performance of gradient descent for training such networks via empirical risk minimization, and provide algorithm-dependent guarantees. In particular, we prove that tensor initialization followed by gradient descent converges to the ground-truth parameters at a linear rate, up to some statistical error. To the best of our knowledge, this is the first work to characterize a recovery guarantee for practical learning of one-hidden-layer ReLU networks with multiple neurons. Numerical experiments verify our theoretical findings.
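To make the teacher-student setup concrete, here is a minimal sketch in Python/NumPy: inputs are drawn from a standard Gaussian, labels come from a noisy teacher network, and plain gradient descent is run on the empirical squared risk. The random initialization, fixed second-layer signs, dimensions, step size, and noise level are illustrative assumptions for this sketch; the paper's actual algorithm starts from a tensor initialization instead.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem sizes and noise level (illustrative choices, not from the paper)
d, k, n = 10, 5, 5000    # input dimension, hidden neurons, samples
sigma = 0.1              # label noise standard deviation

# Teacher network: y = sum_j v_j * ReLU(w_j^T x) + noise
W_star = rng.normal(size=(k, d))
v_star = rng.choice([-1.0, 1.0], size=k)     # fixed second-layer signs
X = rng.normal(size=(n, d))                  # standard Gaussian inputs
y = np.maximum(X @ W_star.T, 0) @ v_star + sigma * rng.normal(size=n)

# Student: gradient descent on the empirical squared risk over W
# (random init here, whereas the paper uses tensor initialization)
W = 0.5 * rng.normal(size=(k, d))
eta = 0.05
for t in range(500):
    H = X @ W.T                              # pre-activations, shape (n, k)
    resid = np.maximum(H, 0) @ v_star - y    # residuals, shape (n,)
    # Gradient of (1/2n) * sum_i resid_i^2 w.r.t. W, using ReLU'(z) = 1{z > 0}
    grad = ((resid[:, None] * (H > 0)) * v_star).T @ X / n
    W -= eta * grad

print("final empirical risk:", 0.5 * np.mean((np.maximum(X @ W.T, 0) @ v_star - y) ** 2))
```

Under the paper's guarantee, with a good enough initialization the iterates contract toward the teacher's parameters at a linear rate until the statistical error floor set by the noise is reached; with the random initialization above, convergence to the ground truth is not guaranteed.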


Related research

02/26/2017 · Globally Optimal Gradient Descent for a ConvNet with Gaussian Inputs
Deep learning models are often successfully trained using gradient desce...

09/18/2017 · When is a Convolutional Filter Easy To Learn?
We analyze the convergence of (stochastic) gradient descent algorithm fo...

07/09/2020 · Learning Over-Parametrized Two-Layer ReLU Neural Networks beyond NTK
We consider the dynamic of gradient descent for learning a two-layer neu...

08/04/2022 · Agnostic Learning of General ReLU Activation Using Gradient Descent
We provide a convergence analysis of gradient descent for the problem of...

06/20/2018 · Learning ReLU Networks via Alternating Minimization
We propose and analyze a new family of algorithms for training neural ne...

07/17/2019 · On the geometry of solutions and on the capacity of multi-layer neural networks with ReLU activations
Rectified Linear Units (ReLU) have become the main model for the neural ...

10/31/2016 · Tensor Switching Networks
We present a novel neural network algorithm, the Tensor Switching (TS) n...
