Learning a Single Neuron with Bias Using Gradient Descent

06/02/2021
by Gal Vardi, et al.

We theoretically study the fundamental problem of learning a single neuron with a bias term (𝐱 ↦ σ(⟨𝐰, 𝐱⟩ + b)) in the realizable setting with the ReLU activation, using gradient descent. Perhaps surprisingly, we show that this is a significantly different and more challenging problem than the bias-less case (which was the focus of previous works on single neurons), both in terms of the optimization geometry and in terms of the ability of gradient methods to succeed in some scenarios. We provide a detailed study of this problem, characterizing the critical points of the objective, demonstrating failure cases, and providing positive convergence guarantees under different sets of assumptions. To prove our results, we develop some tools which may be of independent interest, and improve previous results on learning single neurons.
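
As a rough illustration of the setup described above (not the paper's exact construction or analysis), the sketch below runs plain gradient descent on the squared loss for a single ReLU neuron with a bias term in a realizable setting, where labels come from a "teacher" neuron. The Gaussian inputs, step size, initialization, and teacher parameters are illustrative assumptions.

```python
# Minimal sketch: gradient descent for a single ReLU neuron with bias,
# x -> relu(<w, x> + b), on labels generated by a teacher neuron
# (realizable setting). Input distribution, step size, and initialization
# below are assumptions for illustration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 1000

# Teacher parameters (assumed) and realizable labels.
w_star = rng.standard_normal(d)
b_star = 0.5
X = rng.standard_normal((n, d))
y = np.maximum(X @ w_star + b_star, 0.0)

# Student parameters, trained jointly (weights and bias) by gradient descent
# on the half mean squared error.
w = 0.1 * rng.standard_normal(d)
b = 0.0
lr = 0.1

for _ in range(2000):
    pre = X @ w + b                    # pre-activations <w, x_i> + b
    pred = np.maximum(pre, 0.0)        # ReLU outputs
    err = pred - y
    active = (pre > 0).astype(float)   # ReLU (sub)gradient indicator
    grad_w = (err * active) @ X / n    # gradient of 0.5 * mean squared error w.r.t. w
    grad_b = np.mean(err * active)     # gradient w.r.t. the bias b
    w -= lr * grad_w
    b -= lr * grad_b

print("final half-MSE:", 0.5 * np.mean((np.maximum(X @ w + b, 0.0) - y) ** 2))
```

Whether this joint dynamic over (𝐰, b) converges to the teacher is precisely the kind of question the abstract addresses, with both failure cases and convergence guarantees depending on the assumptions.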

Related research

Learning a Single Neuron with Gradient Methods (01/15/2020)
We consider the fundamental problem of learning a single neuron 𝐱 ↦ σ(𝐰^⊤𝐱...

Agnostic Learning of General ReLU Activation Using Gradient Descent (08/04/2022)
We provide a convergence analysis of gradient descent for the problem of...

Agnostic Learning of a Single Neuron with Gradient Descent (05/29/2020)
We consider the problem of learning the best-fitting single neuron as me...

Understanding How Over-Parametrization Leads to Acceleration: A case of learning a single teacher neuron (10/04/2020)
Over-parametrization has become a popular technique in deep learning. It...

Learning-to-Learn Stochastic Gradient Descent with Biased Regularization (03/25/2019)
We study the problem of learning-to-learn: inferring a learning algorith...

Plateau Phenomenon in Gradient Descent Training of ReLU networks: Explanation, Quantification and Avoidance (07/14/2020)
The ability of neural networks to provide `best in class' approximation ...

On Gradient Descent Convergence beyond the Edge of Stability (06/08/2022)
Gradient Descent (GD) is a powerful workhorse of modern machine learning...