Noise-induced degeneration in online learning

08/24/2020
by Yuzuru Sato, et al.

To elucidate the plateau phenomena caused by vanishing gradients, we analyse the stability of stochastic gradient descent dynamics near degenerate subspaces in a multi-layer perceptron. We show that, in the Fukumizu-Amari model, attracting regions exist within the degenerate subspace, and that a novel type of strong plateau phenomenon emerges as a noise-induced effect, making learning much slower than under deterministic gradient descent dynamics. The noise-induced degeneration observed here is expected to arise in a broad class of online learning in perceptrons.
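The degenerate subspace in question consists of parameter configurations where two hidden units carry identical weights, so the network collapses to an effectively smaller model. A minimal sketch in NumPy (an illustrative toy, not the paper's exact setup: the teacher network, dimensions, and learning rate below are all assumptions) shows why such a subspace is invariant under the deterministic part of online gradient descent, which is the precondition for the attracting regions discussed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5                                  # input dimension (assumed)
teacher = rng.standard_normal((2, d))  # hypothetical teacher weights

def forward(W, x):
    # Soft-committee output: sum of tanh hidden units, as in
    # Fukumizu-Amari-type two-layer perceptron analyses.
    return np.tanh(W @ x).sum()

def sgd_step(W, x, y, lr=0.05):
    # One online gradient step on the squared error for a single example.
    h = np.tanh(W @ x)
    err = forward(W, x) - y
    grad = err * (1 - h**2)[:, None] * x[None, :]
    return W - lr * grad

# Initialise ON the degenerate subspace: both hidden rows identical.
W = np.tile(rng.standard_normal(d), (2, 1))

for _ in range(200):
    x = rng.standard_normal(d)
    y = np.tanh(teacher @ x).sum()
    W = sgd_step(W, x, y)

# Identical rows receive identical gradients, so the deterministic
# dynamics never leave the subspace w1 == w2.
print(np.allclose(W[0], W[1]))  # True
```

Because both rows see the same gradient at every step, only a symmetry-breaking perturbation can move the weights off the subspace; the paper's point is that gradient noise near such subspaces can instead stabilise them, producing the strong plateaus.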


