Variance Suppression: Balanced Training Process in Deep Learning

11/20/2018
by Tao Yi, et al.

Stochastic gradient descent updates parameters with a summed gradient computed from a random mini-batch of data. When the available data are imbalanced, this summation leads to an unbalanced training process. To address this issue, this paper takes both the error variance and the error mean into consideration, and gives an approach for adaptively adjusting the trade-off between the two terms. Because the algorithm suppresses error variance, we name it Variance Suppression Gradient Descent (VSSGD). Experimental results demonstrate that VSSGD can accelerate the training process, effectively prevent overfitting, and improve a network's capacity to learn from small samples.
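As a minimal sketch of the idea (not the paper's exact formulation, which is not given in the abstract), one can descend an objective that combines a mini-batch's mean error with the variance of its per-sample errors. The fixed trade-off coefficient `lam`, the toy regression data, and the step size below are all illustrative assumptions; the paper adapts this trade-off during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression data (illustrative only)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

def vssgd_step(w, Xb, yb, lr=0.01, lam=0.1):
    """One variance-suppressed gradient step on a mini-batch.

    Sketch objective:  L = mean(e_i) + lam * var(e_i),
    where e_i = (x_i . w - y_i)^2 is the per-sample error.
    The variance term penalizes batches whose errors are uneven,
    pushing updates toward a more balanced training process.
    """
    n = len(yb)
    r = Xb @ w - yb                     # residuals, shape (n,)
    e = r ** 2                          # per-sample errors
    de = 2.0 * r[:, None] * Xb          # d e_i / d w, shape (n, d)
    g_mean = de.mean(axis=0)            # gradient of the mean term
    # grad of var(e): (2/n) * sum_i (e_i - mean(e)) * de_i
    g_var = (2.0 / n) * ((e - e.mean())[:, None] * de).sum(axis=0)
    return w - lr * (g_mean + lam * g_var)

w = np.zeros(3)
for epoch in range(200):
    idx = rng.permutation(len(y))
    for start in range(0, len(y), 32):
        b = idx[start:start + 32]
        w = vssgd_step(w, X[b], y[b])

print(np.round(w, 2))  # should approach w_true
```

With `lam = 0` this reduces to ordinary mini-batch SGD on the mean error; increasing `lam` trades some mean error for more uniform per-sample errors across the batch.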
