Importance Sampling for Stochastic Gradient Descent in Deep Neural Networks

03/29/2023
by Thibault Lahire, et al.

Stochastic gradient descent samples the training set uniformly to build an unbiased gradient estimate from a limited number of samples. However, at a given step of the training process, some data are more helpful than others for continued learning. Importance sampling for training deep neural networks has been widely studied, with the aim of proposing sampling schemes that outperform uniform sampling. After recalling the theory of importance sampling for deep learning, this paper reviews the challenges inherent to this research area. In particular, we propose a metric for assessing the quality of a given sampling scheme, and we study the interplay between the sampling scheme and the optimizer used.
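The mechanism the abstract describes is easy to sketch: draw examples from a non-uniform importance distribution p and reweight each sampled gradient by 1/(N p_i), so the estimate keeps the same expectation as the full-batch gradient. The NumPy snippet below is a minimal illustration of that reweighting on a toy least-squares problem; the loss-proportional sampling heuristic, the model, and all names are assumptions made for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem (illustrative assumption, not the paper's setup).
N, d = 1000, 5
X = rng.normal(size=(N, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=N)

def per_sample_grads(w, idx):
    """Gradients of 0.5 * (x_i @ w - y_i)**2 for the sampled indices."""
    residuals = X[idx] @ w - y[idx]      # shape (B,)
    return residuals[:, None] * X[idx]   # shape (B, d)

w = np.zeros(d)
lr, batch_size, steps = 0.05, 32, 200
for _ in range(steps):
    # Importance distribution: proportional to current per-sample loss
    # (one common heuristic; per-sample gradient norms are another choice).
    losses = 0.5 * (X @ w - y) ** 2
    p = losses / losses.sum()
    p = 0.9 * p + 0.1 / N                # mix with uniform to avoid p_i ~ 0

    idx = rng.choice(N, size=batch_size, p=p)
    grads = per_sample_grads(w, idx)

    # Unbiasedness: weight each sampled gradient by 1 / (N * p_i), so the
    # batch mean has the same expectation as the full-batch gradient.
    weights = 1.0 / (N * p[idx])
    grad_est = (weights[:, None] * grads).mean(axis=0)
    w -= lr * grad_est

print("distance to w_true:", np.linalg.norm(w - w_true))
```

With p uniform, the weights are all 1 and this reduces to plain SGD; skewing p toward high-loss examples concentrates updates on them while the 1/(N p_i) weights preserve unbiasedness.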

Related research

Stochastic Optimization with Importance Sampling (01/13/2014)
Uniform sampling of training data has been commonly used in traditional ...

How Important is Importance Sampling for Deep Budgeted Training? (10/27/2021)
Long iterative training processes for Deep Neural Networks (DNNs) are co...

Comparing Sample-wise Learnability Across Deep Neural Network Models (01/08/2019)
Estimating the relative importance of each sample in a training set has ...

Adaptive Importance Sampling for Estimation in Structured Domains (01/16/2013)
Sampling is an important tool for estimating large, complex sums and int...

Parameterization-Independent Importance Sampling of Environment Maps (08/23/2022)
Environment maps with high dynamic range lighting, such as daylight sky ...

AutoAssist: A Framework to Accelerate Training of Deep Neural Networks (05/08/2019)
Deep neural networks have yielded superior performance in many applicati...

FIS-GAN: GAN with Flow-based Importance Sampling (10/06/2019)
Generative Adversarial Networks (GAN) training process, in most cases, a...
