How Important is Importance Sampling for Deep Budgeted Training?

10/27/2021
by Eric Arazo, et al.

Long iterative training processes for Deep Neural Networks (DNNs) are commonly required to achieve state-of-the-art performance in many computer vision tasks. Importance sampling approaches might play a key role in budgeted training regimes, i.e. when the number of training iterations is limited. These approaches aim to dynamically estimate the importance of each sample in order to focus training on the most relevant ones and speed up convergence. This work explores this paradigm and how a budget constraint interacts with importance sampling approaches and data augmentation techniques. We show that under budget restrictions, importance sampling approaches do not provide a consistent improvement over uniform sampling. We suggest that, given a specific budget, the best course of action is to disregard sample importance and introduce adequate data augmentation; e.g. with suitable augmentation, reducing the budget to 30% of the full training maintains accuracy, while importance sampling does not. We conclude from our work that DNNs under budget restrictions benefit greatly from variety in the training set, and that finding the right samples to train on is not the most effective strategy when balancing high performance with low computational requirements. Source code available at https://git.io/JKHa3 .
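To make the contrast concrete, below is a minimal sketch of the sampling step the abstract refers to: drawing a minibatch either uniformly or with probability proportional to each sample's current loss, a common proxy for importance in these approaches. The toy losses and function name are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-sample losses standing in for a DNN's current loss on each example.
losses = rng.exponential(scale=1.0, size=1000)

def sample_batch(losses, batch_size, importance=True, rng=rng):
    """Draw a minibatch of indices, either uniformly or with probability
    proportional to each sample's current loss (an importance proxy)."""
    n = len(losses)
    if importance:
        p = losses / losses.sum()
    else:
        p = np.full(n, 1.0 / n)
    return rng.choice(n, size=batch_size, replace=False, p=p)

uniform_batch = sample_batch(losses, 64, importance=False)
importance_batch = sample_batch(losses, 64, importance=True)

# High-loss examples are over-represented under importance sampling,
# which is what is hoped to speed up convergence under a budget.
print(losses[importance_batch].mean(), losses[uniform_batch].mean())
```

The paper's finding is that, under a fixed iteration budget, biasing the draw this way does not consistently beat the uniform branch once adequate data augmentation is in place.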

