Selective sampling for accelerating training of deep neural networks

11/16/2019
by Berry Weinstein, et al.

We present a selective sampling method designed to accelerate the training of deep neural networks. To this end, we introduce a novel measurement, the minimal margin score (MMS), which measures the minimal displacement an input must undergo before its predicted classification switches. For multi-class linear classification, the MMS is a natural generalization of the margin-based selection criterion, which has been thoroughly studied in the binary classification setting. In addition, the MMS provides an interesting insight into the progress of the training process and can be useful for designing and monitoring new training regimes. Empirically, we demonstrate a substantial acceleration when training commonly used deep neural network architectures on popular image classification tasks. The efficiency of our method is compared against standard training procedures and against commonly used selective sampling alternatives: hard negative mining and entropy-based selection. Finally, we demonstrate an additional speedup when we adopt a more aggressive learning-rate drop regime while using MMS selective sampling.
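To make the MMS concrete, here is a minimal sketch for the multi-class linear case described above. For logits z = Wx + b with predicted class y, the distance from x to the decision boundary separating y from another class j is (z_y - z_j) / ||W_y - W_j||, and the MMS is the minimum of these distances over all j ≠ y. The function name and variables are illustrative, not taken from the paper's code.

```python
import numpy as np

def minimal_margin_score(W: np.ndarray, b: np.ndarray, x: np.ndarray) -> float:
    """Distance from x to the nearest decision boundary of the
    linear classifier (W, b) -- a sketch of the MMS idea."""
    z = W @ x + b                       # class scores, shape (num_classes,)
    y = int(np.argmax(z))               # predicted class
    dists = [
        (z[y] - z[j]) / np.linalg.norm(W[y] - W[j])
        for j in range(len(z)) if j != y
    ]
    return min(dists)
```

A selective-sampling loop would then score a candidate pool with this function and pick the examples with the smallest MMS, i.e., those closest to a decision boundary, for the next training batch.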


Related research

- Pairwise Margin Maximization for Deep Neural Networks (10/09/2021)
- Margin-Based Regularization and Selective Sampling in Deep Neural Networks (09/13/2020)
- Accelerating Deep Learning by Focusing on the Biggest Losers (10/02/2019)
- Neural Networks beyond explainability: Selective inference for sequence motifs (12/23/2022)
- Stop Overcomplicating Selective Classification: Use Max-Logit (06/17/2022)
- AutoAssist: A Framework to Accelerate Training of Deep Neural Networks (05/08/2019)
- Gumbel-Softmax Selective Networks (11/19/2022)
