Margin-Based Regularization and Selective Sampling in Deep Neural Networks

09/13/2020
by Berry Weinstein, et al.

We derive a new margin-based regularization formulation, termed multi-margin regularization (MMR), for deep neural networks (DNNs). MMR is inspired by principles applied in the margin analysis of shallow linear classifiers, such as the support vector machine (SVM). Unlike the SVM margin, MMR is continuously scaled by the radius of the bounding sphere (i.e., the maximal norm of the feature vectors in the data), which changes constantly during training. We empirically demonstrate that, via a simple supplement to the loss function, our method achieves better results on various classification tasks across domains. Using the same concept, we also derive a selective sampling scheme and demonstrate accelerated training of DNNs by selecting samples according to a minimal margin score (MMS). This score measures the minimal amount of displacement an input should undergo until its predicted classification is switched. We evaluate our proposed methods on three image classification tasks and six language text classification tasks. Specifically, we show improved empirical results on CIFAR10, CIFAR100 and ImageNet using state-of-the-art convolutional neural networks (CNNs), and on the MNLI, QQP, QNLI, MRPC, SST-2 and RTE benchmarks using the BERT-BASE architecture.
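The abstract does not give the exact MMS formula, but for a classifier whose final layer is linear, the displacement needed to flip the prediction between the top class and any other class has a closed form, and the MMS can be approximated as the minimum such distance. The sketch below illustrates this idea under that linear-last-layer assumption; the function name and signature are illustrative, not the authors' implementation.

```python
import numpy as np

def minimal_margin_score(features, W, b):
    """Approximate the minimal margin score (MMS) of one sample.

    Assumes the network's final layer is linear with weights W
    (num_classes x feature_dim) and biases b. For such a layer, the
    distance from the feature vector to the decision boundary between
    the predicted class c and another class j is
        (z_c - z_j) / ||W_c - W_j||,
    and the MMS is the minimum of this quantity over all j != c.
    A small MMS means the sample sits close to a decision boundary.
    """
    z = W @ features + b          # class logits
    c = int(np.argmax(z))         # predicted class
    distances = [
        (z[c] - z[j]) / np.linalg.norm(W[c] - W[j])
        for j in range(len(z)) if j != c
    ]
    return min(distances)
```

In a selective-sampling loop, one would score a candidate pool with this function and train on the samples with the smallest scores, i.e., those closest to switching their predicted class.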


