Easy Batch Normalization

07/18/2022
by Arip Asadulaev et al.

It has been shown that adversarial examples can improve object recognition. But what about their opposite, easy examples? Easy examples are samples that a machine learning model classifies correctly with high confidence. In our paper, we take a first step toward exploring the potential benefits of using easy examples in the training of neural networks. We propose using an auxiliary batch normalization for easy examples to improve both standard and robust accuracy.
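
The abstract does not spell out the mechanism, but it parallels the auxiliary batch normalization that AdvProp applies to adversarial examples. Below is a minimal PyTorch sketch assuming easy examples are routed to a second BN layer with its own statistics; the names `DualBatchNorm2d` and `is_easy_batch`, the per-batch routing flag, and the 0.9 confidence threshold are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DualBatchNorm2d(nn.Module):
    """Keeps two sets of BN statistics: a main one and an auxiliary one.

    Hypothetical sketch: the routing flag and module name are
    illustrative assumptions, not the authors' code.
    """
    def __init__(self, num_features):
        super().__init__()
        self.bn_main = nn.BatchNorm2d(num_features)  # regular/hard examples
        self.bn_easy = nn.BatchNorm2d(num_features)  # auxiliary BN for easy examples

    def forward(self, x, is_easy=False):
        # Route the batch through the BN whose statistics match its source.
        return self.bn_easy(x) if is_easy else self.bn_main(x)

def is_easy_batch(logits, labels, threshold=0.9):
    # Assumed criterion: a batch counts as "easy" if every sample is
    # classified correctly with confidence above the threshold.
    probs = torch.softmax(logits, dim=1)
    conf, pred = probs.max(dim=1)
    return bool(((pred == labels) & (conf > threshold)).all())
```

Keeping separate normalization statistics for the two example populations avoids mixing their feature distributions, which is the usual motivation for auxiliary BN.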

Related research

08/13/2021 · Datasets for Studying Generalization from Easy to Hard Examples
We describe new datasets for studying generalization from easy to hard e...

04/26/2022 · On Fragile Features and Batch Normalization in Adversarial Training
Modern deep learning architectures utilize batch normalization (BN) to st...

05/27/2018 · Towards a Theoretical Understanding of Batch Normalization
Normalization techniques such as Batch Normalization have been applied v...

02/10/2017 · Batch Renormalization: Towards Reducing Minibatch Dependence in Batch-Normalized Models
Batch Normalization is quite effective at accelerating and improving the...

05/24/2017 · Properties of Normalization for a math based intermediate representation
The Normalization transformation plays a key role in the compilation of ...

08/03/2023 · Hard Adversarial Example Mining for Improving Robust Fairness
Adversarial training (AT) is widely considered the state-of-the-art tech...

02/21/2018 · Batch Normalization and the impact of batch structure on the behavior of deep convolution networks
Batch normalization was introduced in 2015 to speed up training of deep ...
