On Feature Normalization and Data Augmentation

02/25/2020
by Boyi Li, et al.

Modern neural network training relies heavily on data augmentation for improved generalization. After the initial success of label-preserving augmentations, there has been a recent surge of interest in label-perturbing approaches, which combine features and labels across training samples to smooth the learned decision surface. In this paper, we propose a new augmentation method that leverages the first and second moments extracted and re-injected by feature normalization. We replace the moments of the learned features of one training image by those of another, and also interpolate the target labels. As our approach is fast, operates entirely in feature space, and mixes different signals than prior methods, one can effectively combine it with existing augmentation methods. We demonstrate its efficacy across benchmark data sets in computer vision, speech, and natural language processing, where it consistently improves the generalization performance of highly competitive baseline networks.
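The core operation described above — normalizing one sample's features, re-injecting the first and second moments of another sample, and interpolating the labels — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `moment_exchange`, the per-position moments over the channel axis (in the spirit of positional normalization), and the mixing weight `lam` are assumptions made for the example.

```python
import numpy as np

def moment_exchange(feat_a, feat_b, labels_a, labels_b, lam=0.9, eps=1e-5):
    """Illustrative sketch: swap the mean/std of feat_a's features for
    those of feat_b, and interpolate the one-hot labels.

    feat_a, feat_b: feature maps of shape (C, H, W).
    labels_a, labels_b: one-hot label vectors.
    lam: interpolation weight for the labels (assumed hyperparameter).
    """
    # First and second moments over the channel axis at each spatial
    # position (an assumption; other normalization axes are possible).
    mu_a = feat_a.mean(axis=0, keepdims=True)
    sigma_a = feat_a.std(axis=0, keepdims=True)
    mu_b = feat_b.mean(axis=0, keepdims=True)
    sigma_b = feat_b.std(axis=0, keepdims=True)

    # Normalize A's features, then re-inject B's moments.
    mixed = (feat_a - mu_a) / (sigma_a + eps) * sigma_b + mu_b

    # Interpolate the target labels.
    mixed_labels = lam * labels_a + (1.0 - lam) * labels_b
    return mixed, mixed_labels
```

Because the exchange happens entirely in feature space, a sketch like this would typically be applied to intermediate activations inside the network rather than to raw inputs, which is also why it composes with input-space augmentations.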

