Implicit Semantic Data Augmentation for Deep Networks

09/26/2019
by   Yulin Wang, et al.

In this paper, we propose a novel implicit semantic data augmentation (ISDA) approach to complement traditional augmentation techniques like flipping, translation, or rotation. Our work is motivated by the intriguing property that deep networks are surprisingly good at linearizing features, such that certain directions in the deep feature space correspond to meaningful semantic transformations, e.g., adding sunglasses or changing backgrounds. As a consequence, translating training samples along many semantic directions in the feature space can effectively augment the dataset and improve generalization. To implement this idea effectively and efficiently, we first perform an online estimate of the covariance matrix of deep features for each class, which captures the intra-class semantic variations. Then random vectors are drawn from a zero-mean normal distribution with the estimated covariance to augment the training data in that class. Importantly, instead of augmenting the samples explicitly, we directly minimize an upper bound of the expected cross-entropy (CE) loss on the augmented training set, leading to a highly efficient algorithm. In fact, we show that the proposed ISDA amounts to minimizing a novel robust CE loss, which adds negligible extra computational cost to a normal training procedure. Despite its simplicity, ISDA consistently improves the generalization performance of popular deep models (ResNets and DenseNets) on a variety of datasets, e.g., CIFAR-10, CIFAR-100, and ImageNet. Code for reproducing our results is available at https://github.com/blackfeather-wang/ISDA-for-Deep-Networks.
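The abstract describes replacing explicit feature-space augmentation with a closed-form surrogate: a CE loss whose non-target logits are inflated by a quadratic term involving the per-class covariance. The sketch below illustrates that idea for a single sample with a linear classifier; the function name, argument shapes, and the scalar `lam` (the augmentation-strength hyperparameter) are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def isda_loss(feature, label, W, b, sigma, lam):
    """Sketch of the ISDA-style robust CE loss for one sample.

    feature: (d,) deep feature vector
    label:   int, ground-truth class index
    W, b:    (C, d) classifier weights and (C,) biases
    sigma:   (d, d) estimated covariance of class `label`'s features
    lam:     augmentation strength; lam = 0 recovers standard CE
    """
    logits = W @ feature + b                     # (C,) standard logits
    delta = W - W[label]                         # (C, d) w_j - w_y for each class j
    # Closed-form effect of Gaussian feature augmentation: a quadratic
    # penalty (zero for the true class, since delta[label] = 0).
    penalty = 0.5 * lam * np.einsum('cd,de,ce->c', delta, sigma, delta)
    aug_logits = logits + penalty
    # Numerically stable cross-entropy on the augmented logits.
    aug_logits = aug_logits - aug_logits.max()
    log_probs = aug_logits - np.log(np.exp(aug_logits).sum())
    return -log_probs[label]
```

Because the covariance is positive semi-definite, the penalty is non-negative and only raises the non-target logits, so the surrogate upper-bounds the plain CE loss; setting `lam = 0` reduces it exactly to standard cross-entropy.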
