Data Interpolating Prediction: Alternative Interpretation of Mixup

06/20/2019
by Takuya Shimada, et al.

Data augmentation by mixing samples, such as Mixup, has been widely used, typically for classification tasks. However, this strategy is not always effective because of the gap between the augmented samples used for training and the original samples used for testing. This gap may prevent a classifier from learning the optimal decision boundary and thus increase the generalization error. To overcome this problem, we propose an alternative framework called Data Interpolating Prediction (DIP). Unlike common data augmentation methods, we encapsulate the sample-mixing process within the hypothesis class of the classifier, so that training and test samples are treated identically. We derive a generalization bound and show that DIP helps to reduce the original Rademacher complexity. We also empirically demonstrate that DIP can outperform the existing Mixup.
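To make the idea concrete, the sketch below shows one way the prediction rule described in the abstract could look. This is not the authors' implementation: the function name, the NumPy base model, and all hyperparameters are assumptions. Rather than training on mixed samples as Mixup does, the classifier itself averages the base model's outputs over inputs interpolated with randomly drawn reference samples, and this same averaging is applied at both training and test time.

```python
import numpy as np

rng = np.random.default_rng(0)

def dip_predict(base_model, x, ref_batch, alpha=1.0, n_samples=8):
    """Monte Carlo estimate of E_{x', lam}[ base_model(lam*x + (1-lam)*x') ].

    base_model: callable mapping an input vector to class scores (assumed).
    ref_batch:  array of reference samples to interpolate with, shape [n, d].
    alpha:      parameter of the Beta(alpha, alpha) mixing distribution.
    """
    preds = []
    for _ in range(n_samples):
        lam = rng.beta(alpha, alpha)                     # mixing coefficient, as in Mixup
        x_ref = ref_batch[rng.integers(len(ref_batch))]  # randomly drawn reference sample
        preds.append(base_model(lam * x + (1.0 - lam) * x_ref))
    return np.mean(preds, axis=0)                        # average over interpolated inputs
```

Because the same interpolation enters both the training objective and the test-time prediction, the train/test gap described above does not arise; the price is the extra Monte Carlo sampling at inference.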

Related research

10/29/2020 · Self-paced Data Augmentation for Training Neural Networks
  Data augmentation is widely used for machine learning; however, an effec...

03/14/2023 · DualMix: Unleashing the Potential of Data Augmentation for Online Class-Incremental Learning
  Online Class-Incremental (OCI) learning has sparked new approaches to ex...

10/14/2019 · Rethinking Data Augmentation: Self-Supervision and Self-Distillation
  Data augmentation techniques, e.g., flipping or cropping, which systemat...

09/19/2019 · Data Augmentation Revisited: Rethinking the Distribution Gap between Clean and Augmented Data
  Data augmentation has been widely applied as an effective methodology to...

11/20/2022 · Feature Weaken: Vicinal Data Augmentation for Classification
  Deep learning usually relies on training large-scale data samples to ach...

08/02/2021 · Generalization bounds for nonparametric regression with β-mixing samples
  In this paper we present a series of results that permit to extend in a ...

05/17/2023 · Infinite Class Mixup
  Mixup is a widely adopted strategy for training deep networks, where add...
