Epsilon Consistent Mixup: An Adaptive Consistency-Interpolation Tradeoff
In this paper we propose ϵ-Consistent Mixup (ϵmu). ϵmu is a data-based structural regularization technique that combines Mixup's linear interpolation with consistency regularization in the Mixup direction, via a simple adaptive tradeoff between the two. This learnable combination of consistency and interpolation induces a more flexible structure on the evolution of the response across the feature space and is shown to improve semi-supervised classification accuracy on the SVHN and CIFAR-10 benchmark datasets, yielding the largest gains in the most challenging low label-availability scenarios. Empirical studies comparing ϵmu and Mixup are presented and provide insight into the mechanisms behind ϵmu's effectiveness. In particular, ϵmu is found to produce more accurate synthetic labels and more confident predictions than Mixup.