Understanding Overfitting in Adversarial Training via Kernel Regression

04/13/2023
by Teng Zhang et al.

Adversarial training and data augmentation with noise are widely adopted techniques for enhancing the performance of neural networks. This paper investigates both techniques in the context of regularized regression in a reproducing kernel Hilbert space (RKHS). We establish a limiting formula for these techniques as the attack size, the noise size, and the regularization parameter tend to zero. Based on this limiting formula, we analyze specific scenarios and demonstrate that, without appropriate regularization, the two methods may have a larger generalization error and Lipschitz constant than standard kernel regression. By selecting the regularization parameter appropriately, however, both methods can outperform standard kernel regression, achieving a smaller generalization error and Lipschitz constant. These findings support the empirical observations that adversarial training can lead to overfitting and that appropriate regularization methods, such as early stopping, can alleviate this issue.
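
To make the setting concrete, here is a minimal Python sketch contrasting standard kernel ridge regression with a noise-augmented variant at two regularization levels. It is an illustrative toy, not the paper's construction: the RBF kernel, the noise scale sigma, the number of augmented copies, and the two lambda values are all assumptions chosen for the demonstration, and the comparison merely probes the regime the abstract describes, where the benefit of augmentation hinges on the regularization parameter.

```python
# Illustrative sketch (not the paper's exact setup): standard kernel ridge
# regression vs. a noise-augmented variant, compared at two values of the
# regularization parameter lam. All hyperparameters here are assumptions.
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # k(x, z) = exp(-gamma * ||x - z||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=1.0):
    # Solve (K + lam * n * I) alpha = y; the fit is f(x) = sum_i alpha_i k(x, x_i).
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ alpha

def noise_augmented_fit(X, y, sigma=0.1, copies=20, lam=1e-3, gamma=1.0, seed=0):
    # Data augmentation with noise: replicate each input with small Gaussian
    # perturbations of size sigma (labels unchanged) and fit on the enlarged set.
    rng = np.random.default_rng(seed)
    X_aug = np.concatenate([X + sigma * rng.standard_normal(X.shape)
                            for _ in range(copies)] + [X])
    y_aug = np.concatenate([y] * (copies + 1))
    return kernel_ridge_fit(X_aug, y_aug, lam=lam, gamma=gamma)

# Toy comparison of generalization error with and without augmentation.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(40)
Xt = np.linspace(-1, 1, 200)[:, None]
yt = np.sin(3 * Xt[:, 0])

for lam in (1e-6, 1e-2):  # under- vs. better-regularized, per the paper's message
    f_std = kernel_ridge_fit(X, y, lam=lam, gamma=10.0)
    f_aug = noise_augmented_fit(X, y, sigma=0.1, lam=lam, gamma=10.0)
    err_std = np.mean((f_std(Xt) - yt) ** 2)
    err_aug = np.mean((f_aug(Xt) - yt) ** 2)
    print(f"lam={lam:g}: test MSE standard={err_std:.4f}, noise-augmented={err_aug:.4f}")
```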


Related research

08/22/2023  Revisiting and Exploring Efficient Fast Adversarial Training via LAW: Lipschitz Regularization and Auto Weight Averaging

07/22/2019  Understanding Adversarial Robustness Through Loss Landscape Geometries

10/03/2019  Partial differential equation regularization for supervised machine learning

03/30/2021  Enabling Data Diversity: Efficient Automatic Augmentation via Regularized Adversarial Training

06/21/2023  Adversarial Training with Generated Data in High-Dimensional Regression: An Asymptotic Study

06/10/2020  On Mixup Regularization

04/16/2011  Adding noise to the input of a model trained with a regularized objective
