Adversarial Training with Generated Data in High-Dimensional Regression: An Asymptotic Study

06/21/2023
by   Yue Xing, et al.

In recent years, studies such as <cit.> have demonstrated that incorporating additional real or generated data with pseudo-labels can enhance adversarial training through a two-stage training approach. In this paper, we perform a theoretical analysis of the asymptotic behavior of this method in high-dimensional linear regression. While a double-descent phenomenon can be observed in ridgeless training, with an appropriate ℒ_2 regularization, two-stage adversarial training achieves better performance. Finally, we derive a shortcut cross-validation formula specifically tailored for the two-stage training method.
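To make the two-stage setup concrete, the sketch below illustrates one plausible instantiation in linear regression: stage one fits a ridge estimator on labeled data and uses it to pseudo-label generated covariates; stage two runs adversarial training (against ℓ_2-bounded input perturbations, whose inner maximum has the closed form (|y − xᵀθ| + ε‖θ‖₂)²) with ℒ_2 regularization on the combined data. All dimensions, noise levels, and hyperparameters (n, d, ε, λ, step size) here are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative setup (all sizes and constants are assumptions) ---
n, n_gen, d, eps, lam = 100, 400, 50, 0.1, 1.0
theta_star = rng.standard_normal(d) / np.sqrt(d)   # true coefficients
X = rng.standard_normal((n, d))
y = X @ theta_star + 0.1 * rng.standard_normal(n)  # labeled real data
X_gen = rng.standard_normal((n_gen, d))            # "generated" covariates

# --- Stage 1: ridge fit on real data, then pseudo-label generated data ---
theta1 = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
y_gen = X_gen @ theta1                             # pseudo-labels

# --- Stage 2: adversarial training with L2 regularization on combined data.
# For l2-bounded perturbations ||delta|| <= eps, the inner maximization of
# (y - (x+delta)^T theta)^2 equals (|y - x^T theta| + eps * ||theta||)^2.
Xa = np.vstack([X, X_gen])
ya = np.concatenate([y, y_gen])

def robust_grad(theta):
    """Gradient of mean robust loss plus ridge penalty."""
    r = ya - Xa @ theta
    norm = np.linalg.norm(theta) + 1e-12
    m = np.abs(r) + eps * norm                     # per-sample robust margin
    g = Xa.T @ (-np.sign(r) * m) + eps * m.sum() * theta / norm
    return 2.0 * g / len(ya) + 2.0 * lam * theta

theta = theta1.copy()
for _ in range(2000):                              # plain gradient descent
    theta -= 0.05 * robust_grad(theta)

print(np.linalg.norm(theta - theta_star))          # estimation error
```

The closed-form robust loss avoids an explicit inner attack loop, a standard simplification for linear models; the paper's asymptotic analysis concerns the risk of exactly this kind of two-stage estimator as n, d grow proportionally.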
