Training Generative Adversarial Networks by Solving Ordinary Differential Equations

10/28/2020
by Chongli Qin, et al.

The instability of Generative Adversarial Network (GAN) training has frequently been attributed to gradient descent. Consequently, recent methods have aimed to tailor the models and training procedures to stabilise the discrete updates. In contrast, we study the continuous-time dynamics induced by GAN training. Both theory and toy experiments suggest that these dynamics are in fact surprisingly stable. From this perspective, we hypothesise that instabilities in GAN training arise from the integration error incurred when discretising the continuous dynamics. We experimentally verify that well-known ODE solvers (such as Runge-Kutta) can stabilise training when combined with a regulariser that controls the integration error. Our approach represents a radical departure from previous methods, which typically use adaptive optimisation and stabilisation techniques that constrain the functional space (e.g. Spectral Normalisation). Evaluation on CIFAR-10 and ImageNet shows that our method outperforms several strong baselines, demonstrating its efficacy.
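The abstract's central claim can be illustrated on a standard toy problem (this is a minimal sketch, not the paper's models or regulariser): the bilinear game min_x max_y f(x, y) = x·y. Its continuous gradient descent-ascent dynamics orbit the equilibrium at the origin, so the continuous-time flow is stable. Plain simultaneous gradient descent is the explicit Euler discretisation of that flow, and its integration error makes the iterates spiral outward; a classical fourth-order Runge-Kutta step on the same vector field tracks the orbit closely.

```python
import numpy as np

# Toy bilinear game (a stand-in for the GAN minimax problem, not the
# paper's actual generator/discriminator):  min_x max_y  f(x, y) = x * y.
# The continuous dynamics dz/dt = v(z) rotate around (0, 0), conserving
# ||z(t)||, i.e. the continuous-time flow is stable.
def v(z):
    x, y = z
    return np.array([-y, x])  # gradient descent on x, ascent on y

def euler_step(z, h):
    # Simultaneous gradient descent = explicit Euler discretisation.
    return z + h * v(z)

def rk4_step(z, h):
    # Classical 4th-order Runge-Kutta step on the same vector field.
    k1 = v(z)
    k2 = v(z + 0.5 * h * k1)
    k3 = v(z + 0.5 * h * k2)
    k4 = v(z + h * k3)
    return z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

h = 0.1
z_euler = np.array([1.0, 0.0])
z_rk4 = np.array([1.0, 0.0])
for _ in range(100):
    z_euler = euler_step(z_euler, h)
    z_rk4 = rk4_step(z_rk4, h)

# Euler's integration error drives the iterates away from equilibrium
# (the norm grows by a factor of sqrt(1 + h^2) per step); RK4's higher-order
# accuracy keeps them essentially on the orbit.
print(np.linalg.norm(z_euler))  # ≈ 1.64
print(np.linalg.norm(z_rk4))    # ≈ 1.00
```

This sketch only shows the solver half of the paper's recipe; the full method additionally uses a regulariser to keep the integration error controlled in the high-dimensional, non-bilinear GAN setting.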


Related research

Functional Space Analysis of Local GAN Convergence (02/08/2021)
Recent work demonstrated the benefits of studying continuous-time dynami...

ODE Analysis of Stochastic Gradient Methods with Optimism and Anchoring for Minimax Problems and GANs (05/26/2019)
Despite remarkable empirical success, the training dynamics of generativ...

Stabilizing Training of Generative Adversarial Nets via Langevin Stein Variational Gradient Descent (04/22/2020)
Generative adversarial networks (GANs), famous for the capability of lea...

Convergence dynamics of Generative Adversarial Networks: the dual metric flows (12/18/2020)
Fitting neural networks often resorts to stochastic (or similar) gradien...

LOGAN: Latent Optimisation for Generative Adversarial Networks (12/02/2019)
Training generative adversarial networks requires balancing of delicate ...

Learning ODEs via Diffeomorphisms for Fast and Robust Integration (07/04/2021)
Advances in differentiable numerical integrators have enabled the use of...

Data reconstruction of turbulent flows with Gappy POD, Extended POD and Generative Adversarial Networks (10/21/2022)
Three methods are used to reconstruct two-dimensional instantaneous velo...
