Approximate backwards differentiation of gradient flow

11/09/2022
by Yushen Huang, et al.

Gradient flow (GF) is the ODE whose explicit Euler discretization is the gradient descent method. In this work, we investigate a family of methods derived from approximate implicit discretizations of the GF equation, drawing a connection between larger stability regions and less sensitive hyperparameter tuning. We focus on the implicit τ-step backwards differentiation formulas (BDFs), approximated in an inner loop by a few iterations of vanilla gradient descent, and give their convergence rates when the objective function is convex, strongly convex, or nonconvex. Numerical experiments demonstrate the wide range of effects these methods have on extremely poorly conditioned problems, in particular those arising in training deep neural networks.
