
Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs

by Jea-Hyun Park, et al.

We analyze preconditioned Nesterov's accelerated gradient descent methods (PAGD) for approximating the minimizer of locally Lipschitz smooth, strongly convex objective functionals. To facilitate our analysis, we introduce a second-order ordinary differential equation (ODE) and demonstrate that this ODE is the limiting case of PAGD as the step size tends to zero. Using a simple energy argument, we show exponential convergence of the ODE solution to its steady state. The PAGD method may be viewed as an explicit-type time-discretization scheme of the ODE system, which requires a natural time-step restriction for energy stability. Under this restriction, an exponential rate of convergence of the PAGD sequence is demonstrated by mimicking, via energy methods, the convergence of the solution to the ODE. The PAGD method is then applied to the solution of certain nonlinear elliptic PDEs using pseudo-spectral methods, and several numerical experiments are conducted. The results confirm the global geometric and h-independent convergence of the PAGD method, at an accelerated rate that improves on the preconditioned gradient descent (PGD) method.
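To illustrate the kind of iteration the abstract describes, here is a minimal PAGD-style sketch: a preconditioned gradient step followed by Nesterov extrapolation, with a constant momentum factor determined by the strong-convexity parameter and the time step. The function names, the choice of momentum formula, and the Jacobi preconditioner in the usage example are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pagd(grad, precond_solve, x0, step, mu, iters=300):
    """Sketch of a preconditioned accelerated gradient descent loop.

    grad          : gradient of the objective
    precond_solve : applies the inverse preconditioner P^{-1} to a vector
    step          : time step s (subject to a stability restriction)
    mu            : strong-convexity parameter (assumed known here)
    """
    x = y = np.asarray(x0, dtype=float)
    # Constant momentum factor, a common choice in the strongly convex case.
    theta = (1 - np.sqrt(mu * step)) / (1 + np.sqrt(mu * step))
    for _ in range(iters):
        x_new = y - step * precond_solve(grad(y))  # preconditioned gradient step
        y = x_new + theta * (x_new - x)            # Nesterov extrapolation
        x = x_new
    return x

# Usage: minimize f(x) = 0.5 x^T A x - b^T x for a small SPD matrix A,
# with a Jacobi (diagonal) preconditioner P = diag(A).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b
precond_solve = lambda g: g / np.diag(A)
x = pagd(grad, precond_solve, np.zeros(2), step=0.25, mu=0.5)
x_star = np.linalg.solve(A, b)  # exact minimizer, for comparison
```

In the paper's setting the objective is a (discretized) PDE energy and the preconditioner encodes, e.g., an inverse Laplacian applied spectrally; the loop structure is the same.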




Related research

Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method

We provide a novel accelerated first-order method that achieves the asym...

Direct Runge-Kutta Discretization Achieves Acceleration

We study gradient-based optimization methods obtained by directly discre...

Accelerated PDE's for efficient solution of regularized inversion problems

We further develop a new framework, called PDE Acceleration, by applying...

Normalized Wolfe-Powell-type Local Minimax Method for Finding Multiple Unstable Solutions of Nonlinear Elliptic PDEs

The local minimax method (LMM) proposed in [Y. Li and J. Zhou, SIAM J. S...

From Averaging to Acceleration, There is Only a Step-size

We show that accelerated gradient descent, averaged gradient descent and...

Energy stable arbitrary order ETD-MS method for gradient flows with Lipschitz nonlinearity

We present a methodology to construct efficient high-order in time accur...

Potential-Function Proofs for First-Order Methods

This note discusses proofs for convergence of first-order methods based ...