DeepAI

Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs

06/11/2020
by   Jea-Hyun Park, et al.

We analyze preconditioned Nesterov's accelerated gradient descent methods (PAGD) for approximating the minimizer of locally Lipschitz smooth, strongly convex objective functionals. To facilitate our analysis, we introduce a second-order ordinary differential equation (ODE) and demonstrate that this ODE is the limiting case of PAGD as the step size tends to zero. Using a simple energy argument, we show exponential convergence of the ODE solution to its steady state. The PAGD method may be viewed as an explicit-type time-discretization scheme of the ODE system, which requires a natural time-step restriction for energy stability. Assuming this restriction, an exponential rate of convergence of the PAGD sequence is demonstrated by mimicking the convergence of the ODE solution via energy methods. The PAGD method is applied to the solution of certain nonlinear elliptic PDEs using pseudo-spectral methods, and several numerical experiments are conducted. The results confirm the global geometric and h-independent convergence of the PAGD method, with an accelerated rate improved over that of the preconditioned gradient descent (PGD) method.
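The iteration described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the quadratic objective, the diagonal preconditioner `P_inv`, and the step-size and momentum parameters `s` and `mu` are all assumptions chosen for demonstration; in the paper the preconditioner arises from the PDE discretization.

```python
import numpy as np

# Hypothetical strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, standing in
# for the locally Lipschitz smooth, strongly convex functionals of the paper.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
grad = lambda x: A @ x - b

# Illustrative diagonal preconditioner (inverse of diag(A)); an assumption here.
P_inv = np.diag(1.0 / np.diag(A))

def pagd(grad, P_inv, x0, s=0.5, mu=0.01, iters=200):
    """Preconditioned accelerated gradient descent (Nesterov-style momentum).
    s is the step size and mu a strong-convexity parameter; both illustrative."""
    x = x0.copy()
    y = x0.copy()
    theta = (1 - np.sqrt(mu * s)) / (1 + np.sqrt(mu * s))  # fixed momentum weight
    for _ in range(iters):
        x_next = y - s * (P_inv @ grad(y))   # preconditioned gradient step
        y = x_next + theta * (x_next - x)    # Nesterov extrapolation
        x = x_next
    return x

x_star = np.linalg.solve(A, b)
x = pagd(grad, P_inv, np.zeros(3))
print(np.linalg.norm(x - x_star))  # small residual after 200 iterations
```

Without the extrapolation step (`theta = 0`) this reduces to plain preconditioned gradient descent (PGD); the momentum term is what yields the accelerated geometric rate discussed in the abstract.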

