Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs

06/11/2020
by   Jea-Hyun Park, et al.

We analyze preconditioned Nesterov accelerated gradient descent (PAGD) methods for approximating the minimizer of locally Lipschitz smooth, strongly convex objective functionals. To facilitate the analysis, we introduce a second-order ordinary differential equation (ODE) and show that it is the limiting case of PAGD as the step size tends to zero. Using a simple energy argument, we establish exponential convergence of the ODE solution to its steady state. The PAGD method can be viewed as an explicit time-discretization of this ODE system, which requires a natural time-step restriction for energy stability. Under this restriction, an exponential rate of convergence of the PAGD sequence is demonstrated by mimicking, via energy methods, the convergence of the ODE solution. The PAGD method is then applied to solve certain nonlinear elliptic PDEs using pseudo-spectral methods, and several numerical experiments are conducted. The results confirm the global, geometric, and h-independent convergence of the PAGD method, at an accelerated rate that improves on the preconditioned gradient descent (PGD) method.
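To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a preconditioned Nesterov-style accelerated gradient method on a strongly convex quadratic test problem. The function name `pagd`, the diagonal preconditioner, and the step/momentum parameters are illustrative assumptions, not the authors' exact scheme or analysis.

```python
import numpy as np

def pagd(grad, M_inv, x0, step, momentum, iters):
    # Hypothetical Nesterov-style iteration with a fixed preconditioner M:
    #   y_k     = x_k + momentum * (x_k - x_{k-1})   (extrapolation step)
    #   x_{k+1} = y_k - step * M^{-1} grad(y_k)       (preconditioned gradient step)
    x = x0.copy()
    x_prev = x0.copy()
    for _ in range(iters):
        y = x + momentum * (x - x_prev)
        x_prev = x
        x = y - step * (M_inv @ grad(y))
    return x

# Strongly convex quadratic test objective f(x) = 0.5 x^T A x - b^T x,
# with preconditioner M = diag(A) (a common, cheap choice).
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
M_inv = np.diag(1.0 / np.diag(A))

x_star = np.linalg.solve(A, b)                     # exact minimizer
x = pagd(grad, M_inv, np.zeros(2), step=0.7, momentum=0.3, iters=200)
```

For this well-preconditioned problem the iterates converge geometrically to the minimizer; the step size plays the role of the time-step restriction mentioned in the abstract, and too large a value destabilizes the iteration.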


