Exponential convergence of Sobolev gradient descent for a class of nonlinear eigenproblems

12/04/2019
by Ziyun Zhang, et al.

We propose to use the Łojasiewicz inequality as a general tool for analyzing the convergence rate of gradient descent on a Hilbert manifold, without resorting to the continuous gradient flow. Using this tool, we show that a Sobolev gradient descent method with an adaptive inner product converges exponentially fast to the ground state of the Gross-Pitaevskii eigenproblem. The method extends to a class of more general high-degree optimization problems and nonlinear eigenproblems under certain conditions. We demonstrate this generalization with several examples, in particular a nonlinear Schrödinger eigenproblem with an additional high-order interaction term. Numerical experiments are presented for these problems.
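
As an illustration of the kind of method the abstract describes, the sketch below applies an adaptive-metric Sobolev gradient descent to a discretized 1D Gross-Pitaevskii ground-state problem. This is a minimal sketch under assumed choices, not the paper's algorithm: the grid, the harmonic potential, the interaction strength beta, the step size tau, and the iteration count are all hypothetical, and the operator A_u below is one common choice of solution-dependent (adaptive) inner product.

```python
# Hypothetical sketch (not the paper's exact scheme): Sobolev gradient descent
# with an adaptive inner product for a discretized 1D Gross-Pitaevskii ground
# state. We minimize the GP energy over the L2 unit sphere; the descent
# direction is the eigenvalue residual preconditioned by A_u = -Lap + V + beta*u^2.
import numpy as np

n, L = 256, 16.0                       # grid size and domain half-width (assumed)
x = np.linspace(-L, L, n)
h = x[1] - x[0]
V = 0.5 * x**2                         # harmonic trapping potential (assumed)
beta = 100.0                           # interaction strength (assumed)

# Finite-difference Laplacian with homogeneous Dirichlet boundary conditions.
lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
       + np.diag(np.ones(n - 1), -1)) / h**2

def normalize(u):
    """Project back onto the L2 unit sphere (discrete norm on the uniform grid)."""
    return u / np.sqrt(h * np.dot(u, u))

u = normalize(np.exp(-x**2))           # positive initial guess
tau = 1.0                              # step size (assumed)

for _ in range(200):
    A_u = -lap + np.diag(V + beta * u**2)   # adaptive operator defining the a_u inner product
    lam = h * np.dot(u, A_u @ u)            # Rayleigh-quotient eigenvalue estimate
    r = A_u @ u - lam * u                   # residual (proportional to the projected L2 gradient)
    g = np.linalg.solve(A_u, r)             # Sobolev gradient: residual measured in the a_u metric
    u = normalize(u - tau * g)              # descend, then re-normalize

A_u = -lap + np.diag(V + beta * u**2)
lam = h * np.dot(u, A_u @ u)
print("approximate ground-state eigenvalue:", lam)
```

The "adaptive" aspect is that the descent direction is the residual measured in an inner product induced by A_u, which changes with the current iterate u. In the paper's analysis, the Łojasiewicz inequality is what allows the discrete iteration itself, rather than only its continuous gradient-flow limit, to be shown to converge at an exponential rate.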

Related research

02/16/2016 · Gradient Descent Converges to Minimizers
We show that gradient descent converges to a local minimizer, almost sur...

01/06/2020 · Gradient descent algorithms for Bures-Wasserstein barycenters
We study first order methods to compute the barycenter of a probability ...

09/20/2021 · Generalized Optimization: A First Step Towards Category Theoretic Learning Theory
The Cartesian reverse derivative is a categorical generalization of reve...

12/06/2022 · Further analysis of multilevel Stein variational gradient descent with an application to the Bayesian inference of glacier ice models
Multilevel Stein variational gradient descent is a method for particle-b...

04/21/2020 · AdaX: Adaptive Gradient Descent with Exponential Long Term Memory
Although adaptive optimization algorithms such as Adam show fast converg...

07/12/2023 · Provably Faster Gradient Descent via Long Steps
This work establishes provably faster convergence rates for gradient des...
