
Using Witten Laplacians to locate index-1 saddle points

by Tony Lelièvre et al.

We introduce a new stochastic algorithm to locate the index-1 saddle points of a function V: ℝ^d → ℝ, with d possibly large. This algorithm can be seen as an analogue of stochastic gradient descent, which is a natural stochastic process for locating local minima. It relies on two ingredients: (i) the concentration, on index-1 saddle points, of the first eigenmodes of the Witten Laplacian on 1-forms associated with V, and (ii) a probabilistic representation of a partial differential equation involving this differential operator. Numerical examples on simple molecular systems illustrate the efficacy of the proposed approach.
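To make the goal concrete, here is a minimal toy sketch of what "locating an index-1 saddle point" means, in the spirit of classical eigenvector-following ideas; it is emphatically not the Witten-Laplacian algorithm of the paper, and the potential, step size, and starting point are illustrative assumptions. Plain gradient descent on a double-well potential falls into a minimum, while reversing the force along the Hessian's softest mode makes the walker climb to the saddle.

```python
# Toy potential V(x, y) = (x^2 - 1)^2 + y^2:
# local minima at (+-1, 0), a single index-1 saddle at (0, 0).
# Illustrative only -- NOT the paper's Witten-Laplacian method.

def grad(x, y):
    """Gradient of V."""
    return 4.0 * x * (x * x - 1.0), 2.0 * y

def gradient_descent(x, y, lr=0.05, steps=500):
    """Plain gradient descent: converges to a local minimum of V."""
    for _ in range(steps):
        gx, gy = grad(x, y)
        x, y = x - lr * gx, y - lr * gy
    return x, y

def saddle_search(x, y, lr=0.05, steps=500):
    """Reverse the force along the Hessian's softest eigenmode, so the
    walker climbs uphill in that one direction and relaxes downhill in
    the others. Here the Hessian is diagonal: diag(12 x^2 - 4, 2)."""
    for _ in range(steps):
        gx, gy = grad(x, y)
        if 12.0 * x * x - 4.0 < 2.0:   # x-mode is the softest
            gx = -gx                    # climb along x
        else:
            gy = -gy                    # climb along y
        x, y = x - lr * gx, y - lr * gy
    return x, y

print(gradient_descent(0.5, 0.4))  # near the minimum (1.0, 0.0)
print(saddle_search(0.5, 0.4))     # near the index-1 saddle (0.0, 0.0)
```

The paper replaces this deterministic mode-following picture with a stochastic process whose invariant behavior concentrates on index-1 saddle points, via the eigenmodes of the Witten Laplacian on 1-forms.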



