
Committor functions via tensor networks

by Yian Chen et al.

We propose a novel approach for computing committor functions, which describe transitions of a stochastic process between metastable states. The committor function satisfies a backward Kolmogorov equation, and in typical high-dimensional settings of interest, it is intractable to compute and store the solution with traditional numerical methods. By parametrizing the committor function in a matrix product state/tensor train format and using a similar representation for the equilibrium probability density, we solve the variational formulation of the backward Kolmogorov equation with linear time and memory complexity in the number of dimensions. This approach bypasses the need for sampling the equilibrium distribution, which can be difficult when the distribution has multiple modes. Numerical results demonstrate the effectiveness of the proposed method for high-dimensional problems.
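To make the tensor-train idea concrete, here is a minimal sketch (not the authors' implementation) of evaluating a function stored in matrix product state / tensor train format. The point is the complexity claim from the abstract: a dense d-dimensional grid function needs n^d values, while a TT representation with cores of rank r needs only O(d n r^2) storage, and a pointwise evaluation is a chain of d small matrix products. The core shapes and the `tt_eval` helper below are illustrative conventions, not part of the paper.

```python
import numpy as np

def tt_eval(cores, idx):
    """Evaluate a tensor-train (matrix product state) tensor at a multi-index.

    cores: list of d arrays; core k has shape (r_{k-1}, n_k, r_k),
           with boundary ranks r_0 = r_d = 1.
    idx:   length-d tuple of integers selecting one grid point per dimension.
    Cost and memory are linear in d, versus exponential for a dense tensor.
    """
    v = cores[0][:, idx[0], :]          # shape (1, r_1)
    for k in range(1, len(cores)):
        v = v @ cores[k][:, idx[k], :]  # contract along the shared rank index
    return v[0, 0]

# Illustrative sizes: d = 20 dimensions, n = 10 grid points per dimension,
# TT-rank r = 3. A dense grid would hold 10**20 entries; the TT cores
# hold about d * n * r**2 = 1800 numbers.
d, n, r = 20, 10, 3
rng = np.random.default_rng(0)
ranks = [1] + [r] * (d - 1) + [1]
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(d)]
value = tt_eval(cores, (0,) * d)
```

A rank-1 tensor train is simply a separable (product-form) function, which is why low TT-rank can be read as a quantitative measure of how weakly the dimensions are coupled.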




Related research:

A Tensor Decomposition Approach for High-Dimensional Hamilton-Jacobi-Bellman Equations

High-dimensional density estimation with tensorizing flow

A robust GMRES algorithm in Tensor Train format

Solving for high dimensional committor functions using artificial neural networks

Adaptive deep density approximation for Fokker-Planck equations

Tensor Train for Global Optimization Problems in Robotics

Deep FPF: Gain function approximation in high-dimensional setting