Stable rank-adaptive Dynamically Orthogonal Runge-Kutta schemes

11/15/2022
by Aaron Charous et al.

We develop two new sets of stable, rank-adaptive Dynamically Orthogonal Runge-Kutta (DORK) schemes that capture high-order curvature of the nonlinear low-rank manifold. The DORK schemes asymptotically approximate the truncated singular value decomposition at a greatly reduced cost while preserving mode continuity using newly derived retractions. We show that arbitrarily high-order optimal perturbative retractions can be obtained, and we prove that these new retractions are stable. In addition, we demonstrate that repeatedly applying retractions yields a gradient-descent algorithm on the low-rank manifold that converges geometrically when approximating a low-rank matrix. When approximating a higher-rank matrix, iterations converge linearly to the best low-rank approximation. We then develop a rank-adaptive retraction that is robust to overapproximation. Building on these retractions, we derive two novel, rank-adaptive integration schemes that dynamically update the subspace onto which the system dynamics are projected within each time step: the stable, optimal Dynamically Orthogonal Runge-Kutta (so-DORK) and gradient-descent Dynamically Orthogonal Runge-Kutta (gd-DORK) schemes. These integration schemes are numerically evaluated and compared on an ill-conditioned matrix differential equation, an advection-diffusion partial differential equation, and a nonlinear, stochastic reaction-diffusion partial differential equation. Results show a reduced error-accumulation rate with the new stable, optimal and gradient-descent integrators. In addition, we find that rank adaptation allows for highly accurate solutions while preserving computational efficiency.
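To make the retraction-based gradient descent concrete, here is a minimal NumPy sketch. It is not the paper's method: the plain truncated-SVD projection below stands in for the optimal perturbative retractions (which approximate that projection at reduced cost while preserving mode continuity), and the names `retract` and `retraction_descent` and the step size `eta` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def retract(Y, r):
    # Project Y onto the rank-r manifold via truncated SVD.
    # Stand-in for the paper's perturbative retractions, which
    # approximate this projection at reduced cost.
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def retraction_descent(A, r, steps=100, eta=0.5):
    # Gradient descent for 0.5 * ||A - X||_F^2 restricted to the
    # rank-r manifold: take a Euclidean gradient step (the gradient
    # is X - A), then retract back onto the manifold.
    X = retract(rng.standard_normal(A.shape), r)
    errors = []
    for _ in range(steps):
        X = retract(X - eta * (X - A), r)
        errors.append(np.linalg.norm(A - X))
    return X, errors

# Approximate a generic (full-rank) 50 x 50 matrix with rank 5; the
# iterates approach the best rank-5 approximation given by the SVD.
A = rng.standard_normal((50, 50))
X, errors = retraction_descent(A, r=5)
best = np.linalg.norm(A - retract(A, 5))
print(f"final error {errors[-1]:.6f} vs. optimal {best:.6f}")
```

With an exact SVD retraction and eta = 1, the iteration converges in a single step; smaller steps, or inexact retractions like the paper's, produce the geometric and linear convergence behavior described in the abstract.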

Related research

05/19/2023: Implicit low-rank Riemannian schemes for the time integration of stiff partial differential equations
We propose two implicit numerical schemes for the low-rank time integrat...

05/07/2023: CUR Decomposition for Scalable Rank-Adaptive Reduced-Order Modeling of Nonlinear Stochastic PDEs with Time-Dependent Bases
Time-dependent basis reduced order models (TDB ROMs) have successfully b...

01/09/2017: A Universal Variance Reduction-Based Catalyst for Nonconvex Low-Rank Matrix Recovery
We propose a generic framework based on a new stochastic variance-reduce...

05/18/2020: Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent
Low-rank matrix estimation is a canonical problem that finds numerous ap...

02/02/2023: The Power of Preconditioning in Overparameterized Low-Rank Matrix Sensing
We propose a preconditioned gradient descent method to tackle the low-...

08/13/2020: Prediction of magnetization dynamics in a reduced dimensional feature space setting utilizing a low-rank kernel method
We establish a machine learning model for the prediction of the magnetiz...

09/15/2019: Minimax separation of the Cauchy kernel
We prove and apply an optimal low-rank approximation of the Cauchy kerne...
