Achieving High Accuracy with PINNs via Energy Natural Gradients

02/25/2023
by Johannes Müller, et al.

We propose energy natural gradient descent, a natural gradient method with respect to a Hessian-induced Riemannian metric, as an optimization algorithm for physics-informed neural networks (PINNs) and the deep Ritz method. As our main motivation, we show that the update direction in function space resulting from the energy natural gradient corresponds to the Newton direction modulo an orthogonal projection onto the model's tangent space. We demonstrate experimentally that energy natural gradient descent yields highly accurate solutions, with errors several orders of magnitude smaller than those obtained when training PINNs with standard optimizers such as gradient descent or Adam, even when those optimizers are allowed significantly more computation time.
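To make the update concrete, below is a minimal sketch of one energy natural gradient step for the common quadratic PINN loss L(θ) = ½‖r(θ)‖², where r(θ) stacks the interior PDE residuals and the boundary residuals. In this quadratic case the energy Hessian in function space is the identity, so the energy Gram matrix reduces to a Gauss-Newton-type matrix JᵀJ. The names (`engd_step`, `residual_fn`) and the damping term are illustrative assumptions, not taken from the paper; the damped solve stands in for the pseudoinverse used in exact natural gradient formulations.

```python
# Hedged sketch: one energy natural gradient (ENGD) step, assuming a
# quadratic PINN loss L(theta) = 0.5 * ||r(theta)||^2, for which the
# energy Gram matrix reduces to J^T J (a Gauss-Newton-type matrix).
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def engd_step(residual_fn, params, lr=1.0, damping=1e-8):
    """One ENGD step; `residual_fn` maps a flat parameter vector to the
    stacked residual vector (interior PDE residuals + boundary residuals)."""
    theta, unravel = ravel_pytree(params)
    r = residual_fn(theta)                        # residuals, shape (m,)
    J = jax.jacobian(residual_fn)(theta)          # Jacobian, shape (m, n)
    grad = J.T @ r                                # Euclidean gradient of L
    # Damped solve stands in for a pseudoinverse of the Gram matrix.
    G = J.T @ J + damping * jnp.eye(theta.size)   # energy Gram matrix
    direction = jnp.linalg.solve(G, grad)         # natural gradient direction
    return unravel(theta - lr * direction)
```

In this quadratic setting the step coincides with a damped Gauss-Newton step; the point of the energy natural gradient view is that, in function space, this direction matches the Newton direction up to an orthogonal projection onto the model's tangent space.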


Related research

06/29/2020 · Natural Gradient for Combined Loss Using Wavelets
Natural gradients have been widely used in optimization of loss function...

04/02/2020 · Mirrorless Mirror Descent: A More Natural Discretization of Riemannian Gradient Flow
We present a direct (primal only) derivation of Mirror Descent as a "par...

04/24/2017 · A Neural Network model with Bidirectional Whitening
We present here a new model and algorithm which performs an efficient Na...

02/13/2022 · Efficient Natural Gradient Descent Methods for Large-Scale Optimization Problems
We propose an efficient numerical method for computing natural gradient ...

03/13/2020 · Boosting Frank-Wolfe by Chasing Gradients
The Frank-Wolfe algorithm has become a popular first-order optimization ...

06/25/2021 · Hessian informed mirror descent
Inspired by the recent paper (L. Ying, Mirror descent algorithms for min...

05/04/2023 · Automatic Prompt Optimization with "Gradient Descent" and Beam Search
Large Language Models (LLMs) have shown impressive performance as genera...
