Neuroevolution Surpasses Stochastic Gradient Descent for Physics-Informed Neural Networks

12/15/2022
by Nicholas Sung Wei Yong, et al.

The potential of learned models for fundamental scientific research and discovery is drawing increasing attention. Physics-informed neural networks (PINNs), whose loss function directly embeds the governing equations of scientific phenomena, are among the key techniques at the forefront of recent advances. These models are typically trained using stochastic gradient descent, like their standard deep learning counterparts. However, in this paper, we carry out a simple analysis showing that the loss landscapes arising in PINNs exhibit a high degree of complexity and ruggedness that may not be conducive to gradient descent and its variants. This suggests that neuroevolutionary algorithms may be better suited than gradient descent for training PINNs. Our claim is strongly supported herein by benchmark problems and baseline results demonstrating that the convergence rates achieved by neuroevolution can indeed surpass those of gradient descent for PINN training. Furthermore, implementing neuroevolution with JAX yields orders-of-magnitude speedups relative to standard implementations.
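The two technical ingredients described above can be illustrated with a toy sketch: a PINN loss that embeds a governing equation, minimized by a gradient-free, population-based search whose fitness evaluations are vectorized with `jax.vmap`. The ODE, network sizes, and the simple elitist (1+λ) evolution strategy below are illustrative assumptions for exposition only; they are not the paper's actual benchmarks or algorithm.

```python
# Illustrative sketch (assumed setup, not the paper's code): a PINN loss for
# the toy ODE du/dx = -u with u(0) = 1 on [0, 1], minimized by a simple
# gradient-free (1+lambda) evolution strategy. The whole population is
# evaluated in parallel with jax.vmap, which is where JAX speedups come from.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, sizes=(1, 8, 8, 1)):
    # Small fully connected network with tanh activations.
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (m, n)) / jnp.sqrt(m),
                       jnp.zeros(n)))
    return params

def mlp(params, x):
    h = jnp.atleast_1d(x)
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

def pinn_loss(params, xs):
    # The governing equation is embedded directly in the loss:
    # residual of du/dx + u = 0 at collocation points, plus the
    # boundary condition u(0) = 1.
    u = lambda x: mlp(params, x)
    du = jax.vmap(jax.grad(u))(xs)
    residual = du + jax.vmap(u)(xs)
    bc = (u(0.0) - 1.0) ** 2
    return jnp.mean(residual ** 2) + bc

def es_train(loss_fn, theta0, key, pop=64, sigma=0.1, steps=200):
    # Elitist (1+lambda) evolution strategy: perturb the incumbent with
    # Gaussian noise, score all candidates at once, keep the best so far.
    batched_loss = jax.jit(jax.vmap(loss_fn))
    theta, best = theta0, loss_fn(theta0)
    for _ in range(steps):
        key, sub = jax.random.split(key)
        cand = theta + sigma * jax.random.normal(sub, (pop, theta.size))
        fits = batched_loss(cand)
        i = jnp.argmin(fits)
        if fits[i] < best:
            theta, best = cand[i], fits[i]
    return theta, best

xs = jnp.linspace(0.0, 1.0, 32)          # collocation points
params = init_params(jax.random.PRNGKey(0))
theta0, unravel = ravel_pytree(params)   # evolve a flat parameter vector
loss_on_vector = lambda v: pinn_loss(unravel(v), xs)

start = float(loss_on_vector(theta0))
theta, best = es_train(loss_on_vector, theta0, jax.random.PRNGKey(1))
```

Note that the search never calls `jax.grad` on the loss with respect to the parameters (only the spatial derivative inside the PINN residual is automatic-differentiated), so a rugged loss landscape poses no problem for the update rule; `vmap` amortizes the cost of the many fitness evaluations that population-based methods require.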

Related research

- PSO-PINN: Physics-Informed Neural Networks Trained with Particle Swarm Optimization (02/04/2022)
- Competitive Physics Informed Networks (04/23/2022)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks (03/03/2023)
- Recent advances in deep learning theory (12/20/2020)
- L4: Practical loss-based stepsize adaptation for deep learning (02/14/2018)
- Autoencoding with XCSF (10/23/2019)
- Accelerating HPC codes on Intel(R) Omni-Path Architecture networks: From particle physics to Machine Learning (11/13/2017)
