Physical Activation Functions (PAFs): An Approach for More Efficient Induction of Physics into Physics-Informed Neural Networks (PINNs)

05/29/2022
by   jassem-abbasi, et al.

In recent years, researchers have tried to fill the gap between Deep Learning (DL) methods and analytical or numerical approaches in scientific computing through the development of Physics-Informed Neural Networks (PINNs). However, training PINNs and optimally interleaving them with physical models remain challenging. Here, we introduce the concept of Physical Activation Functions (PAFs). Instead of using generic activation functions (AFs) such as ReLU, tanh, and sigmoid for all neurons, this concept proposes AFs whose mathematical expression is inherited from the physical laws of the phenomenon under investigation. The formula of a PAF may be inspired by terms in the analytical solution of the problem, or by any mathematical expression related to the investigated phenomenon, such as the initial or boundary conditions of the PDE system. We validated the advantages of PAFs for several PDEs, including harmonic oscillations, the Burgers equation, the advection-convection equation, and heterogeneous diffusion equations. The main advantage of PAFs was the more efficient constraining and interleaving of PINNs with the investigated physical phenomena and their underlying mathematical models. This added constraint significantly improved the predictions of PINNs on testing data that was out of the training distribution. Furthermore, applying PAFs reduced the size of the PINNs by up to 75%, and the values of the loss terms were reduced by 1 to 2 orders of magnitude in some cases, which is noteworthy for improving the training of PINNs. The number of iterations required to find the optimal values was also significantly reduced. We conclude that using PAFs helps generate PINNs with less complexity and much greater validity over longer prediction ranges.
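To make the core idea concrete, the sketch below contrasts a generic activation (tanh) with a physics-inspired one in a tiny fully connected network. For a harmonic oscillator, whose analytical solution is built from sines and cosines, a sine activation is a natural candidate PAF. This is a minimal illustration of the concept, not the authors' implementation; the network widths and function names are hypothetical, and a real PINN would also include a physics-based loss.

```python
import numpy as np

def mlp_forward(x, weights, biases, activation):
    """Forward pass of a small fully connected network.

    Hidden layers use `activation`; the output layer is linear.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = activation(h @ W + b)
    return h @ weights[-1] + biases[-1]

rng = np.random.default_rng(0)
sizes = [1, 8, 8, 1]  # hypothetical layer widths
weights = [rng.normal(size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

# Time points at which the network approximates the solution u(t).
t = np.linspace(0.0, 1.0, 5).reshape(-1, 1)

# Generic AF vs. a candidate PAF: for u'' + omega^2 u = 0 the solution
# is a combination of sin/cos terms, so sin mirrors the solution's form.
u_generic = mlp_forward(t, weights, biases, np.tanh)
u_paf = mlp_forward(t, weights, biases, np.sin)
```

The only change between the two variants is the activation passed to the hidden layers, which is what makes the approach cheap to adopt: the architecture and training loop stay the same while the hypothesis space is biased toward the physics of the problem.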


research · 12/01/2022
On the Compatibility between a Neural Network and a Partial Differential Equation for Physics-informed Learning
We shed light on a pitfall and an opportunity in physics-informed neural...

research · 09/06/2022
How important are activation functions in regression and classification? A survey, performance comparison, and future directions
Inspired by biological neurons, the activation functions play an essenti...

research · 09/24/2019
An Iterative Scientific Machine Learning Approach for Discovery of Theories Underlying Physical Phenomena
From a pure mathematical point of view, common functional forms represen...

research · 12/17/2022
Physics-informed Neural Networks with Periodic Activation Functions for Solute Transport in Heterogeneous Porous Media
Solute transport in porous media is relevant to a wide range of applicat...

research · 11/17/2022
Multilayer Perceptron-based Surrogate Models for Finite Element Analysis
Many Partial Differential Equations (PDEs) do not have analytical soluti...

research · 11/21/2017
Deep Learning for Physical Processes: Incorporating Prior Scientific Knowledge
We consider the use of Deep Learning methods for modeling complex phenom...

research · 07/05/2022
A Deep Learning Approach for the solution of Probability Density Evolution of Stochastic Systems
Derivation of the probability density evolution provides invaluable insi...
