Learning Specialized Activation Functions for Physics-informed Neural Networks

08/08/2023
by Honghui Wang, et al.

Physics-informed neural networks (PINNs) are known to suffer from optimization difficulty. In this work, we reveal the connection between the optimization difficulty of PINNs and activation functions. Specifically, we show that PINNs exhibit high sensitivity to activation functions when solving PDEs with distinct properties. Existing works usually choose activation functions by inefficient trial-and-error. To avoid this manual selection and to alleviate the optimization difficulty of PINNs, we introduce adaptive activation functions that search for the optimal function when solving different problems. We compare different adaptive activation functions and discuss their limitations in the context of PINNs. Furthermore, we propose to tailor the idea of learning combinations of candidate activation functions to the optimization of PINNs, which places higher requirements on the smoothness and diversity of the learned functions. This is achieved by removing activation functions that cannot provide higher-order derivatives from the candidate set and by incorporating elementary functions with different properties according to our prior knowledge about the PDE at hand. We further enhance the search space with adaptive slopes. The proposed adaptive activation function can be used to solve different PDE systems in an interpretable way. Its effectiveness is demonstrated on a series of benchmarks. Code is available at https://github.com/LeapLabTHU/AdaAFforPINNs.
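The core idea can be illustrated with a short sketch. The snippet below is an illustrative reconstruction, not the authors' implementation (which lives in the linked repository): the candidate set of smooth elementary functions (sin, tanh, sigmoid, identity), the softmax-normalized mixing weights, and the single learnable slope parameter are all assumptions made here for illustration.

```python
import torch
import torch.nn as nn

class AdaptiveActivation(nn.Module):
    """Learnable combination of smooth candidate activations with an adaptive slope.

    Minimal sketch of the idea in the abstract. Only infinitely differentiable
    candidates are kept, so the higher-order derivatives needed by PDE
    residuals remain well defined (e.g. ReLU is excluded).
    """

    def __init__(self):
        super().__init__()
        # Candidate elementary functions (all smooth).
        self.candidates = [torch.sin, torch.tanh, torch.sigmoid, lambda x: x]
        # Mixing logits: softmax turns them into convex combination weights.
        self.logits = nn.Parameter(torch.zeros(len(self.candidates)))
        # Adaptive slope: learnable scaling of the pre-activation.
        self.slope = nn.Parameter(torch.ones(1))

    def forward(self, x):
        weights = torch.softmax(self.logits, dim=0)
        scaled = self.slope * x
        return sum(w * f(scaled) for w, f in zip(weights, self.candidates))

# Example usage: a small fully connected PINN backbone with the adaptive
# activation in place of a fixed tanh.
net = nn.Sequential(
    nn.Linear(2, 64), AdaptiveActivation(),
    nn.Linear(64, 64), AdaptiveActivation(),
    nn.Linear(64, 1),
)
```

Because every candidate is smooth, second- and higher-order derivatives of the network output, as required by PDE residuals, can still be computed with automatic differentiation, while the learned mixing weights indicate which elementary function dominates for a given problem.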


