Error convergence and engineering-guided hyperparameter search of PINNs: towards optimized I-FENN performance

03/03/2023
by   Panos Pantidis, et al.

In this paper, we aim to enhance the performance of our proposed I-FENN approach by focusing on two crucial aspects of its PINN component: the error convergence analysis and the hyperparameter-performance relationship. Building on the I-FENN setup, our methodology relies on systematic, engineering-oriented numerical analysis guided by the available mathematical theories on the topic. Objective characterization is achieved through a novel combination of performance metrics that assess the success of minimizing various error measures, the efficiency of the optimization process, and the training computational effort. In the first objective, we investigate in detail the convergence of the PINN training error and the global error against the network size and the training sample size. We demonstrate a consistent converging behavior of the two error types, which proves the conformance of the PINN setup and implementation to the available convergence theories. In the second objective, we aim to establish a priori knowledge of the hyperparameters that favor higher predictive accuracy, lower computational effort, and the lowest likelihood of arriving at trivial solutions. We show that shallow-and-wide networks tend to overestimate high frequencies of the strain field and are computationally more demanding in the L-BFGS stage. Deep-and-narrow PINNs, on the other hand, yield higher errors, are computationally slower during Adam optimization epochs, and are more prone to training failure by arriving at trivial solutions. Our analysis yields several outcomes that contribute to the improved performance of I-FENN and fill a long-standing gap in the PINN literature regarding the numerical convergence of the network errors. The proposed analysis method and conclusions can be directly extended to other ML applications in science and engineering.
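The shallow-and-wide versus deep-and-narrow comparison described above implies sweeping architectures of comparable size but different shapes. The snippet below is not from the paper; it is a minimal sketch of how one might enumerate candidate MLP architectures for such a hyperparameter study and count their trainable parameters. The input/output dimensions and layer widths are hypothetical placeholders, not the paper's actual settings.

```python
def mlp_param_count(layer_widths):
    """Total trainable parameters (weights + biases) of a fully connected MLP
    whose layer sizes are given as [input_dim, hidden_1, ..., output_dim]."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_widths[:-1], layer_widths[1:]))

# Hypothetical candidates for a scalar-field PINN with a 2D coordinate input:
shallow_wide = [2, 100, 1]           # 1 hidden layer, 100 neurons
deep_narrow  = [2, 20, 20, 20, 1]    # 3 hidden layers, 20 neurons each

print(mlp_param_count(shallow_wide))  # 401
print(mlp_param_count(deep_narrow))   # 921
```

A sweep over such (depth, width) pairs, recording the error measures and per-optimizer wall-clock time for each, is one plausible way to reproduce the kind of hyperparameter-performance map the abstract describes.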


research
12/23/2022

Exploring the Optimized Value of Each Hyperparameter in Various Gradient Descent Algorithms

In recent years, various gradient descent algorithms including the m...
research
09/09/2022

Multi-objective hyperparameter optimization with performance uncertainty

The performance of any Machine Learning (ML) algorithm is impacted by th...
research
08/14/2020

Efficient hyperparameter optimization by way of PAC-Bayes bound minimization

Identifying optimal values for a high-dimensional set of hyperparameters...
research
02/15/2023

On the Hyperparameters influencing a PINN's generalization beyond the training domain

Physics-Informed Neural Networks (PINNs) are Neural Network architecture...
research
07/03/2022

Variational energy based XPINNs for phase field analysis in brittle fracture

Modeling fracture is computationally expensive even in computational sim...
research
04/22/2023

EEE, Remediating the failure of machine learning models via a network-based optimization patch

A network-based optimization approach, EEE, is proposed for the purpose ...
