Residual-based attention and connection to information bottleneck theory in PINNs

Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years. However, ensuring the reliability of their convergence and accuracy remains a challenge. In this work, we propose an efficient, gradient-less weighting scheme for PINNs that accelerates the convergence of dynamic or static systems. This simple yet effective attention mechanism is a function of the evolving cumulative residuals and aims to make the optimizer aware of problematic regions at no extra computational cost and without adversarial learning. We illustrate that this general method consistently achieves a relative L^2 error on the order of 10^-5 using standard optimizers on typical benchmark cases from the literature. Furthermore, by investigating the evolution of weights during training, we identify two distinct learning phases reminiscent of the fitting and diffusion phases proposed by information bottleneck (IB) theory. Subsequent gradient analysis supports this hypothesis by aligning the transition from high to low signal-to-noise ratio (SNR) with the transition from the fitting to the diffusion regime of the adopted weights. This novel correlation between PINNs and IB theory could open future possibilities for understanding the underlying mechanisms behind the training and stability of PINNs and, more broadly, of neural operators.
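The abstract describes per-point attention weights driven by cumulative residuals. A minimal sketch of how such a residual-based update could look is given below; the function names, the decay/step parameters `gamma` and `eta`, and the max-normalization are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def update_rba_weights(weights, residuals, gamma=0.999, eta=0.01):
    """Sketch of a residual-based attention update (hypothetical form).

    Keeps an exponentially decaying accumulation of past residual
    magnitudes per collocation point, so regions where the PDE residual
    stays large gradually receive larger weights.

    weights:   (N,) current per-point attention weights
    residuals: (N,) current pointwise PDE residuals
    """
    r = np.abs(residuals)
    # Normalize so the point with the largest residual gets the full step.
    r_norm = r / (r.max() + 1e-12)
    return gamma * weights + eta * r_norm

def weighted_residual_loss(weights, residuals):
    """Residual loss with the attention weights applied pointwise."""
    return np.mean((weights * residuals) ** 2)
```

Because the update uses only residual magnitudes, which are already computed for the loss itself, it needs no additional backward pass, consistent with the "no extra computational cost" claim.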
