Deep neural networks for smooth approximation of physics with higher order and continuity B-spline base functions

01/03/2022
by Kamil Doległo, et al.

This paper addresses the following research question. Traditionally, a neural network approximates a physical phenomenon by composing non-linear activation functions with linear operators: it "fills the space" with such compositions and adjusts their coefficients to fit the phenomenon. We claim it is better to "fill the space" with linear combinations of smooth, higher-order B-spline basis functions, as employed in isogeometric analysis, and to use the neural network to adjust the coefficients of those linear combinations. In other words, we evaluate using neural networks to approximate the coefficients of B-spline basis functions against using them to approximate the solution directly. Solving differential equations with neural networks was proposed by Maziar Raissi et al. in 2017 through Physics-Informed Neural Networks (PINNs), which naturally encode the underlying physical laws as prior information. Approximating coefficients with a function as input leverages the well-known ability of neural networks to act as universal function approximators. In essence, in the PINN approach the network approximates the value of the given field at a given point. We present an alternative approach, in which the physical quantity is approximated as a linear combination of smooth B-spline basis functions and the neural network approximates the coefficients of the B-splines. This research compares results from a DNN approximating the coefficients of the linear combination of B-spline basis functions with a DNN approximating the solution directly. We show that our approach is cheaper and more accurate when approximating smooth physical fields.
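The idea can be sketched in a few lines. In the fragment below (a minimal illustration, not the authors' implementation), a quadratic C^1-continuous B-spline basis is built with the Cox-de Boor recursion, and a tiny untrained MLP with placeholder random weights stands in for the coefficient network: it maps a problem parameter to the coefficients, and the field is their linear combination with the basis functions.

```python
import numpy as np

def bspline_basis(i, p, t, x):
    """Cox-de Boor recursion: value of the i-th degree-p B-spline at x
    for knot vector t."""
    if p == 0:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = 0.0
    if t[i + p] != t[i]:
        left = (x - t[i]) / (t[i + p] - t[i]) * bspline_basis(i, p - 1, t, x)
    right = 0.0
    if t[i + p + 1] != t[i + 1]:
        right = ((t[i + p + 1] - x) / (t[i + p + 1] - t[i + 1])
                 * bspline_basis(i + 1, p - 1, t, x))
    return left + right

# Quadratic (C^1-continuous) basis on [0, 1] with an open knot vector.
p = 2
knots = np.array([0.0, 0.0, 0.0, 0.25, 0.5, 0.75, 1.0, 1.0, 1.0])
n_basis = len(knots) - p - 1  # 6 basis functions

# Hypothetical tiny MLP standing in for the coefficient network: it maps a
# scalar problem parameter to the n_basis B-spline coefficients.  The weights
# below are random placeholders, not trained values.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 1)), np.zeros(8)
W2, b2 = rng.normal(size=(n_basis, 8)), np.zeros(n_basis)

def coefficient_net(param):
    h = np.tanh(W1 @ np.array([param]) + b1)
    return W2 @ h + b2

def field(x, param):
    """Smooth field u(x) = sum_i c_i B_i(x), coefficients from the network."""
    c = coefficient_net(param)
    return sum(c[i] * bspline_basis(i, p, knots, x) for i in range(n_basis))
```

In a PINN, the network itself is evaluated at every query point; here the network is evaluated once per problem instance to produce the coefficients, and the smoothness of the resulting field is guaranteed by the B-spline basis rather than by the activation functions.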


