ReLU Deep Neural Networks and Linear Finite Elements

05/18/2021
by Juncai He, et al.

In this paper, we investigate the relationship between deep neural networks (DNNs) with the rectified linear unit (ReLU) activation function and continuous piecewise linear (CPWL) functions, especially CPWL functions arising from the simplicial linear finite element method (FEM). We first consider the special case of FEM. By exploring the DNN representation of its nodal basis functions, we present a ReLU DNN representation of CPWL functions in FEM. We theoretically establish that at least two hidden layers are needed in a ReLU DNN to represent any linear finite element function on Ω ⊆ R^d when d ≥ 2. Consequently, for d = 2, 3, which are often encountered in scientific and engineering computing, two hidden layers are necessary and sufficient for any such CPWL function to be represented by a ReLU DNN. We then give a detailed account of how a general CPWL function on R^d can be represented by a ReLU DNN with at most ⌈log_2(d+1)⌉ hidden layers, together with an estimate of the number of neurons needed in such a representation. Furthermore, using the relationship between DNNs and FEM, we argue theoretically that a special class of DNN models with low bit-width can still be expected to have adequate representation power in applications. Finally, as a proof of concept, we present numerical results for using ReLU DNNs to solve a two-point boundary value problem, demonstrating the potential of applying DNNs to the numerical solution of partial differential equations.
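The core construction is elementary in one dimension and worth making concrete. Below is a minimal sketch (not the paper's code; numpy and the helper names relu and hat are illustrative assumptions) showing that on a uniform grid each nodal basis function, and hence every linear finite element function, is exactly a one-hidden-layer ReLU network. The paper's lower bound says this is special to d = 1: for d ≥ 2 at least two hidden layers are required.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def hat(x, xi, h):
    """Hat (nodal) basis function at node xi with mesh size h, written as a
    linear combination of three ReLU neurons:
        phi_i(x) = (ReLU(x - (xi - h)) - 2 ReLU(x - xi) + ReLU(x - (xi + h))) / h
    """
    return (relu(x - (xi - h)) - 2.0 * relu(x - xi) + relu(x - (xi + h))) / h

# Any linear finite element function u_h = sum_i u_i * phi_i is then itself
# a one-hidden-layer ReLU network in 1d.
nodes = np.linspace(0.0, 1.0, 11)      # uniform mesh on [0, 1]
h = nodes[1] - nodes[0]
coeffs = np.sin(np.pi * nodes)         # example nodal values

x = np.linspace(0.0, 1.0, 1001)
u_h = sum(c * hat(x, xi, h) for c, xi in zip(coeffs, nodes))

assert np.allclose(u_h[::100], coeffs)  # u_h interpolates the nodal values
```

For the proof-of-concept computation, a natural setup is a Ritz-type minimization, since a ReLU network is piecewise linear and only first derivatives enter the variational energy. The following is a hedged sketch of such an experiment for -u'' = f on (0, 1) with u(0) = u(1) = 0 (PyTorch assumed; the architecture, penalty weight, and optimizer settings are illustrative guesses, not taken from the paper):

```python
import torch

torch.manual_seed(0)
f = lambda x: torch.pi ** 2 * torch.sin(torch.pi * x)  # exact solution: sin(pi x)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.ReLU(),
    torch.nn.Linear(32, 32), torch.nn.ReLU(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(256, 1, requires_grad=True)  # Monte Carlo quadrature points
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    energy = (0.5 * du ** 2 - f(x) * u).mean()  # Ritz energy density
    xb = torch.tensor([[0.0], [1.0]])
    bc = (net(xb) ** 2).mean()                  # Dirichlet boundary penalty
    loss = energy + 100.0 * bc
    opt.zero_grad(); loss.backward(); opt.step()
```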


Related research

05/10/2021  ReLU Deep Neural Networks from the Hierarchical Basis Perspective
We study ReLU deep neural networks (DNNs) by investigating their connect...

07/29/2022  Spline Representation and Redundancies of One-Dimensional ReLU Neural Network Models
We analyze the structure of a one-dimensional deep ReLU neural network (...

11/04/2016  Understanding Deep Neural Networks with Rectified Linear Units
In this paper we investigate the family of functions representable by de...

01/15/2023  Least-Squares Neural Network (LSNN) Method for Linear Advection-Reaction Equation: General Discontinuous Interface
We studied the least-squares ReLU neural network method (LSNN) for solvi...

02/23/2021  Deep ReLU Neural Network Approximation for Stochastic Differential Equations with Jumps
Deep neural networks (DNNs) with ReLU activation function are proved to ...

06/09/2020  In Proximity of ReLU DNN, PWA Function, and Explicit MPC
Rectifier (ReLU) deep neural networks (DNN) and their connection with pi...

03/10/2022  Stable Parametrization of Continuous and Piecewise-Linear Functions
Rectified-linear-unit (ReLU) neural networks, which play a prominent rol...
