
Adaptive Two-Layer ReLU Neural Network: I. Best Least-squares Approximation

07/19/2021
by Min Liu, et al.
Purdue University

In this paper, we introduce the adaptive neuron enhancement (ANE) method for the best least-squares approximation using two-layer ReLU neural networks (NNs). For a given function f(x), the ANE method generates a two-layer ReLU NN and a numerical integration mesh such that the approximation accuracy is within the prescribed tolerance. The ANE method also provides a natural process for obtaining a good initialization, which is crucial for solving the resulting nonlinear optimization problem. Numerical results of the ANE method are presented for functions of two variables exhibiting either intersecting interface singularities or sharp interior layers.
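The abstract does not include pseudocode, but the general idea it describes (fit a small two-layer ReLU NN to f by least squares on an integration mesh, check the error, then enlarge the network while reusing the trained parameters as initialization) can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's ANE algorithm: the function names (ane_fit, nn_eval), the 1-D test function, the fixed uniform mesh, and the neuron-addition rule are all hypothetical choices made for brevity.

```python
import numpy as np
from scipy.optimize import least_squares

def relu(t):
    return np.maximum(t, 0.0)

def nn_eval(params, x, n):
    """Two-layer ReLU NN with n hidden neurons in 1-D:
    u(x) = c0 + sum_i c_i * relu(w_i * x + b_i)."""
    w, b, c = params[:n], params[n:2*n], params[2*n:3*n]
    c0 = params[3*n]
    return c0 + relu(np.outer(x, w) + b) @ c

def ane_fit(f, n0=4, n_add=4, tol=1e-3, max_neurons=64, mesh_size=400, seed=0):
    """Hypothetical adaptive-neuron loop: least-squares fit on a quadrature
    mesh, check the discrete L2 error, add neurons, and refit (warm start)."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, mesh_size)          # numerical integration mesh
    h = x[1] - x[0]
    target = f(x)
    n = n0
    params = rng.normal(size=3*n + 1)
    while True:
        res = least_squares(lambda p: nn_eval(p, x, n) - target, params)
        params = res.x
        err = np.sqrt(h * np.sum(res.fun**2))     # discrete L2 error
        if err < tol or n >= max_neurons:
            return params, n, err
        # Enlarge the network; trained neurons provide the initialization.
        w, b, c = params[:n], params[n:2*n], params[2*n:3*n]
        c0 = params[3*n]
        w = np.concatenate([w, rng.normal(size=n_add)])
        b = np.concatenate([b, rng.normal(size=n_add)])
        c = np.concatenate([c, np.zeros(n_add)])  # new neurons start inactive
        n += n_add
        params = np.concatenate([w, b, c, [c0]])

# Usage: approximate a function with a sharp interior layer.
params, n, err = ane_fit(lambda x: np.tanh(50 * (x - 0.5)), tol=1e-3)
print(f"{n} neurons, discrete L2 error {err:.2e}")
```

New neurons are added with zero output weights so the enlarged network initially reproduces the previous fit, which mirrors the paper's emphasis on obtaining a good initialization for the nonlinear optimization.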

Related research
07/14/2021

Adaptive Two-Layer ReLU Neural Network: II. Ritz Approximation to Elliptic PDEs

In this paper, we study adaptive neuron enhancement (ANE) method for sol...
05/25/2021

Least-Squares ReLU Neural Network (LSNN) Method For Linear Advection-Reaction Equation

This paper studies least-squares ReLU neural network method for solving ...
09/07/2021

Self-adaptive deep neural network: Numerical approximation to functions and PDEs

Designing an optimal deep neural network for a given task is important a...
12/28/2020

Neural Network Approximation

Neural Networks (NNs) are the method of choice for building learning alg...
05/25/2021

Least-Squares ReLU Neural Network (LSNN) Method For Scalar Nonlinear Hyperbolic Conservation Law

We introduced the least-squares ReLU neural network (LSNN) method for so...
11/22/2022

A neuron-wise subspace correction method for the finite neuron method

In this paper, we propose a novel algorithm called Neuron-wise Parallel ...
12/17/2020

Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers

This paper presents a nonlinear model reduction method for systems of eq...