Robust Training and Initialization of Deep Neural Networks: An Adaptive Basis Viewpoint

12/10/2019
by Eric C. Cyr, et al.

Motivated by the gap between theoretical optimal approximation rates of deep neural networks (DNNs) and the accuracy realized in practice, we seek to improve the training of DNNs. The adoption of an adaptive basis viewpoint of DNNs leads to novel initializations and a hybrid least squares/gradient descent optimizer. We provide analysis of these techniques and illustrate via numerical examples dramatic increases in accuracy and convergence rate for benchmarks characterizing scientific applications where DNNs are currently used, including regression problems and physics-informed neural networks for the solution of partial differential equations.
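The hybrid optimizer in the abstract can be illustrated with a minimal sketch of the adaptive basis viewpoint: the hidden layer of a network defines nonlinear basis functions, and the output layer is linear in its coefficients, so those coefficients can be recovered exactly by least squares while the hidden parameters are updated by gradient descent. This is an illustrative NumPy toy, not the paper's implementation; the network size, target function, and step size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem on [0, 1].
x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(4.0 * np.pi * x)

width = 20
W = rng.normal(size=(1, width))   # hidden weights (nonlinear parameters)
b = rng.normal(size=(width,))     # hidden biases

def basis(x, W, b):
    """Hidden-layer activations, viewed as adaptive basis functions phi_j(x)."""
    return np.tanh(x @ W + b)

lr = 1e-2
for _ in range(500):
    Phi = basis(x, W, b)                      # n-by-width basis matrix
    # The output layer is linear in c, so solve for it exactly by least squares.
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    r = Phi @ c - y                           # residual at the optimal linear coefficients
    # Gradient of 0.5*||r||^2 w.r.t. the hidden parameters (chain rule through tanh,
    # holding the just-solved coefficients c fixed).
    dPhi = (1.0 - Phi**2) * (r @ c.T)         # n-by-width
    W -= lr * (x.T @ dPhi)
    b -= lr * dPhi.sum(axis=0)

# Final fit: one more least-squares solve on the trained basis.
Phi = basis(x, W, b)
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
mse = float(np.mean((Phi @ c - y) ** 2))
```

Alternating an exact linear solve for the output layer with gradient steps on the basis parameters is the core of the hybrid least squares/gradient descent idea: the linear subproblem is solved to machine precision at every iteration, so gradient descent only has to shape the basis.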


Related research

07/27/2022 · Sparse Deep Neural Network for Nonlinear Partial Differential Equations
More competent learning models are demanded for data processing due to i...

05/15/2023 · Nearly Optimal VC-Dimension and Pseudo-Dimension Bounds for Deep Neural Network Derivatives
This paper addresses the problem of nearly optimal Vapnik–Chervonenkis d...

11/22/2021 · Data Assimilation with Deep Neural Nets Informed by Nudging
The nudging data assimilation algorithm is a powerful tool used to forec...

10/17/2022 · Asymptotic-Preserving Neural Networks for hyperbolic systems with diffusive scaling
With the rapid advance of Machine Learning techniques and the deep incre...

02/01/2018 · Deep Learning with Data Dependent Implicit Activation Function
Though deep neural networks (DNNs) achieve remarkable performances in ma...

10/01/2022 · Behind the Scenes of Gradient Descent: A Trajectory Analysis via Basis Function Decomposition
This work analyzes the solution trajectory of gradient-based algorithms ...

03/15/2022 · NINNs: Nudging Induced Neural Networks
New algorithms called nudging induced neural networks (NINNs), to contro...
