Locally adaptive activation functions with slope recovery term for deep and physics-informed neural networks

09/25/2019
by Ameya D. Jagtap, et al.

We propose two approaches to locally adaptive activation functions, namely layer-wise and neuron-wise locally adaptive activation functions, which improve the performance of deep and physics-informed neural networks. The local adaptation of the activation function is achieved by introducing scalable hyper-parameters in each layer (layer-wise) or for every neuron separately (neuron-wise), which are then optimized using a stochastic gradient descent algorithm. The neuron-wise formulation acts as a vector activation function, as opposed to the traditional scalar activation function given by fixed, global, and layer-wise activations. To further increase the training speed, a slope recovery term based on the activation slopes is added to the loss function, which accelerates convergence and thereby reduces the training cost. In the numerical experiments, a nonlinear discontinuous function is approximated by a deep neural network with layer-wise and neuron-wise locally adaptive activation functions, with and without the slope recovery term, and the results are compared with the global counterpart. Moreover, the solution of the nonlinear Burgers equation, which exhibits steep gradients, is also obtained using the proposed methods. On the theoretical side, we prove that in the proposed method the gradient descent algorithms are not attracted to sub-optimal critical points or local minima under practical conditions on the initialization and learning rate. Furthermore, the proposed adaptive activation functions with the slope recovery term are shown to accelerate training on standard deep learning benchmarks using the CIFAR-10, CIFAR-100, SVHN, MNIST, KMNIST, Fashion-MNIST, and Semeion data sets, with and without data augmentation.
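To make the idea concrete, here is a minimal sketch of a neuron-wise locally adaptive activation with a slope recovery term, assuming PyTorch. The class name NeuronWiseAdaptiveMLP, the scaling factor n = 10, the network sizes, and the toy regression target are illustrative assumptions, not the paper's exact experimental setup.

```python
# Minimal sketch (assumptions noted above): each hidden layer applies
# tanh(n * a * z), where a holds one trainable slope per neuron and n is a
# fixed scaling factor with n * a = 1 at initialization.
import torch
import torch.nn as nn


class NeuronWiseAdaptiveMLP(nn.Module):
    def __init__(self, sizes, n=10.0):
        super().__init__()
        self.n = n  # fixed scaling factor
        self.linears = nn.ModuleList(
            nn.Linear(m, k) for m, k in zip(sizes[:-1], sizes[1:])
        )
        # one trainable slope per neuron in every hidden layer (neuron-wise);
        # a layer-wise variant would use a single scalar per layer instead
        self.slopes = nn.ParameterList(
            nn.Parameter(torch.full((k,), 1.0 / n)) for k in sizes[1:-1]
        )

    def forward(self, x):
        for linear, a in zip(self.linears[:-1], self.slopes):
            x = torch.tanh(self.n * a * linear(x))
        return self.linears[-1](x)

    def slope_recovery(self):
        # reciprocal of the mean (over hidden layers) of exp(mean slope per
        # layer): the loss shrinks as the average activation slopes grow
        mean_exp = torch.stack([torch.exp(a.mean()) for a in self.slopes]).mean()
        return 1.0 / mean_exp


# usage sketch: fit a toy discontinuous target with the extra loss term
model = NeuronWiseAdaptiveMLP([1, 50, 50, 1])
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
x = torch.linspace(-3, 3, 256).unsqueeze(1)
y = torch.sign(x) * torch.sin(2 * x)  # hypothetical discontinuous target
for step in range(1000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y) + model.slope_recovery()
    loss.backward()
    opt.step()
```

The slopes are optimized jointly with the weights and biases by the same gradient descent step, and the slope recovery term only involves the slope parameters, so it steers the activations toward steeper (faster-training) regimes without directly constraining the weights.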

Related research

12/21/2014 – Learning Activation Functions to Improve Deep Neural Networks
Artificial neural networks typically have a fixed, non-linear activation...

11/05/2018 – Lifted Proximal Operator Machines
We propose a new optimization method for training feed-forward neural ne...

12/06/2021 – Data-driven forward-inverse problems for Yajima-Oikawa system using deep learning with parameter regularization
We investigate data-driven forward-inverse problems for Yajima-Oikawa (Y...

03/21/2023 – Adaptive quadratures for nonlinear approximation of low-dimensional PDEs using smooth neural networks
Physics-informed neural networks (PINNs) and their variants have recentl...

04/26/2022 – Self-scalable Tanh (Stan): Faster Convergence and Better Generalization in Physics-informed Neural Networks
Physics-informed Neural Networks (PINNs) are gaining attention in the en...

10/15/2019 – The Local Elasticity of Neural Networks
This paper presents a phenomenon in neural networks that we refer to as ...

05/07/2020 – Physics-informed neural network for ultrasound nondestructive quantification of surface breaking cracks
We introduce an optimized physics-informed neural network (PINN) trained...
