An Optimal Time Variable Learning Framework for Deep Neural Networks

04/18/2022
by   Harbir Antil, et al.
Feature propagation in Deep Neural Networks (DNNs) can be associated with nonlinear discrete dynamical systems. The novelty of this paper lies in letting the discretization parameter (time step-size) vary from layer to layer and learning it within an optimization framework. The proposed framework can be applied to any of the existing networks, such as ResNet, DenseNet, or Fractional-DNN, and is shown to help overcome the vanishing and exploding gradient issues. Stability of some of the existing continuous DNNs, such as Fractional-DNN, is also studied. The proposed approach is applied to an ill-posed 3D-Maxwell's equation.
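The core idea of a layer-dependent time step can be sketched in a few lines: a ResNet-style update x_{l+1} = x_l + tau_l * f(x_l), where each tau_l is a trainable parameter rather than a fixed constant. The sketch below is a minimal NumPy illustration under assumed notation (the function names, tanh activation, and initialization are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def resnet_forward(x, weights, biases, taus):
    """ResNet-style forward pass with a per-layer time step-size:
    x_{l+1} = x_l + tau_l * tanh(W_l x_l + b_l).
    In the learning framework each tau_l would be optimized
    jointly with the weights and biases."""
    for W, b, tau in zip(weights, biases, taus):
        x = x + tau * np.tanh(W @ x + b)
    return x

dim, depth = 4, 6
weights = [0.1 * rng.standard_normal((dim, dim)) for _ in range(depth)]
biases = [np.zeros(dim) for _ in range(depth)]
# Trainable per-layer step sizes, here simply initialized uniformly
# so the steps sum to a unit time horizon.
taus = np.full(depth, 1.0 / depth)

x0 = rng.standard_normal(dim)
out = resnet_forward(x0, weights, biases, taus)
print(out.shape)  # (4,)
```

With all tau_l fixed and equal, this reduces to a standard explicit-Euler ResNet; letting the optimizer adjust each tau_l is what distinguishes the variable time-step framework.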

