Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations

10/27/2017
by Yiping Lu, et al.

In our work, we bridge deep neural network design with numerical differential equations. We show that many effective networks, such as ResNet, PolyNet, FractalNet and RevNet, can be interpreted as different numerical discretizations of differential equations. This finding offers a new perspective on the design of effective deep architectures: we can draw on the rich knowledge in numerical analysis to guide the design of new and potentially more effective deep networks. As an example, we propose a linear multi-step architecture (LM-architecture), inspired by the linear multi-step method for solving ordinary differential equations. The LM-architecture is an effective structure that can be applied to any ResNet-like network. In particular, we demonstrate that LM-ResNet and LM-ResNeXt (i.e. the networks obtained by applying the LM-architecture to ResNet and ResNeXt, respectively) achieve noticeably higher accuracy than ResNet and ResNeXt on both CIFAR and ImageNet with comparable numbers of trainable parameters. Moreover, on both CIFAR and ImageNet, LM-ResNet/LM-ResNeXt can significantly compress (>50%) the original networks while maintaining similar performance. This can be explained mathematically using the concept of the modified equation from numerical analysis. Last but not least, we also establish a connection between stochastic control and noise injection in the training process, which helps improve the generalization of the networks. Furthermore, by relating the stochastic training strategy to stochastic dynamical systems, we can easily apply stochastic training to networks with the LM-architecture. As an example, we introduce stochastic depth into LM-ResNet and achieve a significant improvement over the original LM-ResNet on CIFAR10.
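The abstract describes the LM-architecture as a two-step update in which each layer combines the two previous feature states before adding the residual branch. Below is a minimal sketch of that update rule in PyTorch, assuming a basic two-convolution residual branch and a single trainable mixing coefficient per block; the names (LMBlock, residual_branch, k) are illustrative and not taken from the authors' code.

```python
# Sketch of a linear multi-step (LM) residual block:
#   x_{n+1} = (1 - k) * x_n + k * x_{n-1} + G(x_n)
# Setting k = 0 recovers the ordinary one-step ResNet update x_{n+1} = x_n + G(x_n).
import torch
import torch.nn as nn


class LMBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # G: an ordinary two-layer residual branch, as in a basic ResNet block.
        self.residual_branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        # Trainable scalar weighting the two previous states.
        self.k = nn.Parameter(torch.zeros(1))

    def forward(self, x_n: torch.Tensor, x_prev: torch.Tensor):
        # Two-step update inspired by linear multi-step ODE solvers.
        x_next = (1.0 - self.k) * x_n + self.k * x_prev + self.residual_branch(x_n)
        return x_next, x_n  # new state, plus the state to reuse at the next step


if __name__ == "__main__":
    # Usage: chain several blocks, carrying the previous state along.
    blocks = nn.ModuleList([LMBlock(16) for _ in range(3)])
    x_prev = x = torch.randn(2, 16, 32, 32)
    for block in blocks:
        x, x_prev = block(x, x_prev)
    print(x.shape)  # torch.Size([2, 16, 32, 32])
```

Initializing k at zero makes the block behave like a plain ResNet block at the start of training, so the multi-step behavior is learned rather than imposed.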


