Optimization Algorithm Inspired Deep Neural Network Structure Design

10/03/2018
by   Huan Li, et al.

Deep neural networks have been among the dominant machine learning approaches in recent years. Several new network structures have been proposed that outperform the traditional feedforward structure; representative examples include the skip connections in ResNet and the dense connections in DenseNet. However, a unified guiding principle for neural network structure design is still lacking. In this paper, we propose the hypothesis that neural network structure design can be inspired by optimization algorithms, and that a faster optimization algorithm may lead to a better network structure. Specifically, we prove that propagation in a feedforward neural network with the same linear transformation in every layer is equivalent to minimizing some function with the gradient descent algorithm. Based on this observation, we replace gradient descent with the heavy ball algorithm and Nesterov's accelerated gradient descent, which converge faster and inspire new and better network structures. ResNet and DenseNet can be viewed as two special cases of our framework. Numerical experiments on CIFAR-10, CIFAR-100 and ImageNet verify the advantage of our optimization-algorithm-inspired structures over ResNet and DenseNet.
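To make the analogy concrete, the sketch below contrasts the two update rules the abstract refers to. Plain feedforward propagation with a shared linear map, x_{k+1} = phi(W x_k), mirrors a gradient descent step, while the heavy ball algorithm adds a momentum term beta * (x_k - x_{k-1}), which in network terms becomes a skip connection reaching back two layers. This is a minimal NumPy illustration of the correspondence, not the paper's actual architecture; the width, the ReLU nonlinearity, and the value of beta are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((16, 16)) * 0.1  # shared linear transformation (illustrative)

def relu(z):
    return np.maximum(z, 0.0)

def feedforward(x0, depth):
    """Plain feedforward propagation: x_{k+1} = phi(W x_k).

    This is the structure the paper shows to be equivalent to
    minimizing some function with gradient descent."""
    x = x0
    for _ in range(depth):
        x = relu(W @ x)
    return x

def heavy_ball(x0, depth, beta=0.9):
    """Heavy-ball-inspired propagation:
        x_{k+1} = phi(W x_k) + beta * (x_k - x_{k-1}).

    The momentum term beta * (x_k - x_{k-1}) turns into a skip
    connection that combines the two previous layers' activations."""
    x_prev, x = x0, relu(W @ x0)
    for _ in range(depth - 1):
        x_prev, x = x, relu(W @ x) + beta * (x - x_prev)
    return x

x0 = rng.standard_normal(16)
print(feedforward(x0, 5).shape)  # (16,)
print(heavy_ball(x0, 5).shape)   # (16,)
```

With beta = 0, the heavy-ball update reduces to the plain feedforward rule, which is one way to see why structures such as ResNet arise as special cases of the optimization-inspired framework.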


