Log-DenseNet: How to Sparsify a DenseNet

10/30/2017
by Hanzhang Hu, et al.

Skip connections are increasingly utilized by deep neural networks to improve accuracy and cost-efficiency. In particular, the recent DenseNet is efficient in computation and parameters, and achieves state-of-the-art predictions by directly connecting each feature layer to all previous ones. However, DenseNet's extreme connectivity pattern may hinder its scalability to high depths, and in applications like fully convolutional networks, full DenseNet connections are prohibitively expensive. This work first experimentally shows that one key advantage of skip connections is to have short distances among feature layers during backpropagation. Specifically, using a fixed number of skip connections, the connection patterns with shorter backpropagation distance among layers have more accurate predictions. Following this insight, we propose a connection template, Log-DenseNet, which, in comparison to DenseNet, only slightly increases the backpropagation distances among layers from 1 to (1 + log_2 L), but uses only L log_2 L total connections instead of O(L^2). Hence, Log-DenseNets are easier than DenseNets to implement and to scale. We demonstrate the effectiveness of our design principle by showing better performance than DenseNets on tabula rasa semantic segmentation, and competitive results on visual recognition.
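As a rough, hypothetical illustration of the connectivity the abstract describes (not code from the paper), the Python sketch below enumerates log-spaced skip connections: layer i draws input from layers i-1, i-2, i-4, and so on at power-of-two distances, so each layer has about log_2 i inputs and the whole network uses roughly L log_2 L connections instead of the O(L^2) of full DenseNet connectivity. The function name log_dense_inputs and the handling of the very first feature layer are illustrative assumptions, not the paper's exact indexing.

    # Hypothetical sketch of a Log-DenseNet-style connection pattern.
    # Layer i takes input from layers i - 2^k (k = 0, 1, 2, ...) that exist,
    # giving O(log i) inputs per layer. Indexing details are assumptions.

    def log_dense_inputs(i):
        """Return indices of earlier layers feeding layer i (i >= 1)."""
        inputs, step = [], 1
        while i - step >= 0:
            inputs.append(i - step)
            step *= 2
        return inputs

    if __name__ == "__main__":
        L = 16
        total = 0
        for i in range(1, L + 1):
            deps = log_dense_inputs(i)
            total += len(deps)
            print(f"layer {i:2d} <- {deps}")
        # A full DenseNet would instead connect layer i to all i earlier layers,
        # i.e. L*(L+1)/2 = O(L^2) connections in total.
        print(f"total skip connections: {total} "
              f"(full DenseNet would use {L * (L + 1) // 2})")

Under this pattern, any layer can reach any earlier layer through at most about 1 + log_2 L hops during backpropagation, which is the trade-off the abstract highlights: slightly longer gradient paths in exchange for far fewer connections.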


