Sparsely Connected Convolutional Networks

01/18/2018
by   Ligeng Zhu, et al.

Residual learning with skip connections permits training ultra-deep neural networks and achieves superb performance. Building on this direction, DenseNets proposed a densely connected structure in which each layer is directly connected to all of its predecessors. This dense connectivity improves information flow and feature reuse. However, the overly dense skip connections also bring a potential risk of overfitting, parameter redundancy, and large memory consumption. In this work, we analyze the feature aggregation patterns of ResNets and DenseNets under a unified aggregation view. We show that both structures densely gather features from preceding layers in the network but combine them in different ways: summation (ResNets) or concatenation (DenseNets). We compare the strengths and drawbacks of these two aggregation methods and analyze their potential effects on network performance. Based on our analysis, we propose a new structure, SparseNets, which achieves better performance with fewer parameters than both DenseNets and ResNets.
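The two aggregation rules named in the abstract can be sketched in a few lines. This is a minimal toy illustration (not the paper's implementation), treating each layer's output as a NumPy array: ResNet-style aggregation sums predecessor features so the feature dimension stays fixed, while DenseNet-style aggregation concatenates them so the dimension grows with depth.

```python
import numpy as np

# Hypothetical outputs of three preceding layers, each a 4-dim feature vector.
features = [np.full(4, float(i)) for i in range(1, 4)]

# ResNet-style aggregation: element-wise summation of all predecessors.
# The feature dimension is unchanged, so parameters per layer stay constant.
resnet_input = np.sum(features, axis=0)      # shape (4,)

# DenseNet-style aggregation: concatenation of all predecessors.
# The input width grows linearly with depth, which drives the parameter
# redundancy and memory cost discussed in the abstract.
densenet_input = np.concatenate(features)    # shape (12,)

print(resnet_input.shape, densenet_input.shape)
```

The growing input width under concatenation is what motivates sparsifying the connections: keeping only a subset of predecessor links trades a little feature reuse for far fewer parameters.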


