Densely Connected Convolutional Networks

08/25/2016
by Gao Huang, et al.

Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections - one between each layer and its subsequent layer - our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, and substantially reduce the number of parameters. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring less memory and computation to achieve high performance. Code and models are available at https://github.com/liuzhuang13/DenseNet .
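
The connectivity pattern described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration of a single dense block, assuming a BN-ReLU-Conv layer with a fixed growth rate; it is not the authors' reference implementation (that is linked above), and the layer counts and channel sizes are chosen only for the example.

import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    # BN -> ReLU -> 3x3 convolution producing `growth_rate` new feature maps.
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x):
        return self.conv(torch.relu(self.bn(x)))

class DenseBlock(nn.Module):
    # Each layer receives the concatenation of all preceding feature maps.
    def __init__(self, num_layers, in_channels, growth_rate):
        super().__init__()
        self.layers = nn.ModuleList([
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        ])

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # Dense connectivity: concatenate every earlier output along the
            # channel dimension before feeding it to the next layer.
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

# Example usage on a CIFAR-sized input (sizes are illustrative only).
block = DenseBlock(num_layers=4, in_channels=16, growth_rate=12)
out = block(torch.randn(1, 16, 32, 32))
print(out.shape)  # torch.Size([1, 64, 32, 32]): 16 + 4 * 12 channels

Concatenating, rather than summing, the earlier outputs is what gives each layer direct access to all preceding feature-maps and yields the L(L+1)/2 direct connections mentioned in the abstract.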


Related research

01/08/2020
Convolutional Networks with Dense Connectivity
Recent work has shown that convolutional networks can be substantially d...

04/20/2019
ChoiceNet: CNN learning through choice of multiple feature map representations
We introduce a new architecture called ChoiceNet where each layer of the...

10/03/2022
Multipod Convolutional Network
In this paper, we introduce a convolutional network which we call MultiP...

08/24/2018
ParaNet - Using Dense Blocks for Early Inference
DenseNets have been shown to be a competitive model among recent convolu...

04/03/2017
Truncating Wide Networks using Binary Tree Architectures
Recent study shows that a wide deep network can obtain accuracy comparab...

06/05/2018
Exploring Feature Reuse in DenseNet Architectures
Densely Connected Convolutional Networks (DenseNets) have been shown to ...

06/05/2017
Submanifold Sparse Convolutional Networks
Convolutional networks are the de-facto standard for analysing spatio-tem...
