Multi-scale Convolution Aggregation and Stochastic Feature Reuse for DenseNets

10/02/2018
by Mingjie Wang, et al.

Recently, Convolutional Neural Networks (CNNs) have achieved great success in numerous vision tasks. In particular, DenseNets have demonstrated that feature reuse via dense skip connections can effectively alleviate the difficulty of training very deep networks, and that reusing the features generated by the initial layers in all subsequent layers has a strong impact on performance. To feed even richer information into the network, this paper presents a novel adaptive Multi-scale Convolution Aggregation module. Composed of multi-scale convolutions, trainable cross-scale aggregation, maxout, and concatenation, this module is highly non-linear and boosts the accuracy of DenseNet while using far fewer parameters. In addition, owing to its high model complexity, a network with extremely dense feature reuse is prone to overfitting. To address this problem, a regularization method named Stochastic Feature Reuse is also presented. By randomly dropping a set of the reused feature maps for each mini-batch during training, this method reduces training cost and prevents co-adaptation. Experimental results on the CIFAR-10, CIFAR-100, and SVHN benchmarks demonstrate the effectiveness of the proposed methods.
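A minimal PyTorch sketch of the aggregation idea is given below. The kernel sizes, the softmax-normalized per-scale weights, and the choice to concatenate the weighted cross-scale sum with the maxout result are illustrative assumptions, not the paper's exact design.

import torch
import torch.nn as nn

class MultiScaleConvAggregation(nn.Module):
    """Parallel convolutions at several scales, trainable cross-scale
    aggregation, maxout over the scale branches, and concatenation.
    Hypothetical sketch; details may differ from the published module."""

    def __init__(self, in_channels, out_channels, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_channels, out_channels, k, padding=k // 2, bias=False)
            for k in kernel_sizes
        )
        # Trainable cross-scale aggregation weights, one per branch (assumed form).
        self.scale_weights = nn.Parameter(torch.ones(len(kernel_sizes)))

    def forward(self, x):
        feats = torch.stack([b(x) for b in self.branches], dim=0)   # (S, N, C, H, W)
        w = torch.softmax(self.scale_weights, dim=0).view(-1, 1, 1, 1, 1)
        aggregated = (w * feats).sum(dim=0)                          # learned cross-scale sum
        maxout = feats.max(dim=0).values                             # element-wise max over scales
        return torch.cat([aggregated, maxout], dim=1)                # channel-wise concatenation

Stochastic Feature Reuse can be sketched in the same spirit (continuing the imports above): during training, each reused feature group is dropped at random before the dense concatenation, while all features are kept at evaluation time. Zero-masking the dropped groups so the concatenated width stays fixed, and the single drop probability, are assumptions made for this illustration.

class StochasticFeatureReuse(nn.Module):
    """Randomly drops reused feature groups during training; keeps all
    of them at evaluation time. Hypothetical sketch of the regularizer."""

    def __init__(self, drop_prob=0.2):
        super().__init__()
        self.drop_prob = drop_prob

    def forward(self, prior_features):
        # prior_features: list of tensors from earlier layers, each (N, C_i, H, W).
        if self.training and self.drop_prob > 0.0:
            masked = []
            for f in prior_features[:-1]:                  # always keep the newest features
                if torch.rand(1).item() < self.drop_prob:
                    f = torch.zeros_like(f)                # this group is not reused this batch
                masked.append(f)
            masked.append(prior_features[-1])
            prior_features = masked
        return torch.cat(prior_features, dim=1)            # dense concatenation as in DenseNet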

