Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation

10/22/2021
by Jingyu Zhao, et al.

This paper introduces the concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very lightweight module, called recurrent layer aggregation (RLA), which makes use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by our extensive experiments on image classification, object detection and instance segmentation tasks. Specifically, improvements can be uniformly observed on the CIFAR, ImageNet and MS COCO datasets, and the corresponding RLA-Nets can surprisingly boost performance by 2-3% on the object detection task. This demonstrates the power of our RLA module in helping main CNNs better learn structural information in images.
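The recurrent-aggregation idea described above can be illustrated with a toy sketch: a small hidden state is updated once per main-CNN stage using weights that are shared across depth, which is what keeps the module lightweight compared with DenseNet-style aggregation. The sketch below is a hypothetical NumPy simplification for intuition only, not the paper's actual module; all names (`RecurrentLayerAggregation`, `step`, `run_stages`) and the plain fully-connected update are assumptions.

```python
import numpy as np


def relu(x):
    return np.maximum(x, 0.0)


class RecurrentLayerAggregation:
    """Toy RLA sketch: one recurrent state aggregates features layer by
    layer. The weights W_x and W_h are shared across all layers, so the
    parameter count does not grow with depth (hypothetical simplification)."""

    def __init__(self, feat_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W_x = rng.standard_normal((hidden_dim, feat_dim)) * 0.1
        self.W_h = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1

    def step(self, x, h):
        # Fold the current layer's features x into the recurrent state h,
        # reusing the same weights at every depth.
        return relu(self.W_x @ x + self.W_h @ h)


def run_stages(stage_feats, rla, hidden_dim):
    """Apply one recurrent update after each main-CNN stage's features."""
    h = np.zeros(hidden_dim)
    for x in stage_feats:
        h = rla.step(x, h)
    return h
```

In the full architecture the aggregated state would also be fed back to the main network (e.g., concatenated with each stage's feature maps); the sketch only shows the depth-wise recurrence that distinguishes RLA from dense, per-layer concatenation.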


Related research

- Cross-Layer Retrospective Retrieving via Layer Attention (02/08/2023): More and more evidence has shown that strengthening layer interactions c...
- An Attention Module for Convolutional Neural Networks (08/18/2021): Attention mechanism has been regarded as an advanced technique to captur...
- TreeNet: A lightweight One-Shot Aggregation Convolutional Network (09/25/2021): The architecture of deep convolutional networks (CNNs) has evolved for y...
- SlimConv: Reducing Channel Redundancy in Convolutional Neural Networks by Weights Flipping (03/16/2020): The channel redundancy in feature maps of convolutional neural networks ...
- Gated Channel Transformation for Visual Recognition (09/25/2019): In this work, we propose a generally applicable transformation unit for ...
- Tied Block Convolution: Leaner and Better CNNs with Shared Thinner Filters (09/25/2020): Convolution is the main building block of convolutional neural networks ...
- Understanding the Distributions of Aggregation Layers in Deep Neural Networks (07/09/2021): The process of aggregation is ubiquitous in almost all deep nets models....
