Tied Block Convolution: Leaner and Better CNNs with Shared Thinner Filters

09/25/2020
by Xudong Wang, et al.

Convolution is the main building block of convolutional neural networks (CNN). We observe that an optimized CNN often has highly correlated filters as the number of channels increases with depth, reducing the expressive power of feature representations. We propose Tied Block Convolution (TBC), which shares the same thinner filters over equal blocks of channels and produces multiple responses with a single filter. The concept of TBC can also be extended to group convolution and fully connected layers, and can be applied to various backbone networks and attention modules. Our extensive experimentation on classification, detection, instance segmentation, and attention demonstrates TBC's significant across-the-board gain over standard convolution and group convolution. The proposed TiedSE attention module can even use 64 times fewer parameters than the SE module to achieve comparable performance. In particular, standard CNNs often fail to accurately aggregate information in the presence of occlusion and produce multiple redundant partial object proposals. By sharing filters across channels, TBC reduces correlation and can effectively handle highly overlapping instances. TBC increases the average precision for object detection on MS-COCO by 6%. Code will be released.
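The core operation is straightforward to prototype. Below is a minimal PyTorch sketch of the tied block convolution idea as described in the abstract; the class name TiedBlockConv2d and its interface are illustrative assumptions, not the authors' released code. The input channels are split into B equal blocks, the same thin filter bank is applied to every block, and the B responses are concatenated back along the channel axis, using roughly B^2 fewer weights than a standard convolution of the same width.

```python
import torch
import torch.nn as nn


class TiedBlockConv2d(nn.Module):
    """Tied block convolution sketch: one thin filter bank is shared across
    B equal blocks of input channels (illustrative, not the released code)."""

    def __init__(self, in_channels, out_channels, kernel_size, B=2,
                 stride=1, padding=0, bias=True):
        super().__init__()
        assert in_channels % B == 0 and out_channels % B == 0
        self.B = B
        # A single "thin" convolution over 1/B of the channels; its weights are
        # reused for every block, so it holds roughly B^2 fewer parameters than
        # a standard convolution of the same overall width.
        self.conv = nn.Conv2d(in_channels // B, out_channels // B, kernel_size,
                              stride=stride, padding=padding, bias=bias)

    def forward(self, x):
        n, c, h, w = x.shape
        # Fold the B channel blocks into the batch dimension so the shared
        # filters process each block independently.
        x = x.reshape(n * self.B, c // self.B, h, w)
        x = self.conv(x)
        # Unfold: the B responses are concatenated back along the channel axis.
        _, c_out, h_out, w_out = x.shape
        return x.reshape(n, self.B * c_out, h_out, w_out)


# Example: drop-in replacement for a standard 3x3 convolution, with B=2.
layer = TiedBlockConv2d(64, 64, kernel_size=3, B=2, padding=1)
out = layer(torch.randn(8, 64, 32, 32))  # -> torch.Size([8, 64, 32, 32])
```

With B=1 this reduces to a standard convolution; larger B trades parameters for weight sharing across channel blocks, which is the mechanism the abstract credits for reducing filter correlation.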
