Diverse Branch Block: Building a Convolution as an Inception-like Unit

03/24/2021
by Xiaohan Ding, et al.

We propose a universal building block of Convolutional Neural Network (ConvNet) to improve performance without any inference-time costs. The block is named Diverse Branch Block (DBB); it enhances the representational capacity of a single convolution by combining diverse branches of different scales and complexities to enrich the feature space, including sequences of convolutions, multi-scale convolutions, and average pooling. After training, a DBB can be equivalently converted into a single conv layer for deployment. Unlike novel ConvNet architectures, DBB complicates the training-time microstructure while maintaining the macro architecture, so that it can be used as a drop-in replacement for regular conv layers of any architecture. In this way, the model can be trained to reach a higher level of performance and then transformed into the original inference-time structure for deployment. DBB improves ConvNets on image classification (up to 1.9% higher top-1 accuracy on ImageNet), object detection and semantic segmentation. The PyTorch code and models are released at https://github.com/DingXiaoH/DiverseBranchBlock.
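The inference-time conversion rests on the linearity of convolution: parallel branches with the same stride and output size can be summed kernel-wise into one kernel, and non-convolutional branches such as average pooling have exact convolutional forms. The snippet below is a minimal PyTorch sketch of that idea; it is not the released DBB code (which additionally folds batch normalization and sequential 1x1-KxK branches), and the channel count, kernel size, and layer names are illustrative assumptions.

```python
# Minimal, illustrative sketch (not the authors' released implementation).
# It fuses three parallel branches -- a KxK conv, a 1x1 conv, and KxK average
# pooling -- into one KxK conv, assuming stride 1 and "same" padding throughout.
import torch
import torch.nn as nn
import torch.nn.functional as F

C, K = 8, 3                                    # assumed channel count / kernel size
x = torch.randn(2, C, 16, 16)

conv_kxk = nn.Conv2d(C, C, K, padding=K // 2)  # training-time branch 1
conv_1x1 = nn.Conv2d(C, C, 1)                  # training-time branch 2

def branches(inp):
    """Training-time forward pass: sum of the three diverse branches."""
    return (conv_kxk(inp)
            + conv_1x1(inp)
            + F.avg_pool2d(inp, K, stride=1, padding=K // 2))

with torch.no_grad():
    # 1x1 kernel zero-padded to KxK: it only occupies the centre tap.
    k_1x1 = F.pad(conv_1x1.weight, [K // 2] * 4)
    # Average pooling written as a KxK conv: 1/(K*K) on matching channels only.
    k_avg = torch.zeros(C, C, K, K)
    k_avg[torch.arange(C), torch.arange(C)] = 1.0 / (K * K)

    fused = nn.Conv2d(C, C, K, padding=K // 2)  # single deploy-time conv
    fused.weight.copy_(conv_kxk.weight + k_1x1 + k_avg)
    fused.bias.copy_(conv_kxk.bias + conv_1x1.bias)

print(torch.allclose(branches(x), fused(x), atol=1e-5))  # expected: True
```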


Related research

RepVGG: Making VGG-style ConvNets Great Again (01/11/2021)
We present a simple but powerful architecture of convolutional neural ne...

HS-ResNet: Hierarchical-Split Block on Convolutional Neural Network (10/15/2020)
This paper addresses representational block named Hierarchical-Split Blo...

Hyper-Convolutions via Implicit Kernels for Medical Imaging (02/06/2022)
The convolutional neural network (CNN) is one of the most commonly used ...

MixNet: Mixed Depthwise Convolutional Kernels (07/22/2019)
Depthwise convolution is becoming increasingly popular in modern efficie...

ACNet: Strengthening the Kernel Skeletons for Powerful CNN via Asymmetric Convolution Blocks (08/11/2019)
As designing appropriate Convolutional Neural Network (CNN) architecture...

Deformably-Scaled Transposed Convolution (10/17/2022)
Transposed convolution is crucial for generating high-resolution outputs...

Data-Driven Neuron Allocation for Scale Aggregation Networks (04/20/2019)
Successful visual recognition networks benefit from aggregating informat...
