DyNet: Dynamic Convolution for Accelerating Convolutional Neural Networks

04/22/2020
by   Yikang Zhang, et al.

The convolution operator is the core of convolutional neural networks (CNNs) and accounts for most of their computation cost. To make CNNs more efficient, many methods have been proposed to either design lightweight networks or compress models. Although efficient network structures such as MobileNet and ShuffleNet have been proposed, we find that redundant information still exists between convolution kernels. To address this issue, we propose a novel dynamic convolution method that adaptively generates convolution kernels based on image content. To demonstrate its effectiveness, we apply dynamic convolution to multiple state-of-the-art CNNs. On one hand, the computation cost can be reduced remarkably while performance is maintained: for ShuffleNetV2/MobileNetV2/ResNet18/ResNet50, DyNet reduces 37.0/54.7/67.2/71.3% of FLOPs without loss of accuracy. On the other hand, performance can be largely boosted if the computation cost is maintained: based on the MobileNetV3-Small/Large architectures, DyNet achieves 70.3/77.1% Top-1 accuracy on ImageNet, an improvement of 2.9/1.9%. To verify its scalability, we also apply DyNet to a segmentation task; the results show that DyNet can reduce 69.3% of FLOPs while maintaining the Mean IoU.
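The core idea behind dynamic convolution is to replace a fixed kernel with one generated on the fly from the input. A common way to realize this is to keep a small bank of candidate kernels and fuse them with content-dependent coefficients predicted from globally pooled features. The sketch below illustrates that general pattern in PyTorch; the class name DynamicConv2d, the number of candidate kernels, and the softmax coefficient branch are illustrative assumptions and not the exact design used by DyNet.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicConv2d(nn.Module):
    """Fuses a bank of K candidate kernels into one input-dependent kernel per sample.
    Illustrative sketch of kernel-aggregation dynamic convolution, not the paper's code."""

    def __init__(self, in_ch, out_ch, kernel_size=3, num_kernels=4, padding=1):
        super().__init__()
        self.out_ch = out_ch
        self.padding = padding
        # Bank of K candidate kernels, combined at run time by predicted coefficients.
        self.weight = nn.Parameter(
            torch.randn(num_kernels, out_ch, in_ch, kernel_size, kernel_size) * 0.01)
        # Lightweight coefficient branch: global average pooling + FC, one weight per kernel.
        self.coef = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_ch, num_kernels),
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        alpha = self.coef(x)  # (B, K) content-dependent coefficients
        # Fuse the kernel bank into one kernel per sample: (B, out_ch, in_ch, k, k)
        w_dyn = torch.einsum('bk,koihw->boihw', alpha, self.weight)
        # Grouped-convolution trick: fold the batch into groups so each sample is
        # convolved with its own fused kernel in a single conv2d call.
        x = x.reshape(1, b * c, h, w)
        w_dyn = w_dyn.reshape(b * self.out_ch, c, *w_dyn.shape[-2:])
        y = F.conv2d(x, w_dyn, padding=self.padding, groups=b)
        return y.reshape(b, self.out_ch, y.shape[-2], y.shape[-1])

As a hypothetical usage check, DynamicConv2d(16, 32)(torch.randn(8, 16, 56, 56)) returns a tensor of shape (8, 32, 56, 56); treating the batch as convolution groups is only an implementation convenience that lets every image use its own fused kernel in one conv2d call.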


