DyNet: Dynamic Convolution for Accelerating Convolutional Neural Networks

04/22/2020
by Yikang Zhang et al.

The convolution operator is the core of convolutional neural networks (CNNs) and accounts for most of their computation cost. To make CNNs more efficient, many methods have been proposed either to design lightweight networks or to compress models. Although some efficient network structures, such as MobileNet or ShuffleNet, have been proposed, we find that redundant information still exists between convolution kernels. To address this issue, we propose a novel dynamic convolution method that adaptively generates convolution kernels based on image content. To demonstrate its effectiveness, we apply dynamic convolution to multiple state-of-the-art CNNs. On one hand, the computation cost can be reduced remarkably while maintaining performance: for ShuffleNetV2/MobileNetV2/ResNet18/ResNet50, DyNet can reduce FLOPs by 37.0%/54.7%/67.2%/71.3%. On the other hand, performance can be largely boosted if the computation cost is maintained: based on the MobileNetV3-Small/Large architecture, DyNet achieves 70.3%/77.1% accuracy on ImageNet, an improvement of 2.9%/1.9%. To verify scalability, we also apply DyNet to a segmentation task; the results show that DyNet can reduce FLOPs by 69.3%.
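The core idea described above, predicting coefficients from the input and fusing several fixed kernel bases into a single input-dependent kernel before applying one convolution, can be sketched in plain Python. This is a minimal single-channel illustration, not the paper's exact design: the names `bases` and `coef_weights`, the global-average-pool context, and the softmax coefficient generator are illustrative assumptions.

```python
import math

def global_avg_pool(x):
    # x: H x W single-channel feature map -> one scalar summarizing image content
    return sum(sum(row) for row in x) / (len(x) * len(x[0]))

def softmax(v):
    m = max(v)
    e = [math.exp(a - m) for a in v]
    s = sum(e)
    return [a / s for a in e]

def dynamic_conv2d(x, bases, coef_weights):
    """Fuse K fixed 3x3 kernel bases into one input-dependent kernel,
    then apply a single valid convolution. Because fusion happens on the
    small kernels, the convolution cost stays independent of K."""
    ctx = global_avg_pool(x)
    # hypothetical tiny coefficient generator: one linear unit per basis
    coeffs = softmax([w * ctx for w in coef_weights])
    kernel = [[sum(c * b[i][j] for c, b in zip(coeffs, bases))
               for j in range(3)] for i in range(3)]
    H, W = len(x), len(x[0])
    out = [[sum(kernel[di][dj] * x[i + di][j + dj]
                for di in range(3) for dj in range(3))
            for j in range(W - 2)] for i in range(H - 2)]
    return out, coeffs
```

Note the design point this sketch captures: only one convolution is executed per position regardless of how many bases are fused, which is why generating kernels dynamically can cut FLOPs relative to running multiple static convolutions.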
