Learning Versatile Convolution Filters for Efficient Visual Recognition

09/20/2021
by Kai Han, et al.

This paper introduces versatile filters for constructing efficient convolutional neural networks, which are widely used in various visual recognition tasks. Considering the demand for efficient deep learning techniques running on cost-effective hardware, a number of methods have been developed to learn compact neural networks. Most of these works aim to slim down filters in different ways, e.g., investigating small, sparse, or quantized filters. In contrast, we treat filters from an additive perspective: a series of secondary filters can be derived from a primary filter with the help of binary masks. These secondary filters are all embedded in the primary filter and occupy no additional storage, but once unfolded during computation they can significantly enhance the capability of the filter by integrating information extracted from different receptive fields. Besides spatial versatile filters, we additionally investigate versatile filters from the channel perspective, where binary masks can be further customized for different primary filters under orthogonal constraints. We conduct a theoretical analysis of network complexity and introduce an efficient convolution scheme. Experimental results on benchmark datasets and neural networks demonstrate that our versatile filters achieve accuracy comparable to that of the original filters while requiring less memory and computation.
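As a rough illustration of the spatial versatile filter idea described above, the sketch below shows one way secondary filters could be derived from a single stored primary filter through concentric binary masks, with their responses aggregated at computation time. The class name SpatialVersatileConv2d, the concentric mask design, and the summation used to integrate the different receptive fields are assumptions made for this example, not the authors' reference implementation (which also covers channel-wise masks under orthogonal constraints).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialVersatileConv2d(nn.Module):
    """Illustrative sketch (not the paper's reference code): one primary
    d x d filter bank is reused through concentric binary masks, yielding
    secondary filters with smaller receptive fields at no extra storage."""

    def __init__(self, in_channels, out_channels, kernel_size=5, padding=2):
        super().__init__()
        assert kernel_size % 2 == 1, "odd kernel assumed for concentric masks"
        # Only the primary filter weights are stored and learned.
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size) * 0.01
        )
        self.padding = padding
        # Concentric binary masks: mask 0 keeps the full d x d support,
        # mask 1 keeps the central (d-2) x (d-2) region, and so on.
        k = kernel_size
        masks = []
        for i in range(k // 2 + 1):
            m = torch.zeros(k, k)
            m[i:k - i, i:k - i] = 1.0
            masks.append(m)
        self.register_buffer("masks", torch.stack(masks))  # shape (s, k, k)

    def forward(self, x):
        outs = []
        for m in self.masks:
            # Each secondary filter is the primary weight multiplied by a
            # binary mask; no additional filter parameters are introduced.
            outs.append(F.conv2d(x, self.weight * m, padding=self.padding))
        # Integrate information from the different receptive fields
        # (summation here; concatenation is another plausible choice).
        return torch.stack(outs, dim=0).sum(dim=0)


# Usage example (hypothetical shapes): a 5x5 primary filter bank applied
# to a 32x32 RGB input, producing 16 output channels.
layer = SpatialVersatileConv2d(3, 16, kernel_size=5, padding=2)
y = layer(torch.randn(1, 3, 32, 32))  # -> (1, 16, 32, 32)
```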


Related research

08/06/2019  Full-Stack Filters to Build Minimum Viable CNNs
10/06/2020  Compressing Deep Convolutional Neural Networks by Stacking Low-dimensional Binary Convolution Filters
02/02/2021  Orientation Convolutional Networks for Image Recognition
08/17/2021  Adaptive Convolutions with Per-pixel Dynamic Filter Atom
10/28/2018  Distilling Critical Paths in Convolutional Neural Networks
12/31/2019  AdderNet: Do We Really Need Multiplications in Deep Learning?
11/26/2018  Leveraging Filter Correlations for Deep Model Compression
