CircConv: A Structured Convolution with Low Complexity

02/28/2019
by   Siyu Liao, et al.

Deep neural networks (DNNs), especially deep convolutional neural networks (CNNs), have emerged as powerful techniques in a variety of machine learning applications. However, the large model sizes of DNNs place high demands on computational resources and weight storage, limiting their practical deployment. To overcome these limitations, this paper proposes imposing a circulant structure on the construction of convolutional layers, yielding circulant convolutional layers (CircConvs) and circulant CNNs. The circulant-structured models can be either trained from scratch or re-trained from a pre-trained non-circulant model, making the approach flexible across training settings. Extensive experiments show that this strong structure-imposing approach substantially reduces the number of parameters in convolutional layers and enables significant savings in computational cost via fast multiplication of the circulant tensor.
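The computational saving comes from a classical property: a circulant matrix is diagonalized by the discrete Fourier transform, so multiplying by it costs O(n log n) via the FFT instead of O(n^2) for a dense matrix. The abstract does not include code, so the snippet below is only a minimal NumPy sketch of this underlying property, not the paper's CircConv implementation; the function name `circulant_matvec` and the small example are illustrative assumptions.

```python
import numpy as np


def circulant_matvec(c, x):
    """Multiply the circulant matrix whose first column is c by vector x.

    Because the DFT diagonalizes circulant matrices, the product equals
    the circular convolution of c and x, computable in O(n log n) via FFT.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))


# Sanity check against an explicitly materialized dense circulant matrix.
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
x = rng.standard_normal(n)

# Column j of a circulant matrix is the first column cyclically shifted by j.
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)

assert np.allclose(circulant_matvec(c, x), C @ x)
```

The same idea extends blockwise: when each weight block of a layer is circulant, the layer stores only one vector per block (n parameters instead of n^2) and applies it with FFTs, which is the source of both the parameter reduction and the speed-up described above.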

