Block Walsh-Hadamard Transform Based Binary Layers in Deep Neural Networks

01/07/2022
by Hongyi Pan, et al.

Convolution is the core operation of modern deep neural networks. It is well known that convolutions can be implemented in the Fourier transform domain. In this paper, we propose to use the binary block Walsh-Hadamard transform (WHT) instead of the Fourier transform. We use WHT-based binary layers to replace some of the regular convolution layers in deep neural networks, utilizing both one-dimensional (1-D) and two-dimensional (2-D) binary WHTs. In both the 1-D and 2-D layers, we compute the binary WHT of the input feature map and denoise the WHT-domain coefficients using a nonlinearity obtained by combining soft-thresholding with the tanh function. After denoising, we compute the inverse WHT. We use 1D-WHT layers to replace the 1×1 convolutional layers, while 2D-WHT layers can replace the 3×3 convolution layers and Squeeze-and-Excite layers. 2D-WHT layers with trainable weights can also be inserted before the Global Average Pooling (GAP) layers to assist the dense layers. In this way, we can reduce the number of trainable parameters significantly with only a slight decrease in accuracy. In this paper, we insert the WHT layers into MobileNet-V2, MobileNet-V3-Large, and ResNet to reduce the number of parameters significantly with negligible accuracy loss. Moreover, in our speed tests on an NVIDIA Jetson Nano, the 2D-FWHT layer runs about 24 times as fast as a regular 3×3 convolution while using 19.51% less RAM.
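Since the abstract describes the layer only in words, a minimal PyTorch sketch of the 1-D pipeline may help: forward WHT along the channel axis, a smooth-thresholding nonlinearity, then the inverse WHT. Everything below is our assumption about a plausible implementation, not the authors' code: the names `fwht`, `WHT1D`, and `threshold` are hypothetical, and we assume the soft-thresholding/tanh combination takes the form tanh(x)·ReLU(|x| − T) with a trainable per-channel threshold T.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fwht(x: torch.Tensor) -> torch.Tensor:
    """Fast Walsh-Hadamard transform along the last axis.

    The last-axis length must be a power of two. The WHT matrix has only
    +1/-1 entries, so the transform needs additions and subtractions only.
    """
    n = x.shape[-1]
    shape = x.shape
    h = 1
    while h < n:
        y = x.reshape(-1, n // (2 * h), 2, h)
        a, b = y[:, :, 0, :], y[:, :, 1, :]
        x = torch.stack((a + b, a - b), dim=2).reshape(shape)
        h *= 2
    return x

class WHT1D(nn.Module):
    """Sketch of a 1D-WHT binary layer applied along the channel axis,
    as a stand-in for a 1x1 convolution (class/parameter names are ours)."""

    def __init__(self, channels: int):
        super().__init__()
        assert channels & (channels - 1) == 0, "channels must be a power of two"
        self.channels = channels
        self.threshold = nn.Parameter(torch.zeros(channels))  # trainable T

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, C, H, W)
        z = x.permute(0, 2, 3, 1)          # channels last: (N, H, W, C)
        z = fwht(z)                        # forward WHT over channels
        # assumed smooth-thresholding: tanh replaces sign() in soft-thresholding
        z = torch.tanh(z) * F.relu(z.abs() - self.threshold)
        z = fwht(z) / self.channels        # WHT is self-inverse up to 1/n
        return z.permute(0, 3, 1, 2)       # back to (N, C, H, W)
```

Under these assumptions the parameter savings are easy to see: with 64 channels, the layer holds 64 trainable thresholds, versus 64 × 64 = 4,096 weights for the 1×1 convolution it replaces, since the transform itself is weight-free.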

Related research

04/14/2021 · Fast Walsh-Hadamard Transform and Smooth-Thresholding Based Binary Layers in Deep Neural Networks
In this paper, we propose a novel layer based on fast Walsh-Hadamard tra...

03/22/2016 · Convolution in Convolution for Network in Network
Network in Network (NiN) is an effective instance and an important exten...

03/13/2023 · Orthogonal Transform Domain Approaches for the Convolutional Layer
In this paper, we propose a set of transform-based neural network layers...

10/11/2020 · Efficient Long-Range Convolutions for Point Clouds
The efficient treatment of long-range interactions for point clouds is a...

05/27/2023 · A Hybrid Quantum-Classical Approach based on the Hadamard Transform for the Convolutional Layer
In this paper, we propose a novel Hadamard Transform (HT)-based neural n...

09/07/2018 · Accelerating Deep Neural Networks with Spatial Bottleneck Modules
This paper presents an efficient module named spatial bottleneck for acc...

09/06/2018 · ProdSumNet: reducing model parameters in deep neural networks via product-of-sums matrix decompositions
We consider a general framework for reducing the number of trainable mod...
