A Hybrid Quantum-Classical Approach based on the Hadamard Transform for the Convolutional Layer

05/27/2023
by Hongyi Pan et al.

In this paper, we propose a novel Hadamard Transform (HT)-based neural network layer for hybrid quantum-classical computing. It implements regular convolutional layers in the Hadamard transform domain. The idea is based on the HT convolution theorem, which states that the dyadic convolution of two vectors is equivalent to the element-wise multiplication of their HT representations. Because computing the HT amounts to applying a Hadamard gate to each qubit individually, the HT computations of our proposed layer can be implemented on a quantum computer. Compared to a regular Conv2D layer, the proposed HT-perceptron layer is computationally more efficient. On the MNIST dataset, our HT network reaches 99.31% test accuracy with 57.1% fewer MACs than a CNN with the same number of trainable parameters, which reaches 99.26%; in our ImageNet-1K experiments, our HT-based ResNet-50 exceeds the baseline ResNet-50 by 0.59% in center-crop top-1 accuracy while using 11.5% fewer parameters and 12.6% fewer MACs.
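The HT convolution theorem the abstract relies on can be checked directly in a few lines. Below is a minimal NumPy sketch (not the authors' code; the function names are ours) that computes an unnormalized fast Walsh-Hadamard transform and verifies that the HT of the dyadic (XOR) convolution of two vectors equals the element-wise product of their HTs.

```python
import numpy as np

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform, O(n log n).
    The input length must be a power of two."""
    x = np.array(x, dtype=float)
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                # In-place butterfly: (a, b) -> (a + b, a - b).
                x[j], x[j + h] = x[j] + x[j + h], x[j] - x[j + h]
        h *= 2
    return x

def dyadic_conv(u, v):
    """Dyadic (XOR) convolution: w[k] = sum_m u[m] * v[k XOR m]."""
    n = len(u)
    return np.array([sum(u[m] * v[k ^ m] for m in range(n)) for k in range(n)])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(8), rng.standard_normal(8)

# HT convolution theorem: HT(u dyadic-conv v) == HT(u) * HT(v) element-wise.
assert np.allclose(fwht(dyadic_conv(u, v)), fwht(u) * fwht(v))

# Quantum connection: fwht(x) / sqrt(len(x)) is exactly what applying a
# Hadamard gate to each of the log2(n) qubits does to the amplitude vector x.
```

This is what lets the proposed layer replace a convolution with a trainable element-wise multiplication in the HT domain, with the transforms themselves offloadable to a quantum device.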

Related research

DCT Perceptron Layer: A Transform Domain Approach for Convolution Layer (11/15/2022)
In this paper, we propose a novel Discrete Cosine Transform (DCT)-based ...

Orthogonal Transform Domain Approaches for the Convolutional Layer (03/13/2023)
In this paper, we propose a set of transform-based neural network layers...

Block Walsh-Hadamard Transform Based Binary Layers in Deep Neural Networks (01/07/2022)
Convolution has been the core operation of modern deep neural networks. ...

Recurrent Parameter Generators (07/15/2021)
We present a generic method for recurrently using the same parameters fo...

Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks with Quantum Computation (01/27/2023)
Ridgelet transform has been a fundamental mathematical tool in the theor...

ProdSumNet: reducing model parameters in deep neural networks via product-of-sums matrix decompositions (09/06/2018)
We consider a general framework for reducing the number of trainable mod...

Bipolar Morphological Neural Networks: Convolution Without Multiplication (11/05/2019)
In the paper we introduce a novel bipolar morphological neuron and bipol...
