QKSAN: A Quantum Kernel Self-Attention Network

08/25/2023
by Ren-xin Zhao, et al.

The Self-Attention Mechanism (SAM) excels at extracting important information from within data, improving the computational efficiency of models. Nevertheless, many Quantum Machine Learning (QML) models lack SAM's ability to discern the intrinsic connections within information, which limits their effectiveness on massive, high-dimensional quantum data. To address this issue, a Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced, combining the data-representation benefits of Quantum Kernel Methods (QKM) with the efficient information-extraction capability of SAM. A Quantum Kernel Self-Attention Network (QKSAN) framework is built on QKSAM using the Deferred Measurement Principle (DMP) and conditional measurement techniques, which release half of the quantum resources through probabilistic measurements during computation. The Quantum Kernel Self-Attention Score (QKSAS) determines the measurement conditions and reflects the probabilistic nature of quantum systems. Finally, four QKSAN models are deployed on the PennyLane platform to perform binary classification on MNIST images, and the best-performing of the four is assessed for noise immunity and learning ability. Remarkably, a potential learning advantage of some QKSAN models over classical deep learning is that they require few parameters to achieve 98% ± 1% test and training accuracy, even on highly compressed images. QKSAN lays a foundation for future quantum computers to perform machine learning on massive amounts of data, while driving advances in areas such as quantum Natural Language Processing (NLP).
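The abstract does not specify the circuit construction, so the following is only a minimal sketch of the core idea: a quantum kernel, estimated as the overlap between two data-encoding states, used as a pairwise self-attention score between input tokens. The AngleEmbedding encoding, the fidelity-style kernel circuit, and the row-wise softmax normalization are illustrative assumptions, not the paper's QKSAM construction.

```python
# Minimal sketch (assumptions as noted above): a fidelity-style quantum
# kernel used as a pairwise self-attention score, estimated with PennyLane.
import numpy as np
import pennylane as qml

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    # Encode x1, then apply the inverse encoding of x2; the probability of
    # measuring |0...0> equals the squared overlap |<phi(x2)|phi(x1)>|^2.
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def attention_scores(tokens):
    # Pairwise kernel values, softmax-normalized per row so that each
    # token's attention weights over all tokens sum to one.
    n = len(tokens)
    k = np.array([[kernel_circuit(tokens[i], tokens[j])[0]
                   for j in range(n)] for i in range(n)])
    e = np.exp(k)
    return e / e.sum(axis=1, keepdims=True)

tokens = np.random.uniform(0, np.pi, size=(4, n_qubits))  # four toy "tokens"
print(attention_scores(tokens))
```

Note that the paper's actual framework additionally relies on the Deferred Measurement Principle and conditional mid-circuit measurements to release half of the quantum resources during computation, which this classically post-processed sketch does not capture.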

