QSAN: A Near-term Achievable Quantum Self-Attention Network

07/14/2022
by Ren-xin Zhao, et al.

The Self-Attention Mechanism (SAM), an important component of machine learning, has received relatively little investigation in quantum machine learning. Inspired by the Variational Quantum Algorithm (VQA) framework and SAM, a Quantum Self-Attention Network (QSAN) that can be implemented on near-term quantum computers is proposed. Theoretically, a Quantum Self-Attention Mechanism (QSAM), a novel interpretation of SAM based on linearization and logicalization, is defined. Within it, Quantum Logical Similarity (QLS) is first introduced to enable more efficient execution of QSAM on quantum computers by replacing inner-product operations with logical operations, and a QLS-based density matrix, the Quantum Bit Self-Attention Score Matrix (QBSASM), is then derived to represent the output distribution effectively. Moreover, QSAN is implemented within the QSAM framework, and its practical quantum circuit is designed with five modules. Finally, QSAN is tested on a quantum computer with a small data sample. The experimental results show that QSAN converges faster under the quantum natural gradient descent framework and reassigns weights to word vectors. This demonstrates that QSAN can provide attention with quantum characteristics more quickly, laying a foundation for Quantum Natural Language Processing (QNLP).
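To make the core idea behind QLS concrete, below is a minimal classical sketch, not the paper's implementation: it contrasts the standard inner-product attention score of SAM with a hypothetical bitwise "logical similarity" computed by XNOR agreement on binarised token encodings. The function names, the binarisation, and the XNOR-based similarity are illustrative assumptions; the paper's actual QLS is defined on qubits within the QSAM framework.

```python
# Sketch only: classical inner-product attention scores vs. a toy "logical
# similarity" in the spirit of QLS, where dot products are replaced by
# logical operations. Names and the XNOR rule are illustrative assumptions.
import numpy as np

def classical_attention_scores(Q, K):
    """Standard SAM scores: scaled dot products between queries and keys."""
    d_k = Q.shape[-1]
    return (Q @ K.T) / np.sqrt(d_k)

def logical_similarity(q_bits, k_bits):
    """Toy stand-in for QLS: fraction of bit positions where query and key
    agree, computed with a bitwise XNOR instead of an inner product."""
    agreement = ~(q_bits ^ k_bits) & 1   # XNOR per bit (1 where bits match)
    return agreement.sum(axis=-1) / q_bits.shape[-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((4, 8))      # 4 tokens, 8-dimensional queries
    K = rng.standard_normal((4, 8))
    print(classical_attention_scores(Q, K))

    q_bits = rng.integers(0, 2, (4, 8))  # binarised token encodings
    k_bits = rng.integers(0, 2, (4, 8))
    scores = np.array([[logical_similarity(q, k) for k in k_bits]
                       for q in q_bits])
    print(scores)
```

The point of the contrast is that the logical variant needs only bitwise operations per pair of encodings, which is the kind of operation the abstract argues maps more naturally onto quantum hardware than inner products.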

Related research

QKSAN: A Quantum Kernel Self-Attention Network (08/25/2023)
Self-Attention Mechanism (SAM) is skilled at extracting important inform...

Quantum Self-Attention Neural Networks for Text Classification (05/11/2022)
An emerging direction of quantum computing is to establish meaningful qu...

QNet: A Quantum-native Sequence Encoder Architecture (10/31/2022)
This work investigates how current quantum computers can improve the per...

Attention-based Quantum Tomography (06/22/2020)
With rapid progress across platforms for quantum systems, the problem of...

Not All Attention Is All You Need (04/10/2021)
Self-attention based models have achieved remarkable success in natural ...

Easy attention: A simple self-attention mechanism for Transformers (08/24/2023)
To improve the robustness of transformer neural networks used for tempor...

Parallel Scheduling Self-attention Mechanism: Generalization and Optimization (12/02/2020)
Over the past few years, self-attention is shining in the field of deep ...
