Quantum Self-Attention Neural Networks for Text Classification

05/11/2022
by Guangxi Li et al.

An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in Quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a simple new network architecture, called the quantum self-attention neural network (QSANN), which overcomes these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and use a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In numerical experiments on public text classification data sets, QSANN outperforms both the best existing syntax-based QNLP model and a simple classical self-attention neural network. We further show that our method is robust to low-level quantum noise.
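The abstract summarizes rather than specifies the mechanism, so the following is only a minimal classical sketch of what a "Gaussian projected" self-attention could look like: scalar query/key projections (standing in for expectation values measured from parameterized quantum circuits) with a Gaussian of their difference replacing the usual softmax over inner products. The function name, shapes, and the scalar-projection choice are assumptions of this sketch, not the paper's implementation.

```python
# Minimal classical sketch of Gaussian-projected self-attention.
# Assumption: queries and keys are scalars per token (in QSANN they would
# be measured expectation values of quantum circuits); attention weights
# are exp(-(q_s - k_j)^2), normalized per query position, instead of the
# standard softmax over inner products.
import numpy as np

def gaussian_self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d) token embeddings; w_q, w_k: (d,) projections
    yielding scalar queries/keys; w_v: (d, d) value projection."""
    q = x @ w_q                                   # (seq_len,) scalar queries
    k = x @ w_k                                   # (seq_len,) scalar keys
    v = x @ w_v                                   # (seq_len, d) values
    logits = -(q[:, None] - k[None, :]) ** 2      # (seq_len, seq_len)
    alpha = np.exp(logits)
    alpha /= alpha.sum(axis=1, keepdims=True)     # row-normalize weights
    return alpha @ v                              # (seq_len, d) outputs

# Toy usage on random data
rng = np.random.default_rng(0)
seq_len, d = 5, 8
x = rng.normal(size=(seq_len, d))
out = gaussian_self_attention(x, rng.normal(size=d),
                              rng.normal(size=d), rng.normal(size=(d, d)))
print(out.shape)  # (5, 8)
```

The NumPy stand-ins above only illustrate the Gaussian weighting itself; in QSANN the query, key, and value quantities would come from measurements on near-term quantum hardware.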

Related research

QSAN: A Near-term Achievable Quantum Self-Attention Network (07/14/2022)
Self-Attention Mechanism (SAM), an important component of machine learni...

A Self-Attention Ansatz for Ab-initio Quantum Chemistry (11/24/2022)
We present a novel neural network architecture using self-attention, the...

QKSAN: A Quantum Kernel Self-Attention Network (08/25/2023)
Self-Attention Mechanism (SAM) is skilled at extracting important inform...

Evaluating quantum generative models via imbalanced data classification benchmarks (08/21/2023)
A limited set of tools exist for assessing whether the behavior of quant...

QNet: A Quantum-native Sequence Encoder Architecture (10/31/2022)
This work investigates how current quantum computers can improve the per...

Composition, Attention, or Both? (10/24/2022)
In this paper, we propose a novel architecture called Composition Attent...

AttViz: Online exploration of self-attention for transparent neural language modeling (05/12/2020)
Neural language models are becoming the prevailing methodology for the t...
