QNet: A Quantum-native Sequence Encoder Architecture

10/31/2022
by Wei Day, et al.

This work investigates how current quantum computers can improve the performance of natural language processing tasks. To this end, we propose QNet, a novel sequence encoder model that runs inference entirely on a quantum computer using a minimal number of qubits. QNet is inspired by the Transformer, the state-of-the-art neural network model that uses an attention mechanism to relate tokens. Whereas the attention mechanism requires O(n^2 · d) time to perform its matrix multiplication operations, QNet has merely O(n + d) quantum circuit depth, where n and d denote the sequence length and the embedding size, respectively. To employ QNet on NISQ devices, we introduce ResQNet, a quantum-classical hybrid model composed of several QNet blocks linked by residual connections. We evaluate ResQNet on various natural language processing tasks, including text classification, rating score prediction, and named entity recognition. ResQNet exhibits a 6% to 818% performance gain over classical state-of-the-art models using the same embedding dimensions. In summary, this work demonstrates the advantage of quantum computing in natural language processing tasks.
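The hybrid composition described in the abstract can be made concrete with a minimal sketch. The Python below is purely illustrative and is not the authors' implementation: qnet_block is a classical placeholder for one QNet block (which, per the paper, runs as a quantum circuit of depth O(n + d)), and all function names, shapes, and parameters are assumptions introduced here for illustration.

```python
import numpy as np

def qnet_block(x, params):
    """Classical stand-in for one QNet block.

    In the paper this computation is a quantum circuit of depth
    O(n + d) for a length-n sequence with embedding size d; here a
    simple parameterized nonlinearity serves as a placeholder.
    """
    # x: (n, d) token embeddings; params: (d, d) mixing matrix
    return np.tanh(x @ params)

def resqnet(x, blocks):
    """Hybrid composition: QNet blocks linked by residual connections."""
    for params in blocks:
        x = x + qnet_block(x, params)  # residual connection on the classical host
    return x

# Toy usage: n = 8 tokens, embedding size d = 4, three blocks
rng = np.random.default_rng(0)
n, d = 8, 4
x = rng.normal(size=(n, d))
blocks = [rng.normal(scale=0.1, size=(d, d)) for _ in range(3)]
out = resqnet(x, blocks)
print(out.shape)  # (8, 4)
```

The residual connections let the classical host carry each block's input forward and add in its output, which is the structural idea behind linking several QNet blocks into ResQNet for NISQ hardware.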


