Quantum Vision Transformers

09/16/2022
by El Amine Cherrat, et al.

We design and analyse quantum transformers, extending the state-of-the-art classical transformer neural network architectures that have shown strong performance in natural language processing and image analysis. Building upon previous work on parametrised quantum circuits for data loading and orthogonal neural layers, we introduce three quantum attention mechanisms, including a quantum transformer based on compound matrices. These quantum architectures can be built using shallow quantum circuits and can provide qualitatively different classification models. We performed extensive simulations of the quantum transformers on standard medical image datasets, which showed competitive, and at times better, performance compared with the best classical transformers and other classical benchmarks. The computational complexity of our quantum attention layer proves advantageous compared with the classical algorithm with respect to the size of the classified images. Our quantum architectures use thousands of parameters, compared with millions for the best classical methods. Finally, we implemented our quantum transformers on superconducting quantum computers and obtained encouraging results for experiments with up to six qubits.
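The orthogonal neural layers the abstract builds on are typically realised as circuits of two-qubit rotation (RBS) gates, whose action on a unary-encoded input vector is classically a product of Givens rotations. Below is a minimal sketch of that idea, assuming unary data loading; the pyramid-style gate wiring and the parameter values are illustrative, not the paper's exact circuit.

```python
import numpy as np

def rbs_matrix(n, i, j, theta):
    # Givens rotation on coordinates (i, j): the classical effect of an
    # RBS gate restricted to the unary-encoding subspace (assumption).
    g = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    g[i, i], g[j, j] = c, c
    g[i, j], g[j, i] = s, -s
    return g

def orthogonal_layer(thetas, pairs, n):
    # Compose the rotations into a single trainable orthogonal weight matrix.
    w = np.eye(n)
    for theta, (i, j) in zip(thetas, pairs):
        w = rbs_matrix(n, i, j, theta) @ w
    return w

# Hypothetical pyramid-style wiring on 4 "qubits" with random angles.
pairs = [(0, 1), (2, 3), (1, 2), (0, 1), (2, 3)]
rng = np.random.default_rng(0)
thetas = rng.uniform(0, 2 * np.pi, size=len(pairs))
w = orthogonal_layer(thetas, pairs, 4)

# The layer is orthogonal by construction: w @ w.T == I.
print(np.allclose(w @ w.T, np.eye(4)))
```

Because each factor is a rotation, the composed weight matrix stays exactly orthogonal for any angles, which is what lets a shallow circuit of such gates implement an orthogonal layer with only a linear-in-depth number of parameters.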


