Trainability of Dissipative Perceptron-Based Quantum Neural Networks

05/26/2020
by Kunal Sharma et al.

Several architectures have been proposed for quantum neural networks (QNNs), with the goal of efficiently performing machine learning tasks on quantum data. Rigorous scaling results are urgently needed for specific QNN constructions to understand which, if any, will be trainable at a large scale. Here, we analyze the gradient scaling (and hence the trainability) for a recently proposed architecture that we call dissipative QNNs (DQNNs), where the input qubits of each layer are discarded at the layer's output. We find that DQNNs can exhibit barren plateaus, i.e., gradients that vanish exponentially in the number of qubits. Moreover, we provide quantitative bounds on the scaling of the gradient for DQNNs under different conditions, such as different cost functions and circuit depths, and show that trainability is not always guaranteed.
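As a rough illustration of the barren-plateau phenomenon mentioned in the abstract, the following numpy sketch estimates the variance of a single cost gradient for randomly initialized, hardware-efficient-style parameterized circuits of increasing width. The ansatz (alternating Ry and CZ layers), the global Z⊗...⊗Z cost, the layer count, and the parameter-shift gradient are illustrative assumptions for a generic parameterized circuit, not the paper's DQNN construction.

import numpy as np

rng = np.random.default_rng(0)

Z = np.diag([1.0, -1.0]).astype(complex)

def kron_all(ops):
    # Tensor product of a list of single-qubit operators.
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def cz_layer(n):
    # Diagonal of CZ gates acting on every adjacent qubit pair (0,1), (1,2), ...
    b = np.arange(2 ** n)
    diag = np.ones(2 ** n, dtype=complex)
    for q in range(n - 1):
        both = ((b >> (n - 1 - q)) & 1) & ((b >> (n - 2 - q)) & 1)
        diag *= np.where(both, -1.0, 1.0)
    return diag

def circuit_state(thetas, n, layers, cz):
    # |0...0> evolved through alternating Ry-rotation and CZ-entangling layers.
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    k = 0
    for _ in range(layers):
        U = kron_all([ry(thetas[k + q]) for q in range(n)])
        k += n
        psi = cz * (U @ psi)
    return psi

def cost(thetas, n, layers, cz, zz_diag):
    # Global cost <psi| Z (x) Z (x) ... (x) Z |psi>.
    psi = circuit_state(thetas, n, layers, cz)
    return float(np.real(np.vdot(psi, zz_diag * psi)))

def grad_first_param(thetas, n, layers, cz, zz_diag):
    # Parameter-shift rule for the first rotation angle.
    plus, minus = thetas.copy(), thetas.copy()
    plus[0] += np.pi / 2
    minus[0] -= np.pi / 2
    return 0.5 * (cost(plus, n, layers, cz, zz_diag) - cost(minus, n, layers, cz, zz_diag))

layers, samples = 20, 200
for n in range(2, 9, 2):
    cz = cz_layer(n)
    zz_diag = kron_all([Z] * n).diagonal()
    grads = [
        grad_first_param(rng.uniform(0, 2 * np.pi, layers * n), n, layers, cz, zz_diag)
        for _ in range(samples)
    ]
    print(f"n = {n}: Var[dC/dtheta_1] ~ {np.var(grads):.3e}")

In this toy setting the printed variances should shrink by roughly a constant factor as n grows, consistent with the exponential gradient suppression that the abstract describes for deep, randomly initialized circuits; the exact rate depends on the ansatz and cost, which are assumptions here.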


Related research

- Absence of Barren Plateaus in Quantum Convolutional Neural Networks (11/05/2020)
- Subtleties in the trainability of quantum machine learning models (10/27/2021)
- Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks with Quantum Computation (01/27/2023)
- Learning To Optimize Quantum Neural Network Without Gradients (04/15/2023)
- QTN-VQC: An End-to-End Learning framework for Quantum Neural Networks (10/06/2021)
- Power and limitations of single-qubit native quantum neural networks (05/16/2022)
- Scaling ResNets in the Large-depth Regime (06/14/2022)
