Avoiding Barren Plateaus with Classical Deep Neural Networks

05/26/2022
by   Lucas Friedrich, et al.

Variational quantum algorithms (VQAs) are among the most promising algorithms of the Noisy Intermediate-Scale Quantum (NISQ) era. VQAs are applied to a variety of tasks, such as chemistry simulations, optimization problems, and quantum neural networks. Such algorithms are constructed from a parameterization U(θ) together with a classical optimizer that updates the parameters θ in order to minimize a cost function C. For this task, the gradient descent method, or one of its variants, is generally used: the circuit parameters are updated iteratively using the gradient of the cost function. However, several works in the literature have shown that this method suffers from a phenomenon known as Barren Plateaus (BP). This phenomenon is characterized by an exponential flattening of the cost function landscape, so that the number of times the function must be evaluated to perform the optimization grows exponentially with the number of qubits and the parameterization depth. In this article, we report how using a classical neural network to generate the VQA input parameters can alleviate the BP phenomenon.
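The optimization loop described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation: a toy single-qubit "circuit" built from RX and RY rotations, a cost C(θ) = 1 − |⟨0|U(θ)|0⟩|², and a deliberately tiny classical model (a single linear layer standing in for the deep network of the paper) that produces the circuit parameters θ. Gradient descent then acts on the classical weights W rather than on θ directly; all names and sizes here are illustrative assumptions.

```python
import numpy as np

# Toy single-qubit "circuit": U(theta) = RY(theta[1]) @ RX(theta[0]).
def ry(a):
    return np.array([[np.cos(a / 2), -np.sin(a / 2)],
                     [np.sin(a / 2),  np.cos(a / 2)]], dtype=complex)

def rx(a):
    return np.array([[np.cos(a / 2), -1j * np.sin(a / 2)],
                     [-1j * np.sin(a / 2), np.cos(a / 2)]], dtype=complex)

def cost(theta):
    # C = 1 - |<0|U(theta)|0>|^2 : minimized when U maps |0> back to |0>.
    state = ry(theta[1]) @ rx(theta[0]) @ np.array([1, 0], dtype=complex)
    return 1 - abs(state[0]) ** 2

def finite_diff_grad(f, p, eps=1e-6):
    # Central finite differences, standing in for the parameter-shift rule.
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p)
        d[i] = eps
        g[i] = (f(p + d) - f(p - d)) / (2 * eps)
    return g

# Classical model: theta = W @ x, with a fixed classical input x.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))
x = np.array([0.5, -0.3])

# Gradient descent on the *classical weights* W, not on theta directly.
lr = 0.5
for _ in range(200):
    grad_W = finite_diff_grad(lambda w: cost(w.reshape(2, 2) @ x), W.ravel())
    W = W - lr * grad_W.reshape(2, 2)

print(cost(W @ x))  # cost after training
```

On this two-parameter landscape there is no barren plateau, so plain gradient descent already converges; the sketch only shows the wiring the abstract describes, i.e. the classical network sitting between the trainable weights and the quantum parameterization.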


