Analyzing Convergence in Quantum Neural Networks: Deviations from Neural Tangent Kernels

03/26/2023
by Xuchen You, et al.

A quantum neural network (QNN) is a parameterized mapping that can be efficiently implemented on near-term Noisy Intermediate-Scale Quantum (NISQ) computers and, combined with classical gradient-based optimizers, used for supervised learning. Despite existing empirical and theoretical investigations, the convergence of QNN training is not fully understood. Inspired by the success of neural tangent kernels (NTKs) in probing the dynamics of classical neural networks, a recent line of work proposes to study over-parameterized QNNs by examining a quantum version of tangent kernels. In this work, we study the dynamics of QNNs and show that, contrary to popular belief, it is qualitatively different from that of any kernel regression: due to the unitarity of quantum operations, there is a non-negligible deviation from the tangent kernel regression derived at random initialization. As a result of this deviation, we prove at most sublinear convergence for QNNs with Pauli measurements, which is beyond the explanatory power of any kernel regression dynamics. We then present the actual dynamics of QNNs in the limit of over-parameterization. The new dynamics capture the change of the convergence rate during training and imply that the range of measurements is crucial to fast QNN convergence.
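To make the objects in the abstract concrete, here is a minimal numerical sketch, not the paper's construction: a single-qubit toy QNN whose output is the expectation of a Pauli-Z measurement, trained by gradient descent on a squared loss, with the empirical tangent kernel K_ij = <grad f(x_i), grad f(x_j)> evaluated before and after training to illustrate how the kernel can drift away from its value at random initialization. The ansatz, data encoding, and function names (qnn_output, tangent_kernel) are illustrative assumptions.

```python
# Toy sketch (assumptions throughout): a single-qubit QNN f_theta(x) = <psi(x, theta)| Z |psi(x, theta)>
# with an RY(x) data-encoding rotation followed by alternating trainable RY/RZ layers.
import numpy as np

I2 = np.eye(2, dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ry(a): return np.cos(a / 2) * I2 - 1j * np.sin(a / 2) * Y
def rz(a): return np.cos(a / 2) * I2 - 1j * np.sin(a / 2) * Z

def qnn_output(x, theta):
    """f_theta(x): expectation of the Pauli-Z measurement after the parameterized circuit."""
    psi = ry(x) @ np.array([1, 0], dtype=complex)        # data-encoding rotation on |0>
    for k in range(0, len(theta), 2):                    # alternating trainable RY/RZ layers
        psi = rz(theta[k + 1]) @ (ry(theta[k]) @ psi)
    return float(np.real(psi.conj() @ (Z @ psi)))        # bounded in [-1, 1] by the measurement

def grad(x, theta):
    """Parameter-shift gradient: df/dtheta_j = (f(theta_j + pi/2) - f(theta_j - pi/2)) / 2."""
    g = np.zeros_like(theta)
    for j in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[j] += np.pi / 2
        tm[j] -= np.pi / 2
        g[j] = 0.5 * (qnn_output(x, tp) - qnn_output(x, tm))
    return g

def tangent_kernel(xs, theta):
    """Empirical tangent kernel K_ij = <grad f(x_i), grad f(x_j)> at the given parameters."""
    G = np.stack([grad(x, theta) for x in xs])
    return G @ G.T

rng = np.random.default_rng(0)
xs = rng.uniform(0, np.pi, size=4)
ys = 0.8 * np.sin(xs)                        # targets inside the measurement range [-1, 1]
theta = rng.uniform(0, 2 * np.pi, size=8)    # toy "over-parameterized" ansatz

K0 = tangent_kernel(xs, theta)               # kernel at random initialization
for step in range(200):                      # gradient descent on the squared loss
    residuals = np.array([qnn_output(x, theta) - y for x, y in zip(xs, ys)])
    theta -= 0.1 * sum(r * grad(x, theta) for r, x in zip(residuals, xs))
K_final = tangent_kernel(xs, theta)

print("kernel shift (Frobenius norm):", np.linalg.norm(K_final - K0))
```

In this toy setting the Pauli measurement bounds the output to [-1, 1], so the gradients, and with them the empirical kernel, change as training proceeds; the printed kernel shift is only meant to illustrate the kind of deviation the abstract refers to, not to reproduce the paper's results.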


