Quark: A Gradient-Free Quantum Learning Framework for Classification Tasks

10/02/2022
by Zhihao Zhang, et al.

As more practical and scalable quantum computers emerge, much attention has been focused on realizing quantum supremacy in machine learning. Existing quantum ML methods either (1) embed a classical model into a target Hamiltonian to enable quantum optimization or (2) represent a quantum model using variational quantum circuits and apply classical gradient-based optimization. The former leverages the power of quantum optimization but supports only simple ML models, while the latter provides flexibility in model design but relies on gradient calculation, resulting in barren plateaus (i.e., vanishing gradients) and frequent classical-quantum interactions. To address the limitations of both, we introduce Quark, a gradient-free quantum learning framework that optimizes quantum ML models using quantum optimization. Quark does not rely on gradient computation and therefore avoids barren plateaus and frequent classical-quantum interactions. In addition, Quark supports more general ML models than prior quantum ML methods and achieves an optimization complexity that is independent of dataset size. Theoretically, we prove that Quark can outperform classical gradient-based methods by reducing model query complexity for highly non-convex problems; empirically, evaluations on the Edge Detection and Tiny-MNIST tasks show that Quark can support complex ML models and significantly reduces the number of measurements needed to discover near-optimal weights for these tasks.
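
To make the contrast between the two existing paradigms concrete, here is a minimal, self-contained NumPy sketch (not the Quark algorithm, whose quantum-optimization procedure is not reproduced here): it trains a toy one-qubit variational classifier once with parameter-shift gradients and once with a simple gradient-free search, counting circuit evaluations ("model queries") in each case. The circuit, loss, dataset, and search strategy are all illustrative assumptions.

```python
# Illustrative sketch only (not the Quark algorithm): contrasts gradient-based
# training of a toy one-qubit variational classifier with a gradient-free search,
# counting circuit evaluations ("model queries") for each approach.
import numpy as np

rng = np.random.default_rng(0)

def circuit_output(theta, x):
    """Probability of measuring |1> after encoding x and applying RY(theta) to |0>."""
    angle = x + theta
    state = np.array([np.cos(angle / 2), np.sin(angle / 2)])  # RY(angle)|0>
    return float(state[1] ** 2)

def loss(theta, xs, ys):
    preds = np.array([circuit_output(theta, x) for x in xs])
    return float(np.mean((preds - ys) ** 2))

# Toy binary task: small encoding angles -> class 0, angles near pi -> class 1.
xs = np.array([0.1, 0.2, 2.9, 3.0])
ys = np.array([0.0, 0.0, 1.0, 1.0])

# (a) Gradient-based training (the variational-circuit style Quark avoids):
# the parameter-shift rule costs two extra circuit evaluations per parameter
# per sample per step, and gradients can vanish on deeper circuits.
theta, lr, queries = 2.0, 0.4, 0
for _ in range(60):
    grad = 0.0
    for x, y in zip(xs, ys):
        p = circuit_output(theta, x)
        # Parameter-shift rule (valid here because p is affine in <Z>).
        dp = (circuit_output(theta + np.pi / 2, x) - circuit_output(theta - np.pi / 2, x)) / 2
        grad += 2 * (p - y) * dp
        queries += 3
    theta -= lr * grad / len(xs)
print(f"gradient-based: loss={loss(theta, xs, ys):.4f}, circuit queries={queries}")

# (b) Gradient-free search: forward evaluations only, no gradient circuits.
best_theta, best_loss = 2.0, loss(2.0, xs, ys)
queries = len(xs)
for _ in range(60):
    cand = best_theta + rng.normal(scale=0.3)
    cand_loss = loss(cand, xs, ys)
    queries += len(xs)
    if cand_loss < best_loss:
        best_theta, best_loss = cand, cand_loss
print(f"gradient-free:  loss={best_loss:.4f}, circuit queries={queries}")
```

In this toy setting the gradient-free run needs only forward evaluations per step, whereas the parameter-shift run pays extra shifted evaluations per parameter; Quark's claimed advantage instead comes from replacing classical gradient descent with quantum optimization, which this classical sketch does not model.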
