Exponential Error Convergence in Data Classification with Optimized Random Features: Acceleration by Quantum Machine Learning

06/16/2021
by Hayata Yamasaki, et al.

Random features are a central technique for scalable learning algorithms based on kernel methods. A recent work has shown that a quantum machine learning (QML) algorithm can exponentially speed up the sampling of optimized random features, without imposing the restrictive sparsity and low-rankness assumptions on matrices that limited the applicability of conventional QML algorithms; this speedup makes it possible to significantly reduce, and provably minimize, the number of features required for regression tasks. However, a major question in QML is how widely the advantages of quantum computation can be exploited beyond regression. Here we construct a QML algorithm for a classification task accelerated by optimized random features. We prove that the QML algorithm for sampling optimized random features, combined with stochastic gradient descent (SGD), achieves the state-of-the-art exponential convergence rate of the classification error under a low-noise condition; at the same time, the significant reduction in the required number of features accelerates each SGD iteration and the evaluation of the resulting classifier. These results reveal a promising application of QML: significantly accelerating a leading kernel-based classification algorithm without sacrificing its applicability to a practical class of data sets or its exponential error-convergence speed.
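For readers unfamiliar with the pipeline, the following is a minimal classical sketch of the random-features-plus-SGD scheme whose quantum acceleration the paper analyzes. The step that the QML algorithm speeds up exponentially, sampling features from a data-optimized distribution, is replaced here by plain Gaussian sampling of random Fourier features; the function names and parameters (random_fourier_features, sgd_logistic, gamma, n_features) are illustrative, not from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features, gamma, rng):
    # Approximate the RBF kernel k(x, x') = exp(-gamma * ||x - x'||^2)
    # with n_features random cosine features. Here the frequencies W are
    # drawn classically and i.i.d. from N(0, 2*gamma*I); the paper's QML
    # algorithm instead samples them from an optimized distribution, which
    # is what provably minimizes the number of features needed.
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def sgd_logistic(Phi, y, lr=0.2, epochs=20, rng=None):
    # Plain SGD on the logistic loss over the feature map Phi (labels +-1).
    # Each update costs O(M) for M features, so a smaller optimized feature
    # count directly speeds up training and classifier evaluation.
    n, M = Phi.shape
    w = np.zeros(M)
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (Phi[i] @ w)
            w += lr * y[i] * Phi[i] / (1.0 + np.exp(margin))  # -grad of log-loss
    return w

# Toy usage: two well-separated Gaussian blobs, i.e. a "low-noise" setting.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.5, 0.5, (100, 2)), rng.normal(1.5, 0.5, (100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])
Phi = random_fourier_features(X, n_features=100, gamma=0.5, rng=rng)
w = sgd_logistic(Phi, y, rng=rng)
print("training accuracy:", np.mean(np.sign(Phi @ w) == y))
```

Under the paper's low-noise condition, the expected classification error of such SGD iterates decays exponentially in the number of iterations; the quantum contribution is confined to the feature-sampling step, which leaves the rest of this classical pipeline intact while shrinking the per-iteration cost.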


