q-means: A quantum algorithm for unsupervised machine learning

12/10/2018
by Iordanis Kerenidis, et al.

Quantum machine learning is one of the most promising applications of a full-scale quantum computer. Over the past few years, many quantum machine learning algorithms have been proposed that can potentially offer considerable speedups over the corresponding classical algorithms. In this paper, we introduce q-means, a new quantum algorithm for clustering, a canonical problem in unsupervised machine learning. The q-means algorithm has convergence and precision guarantees similar to k-means, and, like the classical algorithm, it outputs with high probability a good approximation of the k cluster centroids. Given a dataset of N d-dimensional vectors v_i (seen as a matrix V ∈ R^{N × d}) stored in QRAM, the running time of q-means is Õ( k d (η/δ^2) κ(V) (μ(V) + k η/δ) + k^2 (η^1.5/δ^2) κ(V) μ(V) ) per iteration, where κ(V) is the condition number, μ(V) is a parameter that appears in quantum linear algebra procedures, and η = max_i ||v_i||^2. For a natural notion of well-clusterable datasets, the running time becomes Õ( k^2 d (η^2.5/δ^3) + k^2.5 (η^2/δ^3) ) per iteration, which is linear in the number of features d, and polynomial in the rank k, the maximum square norm η, and the error parameter δ. Both running times are only polylogarithmic in the number of datapoints N. Our algorithm provides substantial savings compared to the classical k-means algorithm, which runs in time O(kdN) per iteration, particularly for the case of large datasets.
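For reference, here is a minimal NumPy sketch of the classical k-means (Lloyd's) iteration that the abstract uses as its baseline; the function name `kmeans_iteration` and the dense distance computation are illustrative choices, not taken from the paper. Each such classical iteration costs O(kdN), whereas q-means performs the analogous assignment and centroid-update steps with quantum procedures whose per-iteration cost depends only polylogarithmically on N.

```python
import numpy as np

def kmeans_iteration(V, centroids):
    """One Lloyd's iteration of classical k-means (illustrative sketch).

    V:         (N, d) data matrix, one vector v_i per row.
    centroids: (k, d) current cluster centroids.
    Returns the updated (k, d) centroids.

    Assigning every point requires k distance computations in dimension d,
    i.e. O(k d N) work per iteration -- the classical cost the abstract
    compares q-means against.
    """
    # Squared Euclidean distance of every point to every centroid: shape (N, k).
    dists = ((V[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    # Assign each point to its nearest centroid.
    labels = dists.argmin(axis=1)
    # Recompute each centroid as the mean of its assigned points;
    # keep the old centroid if a cluster ends up empty.
    new_centroids = centroids.copy()
    for j in range(centroids.shape[0]):
        members = V[labels == j]
        if len(members) > 0:
            new_centroids[j] = members.mean(axis=0)
    return new_centroids
```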
