Optimizing quantum optimization algorithms via faster quantum gradient computation

11/01/2017
by András Gilyén et al.

We consider a generic framework of optimization algorithms based on gradient descent. We develop a quantum algorithm that computes the gradient of a multivariate real-valued function f:R^d→R by evaluating it at only a logarithmic number of points in superposition. Our algorithm is an improved version of Jordan's gradient calculation algorithm, providing an approximation of the gradient ∇f with quadratically better dependence on the evaluation accuracy of f, for an important class of smooth functions. Furthermore, we show that most objective functions arising from quantum optimization procedures satisfy the necessary smoothness conditions; hence our algorithm provides a quadratic improvement in the complexity of computing their gradient. We also show that in a continuous phase-query model, our gradient computation algorithm has optimal query complexity up to poly-logarithmic factors, for a particular class of smooth functions. Moreover, we show that for low-degree multivariate polynomials our algorithm can provide exponential speedups compared to Jordan's algorithm in terms of the dimension d. One of the technical challenges in applying our gradient computation procedure to quantum optimization problems is the need to convert between a probability oracle (which is common in quantum optimization procedures) and a phase oracle (which is common in quantum algorithms) of the objective function f. We provide efficient subroutines to perform this delicate interconversion between the two types of oracles, incurring only a logarithmic overhead, which might be of independent interest. Finally, using these tools we improve the runtime of prior approaches for training quantum auto-encoders, variational quantum eigensolvers, and the quantum approximate optimization algorithm (QAOA).
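The core idea behind Jordan-style gradient estimation — write the function value into a phase over a superposition of grid points, then read the slope off with an inverse quantum Fourier transform — can be illustrated with a small classical simulation. The sketch below is a minimal one-dimensional NumPy rendering of that idea, not the paper's improved algorithm: the function name `jordan_gradient_1d`, the grid parametrization, and the assumption |f'(x0)| < 1/2 are choices made here for illustration.

```python
import numpy as np

def jordan_gradient_1d(f, x0=0.0, n_qubits=8, region=1.0):
    """Estimate f'(x0) by classically simulating Jordan-style gradient estimation.

    A uniform superposition over N = 2**n_qubits grid points spanning
    `region` around x0 picks up the phase exp(2*pi*i * (N/region) * f(x)).
    For f close to linear on the grid, the inverse QFT (here a plain DFT)
    concentrates the amplitude on the index k ~ N * f'(x0), so a single
    measurement reads off the slope to precision ~1/N.
    Assumes |f'(x0)| < 1/2; rescale f if the slope is larger.
    """
    N = 2 ** n_qubits
    j = np.arange(N)
    x = x0 + (j / N - 0.5) * region               # grid points around x0
    psi = np.exp(2j * np.pi * (N / region) * f(x)) / np.sqrt(N)
    amps = np.fft.fft(psi) / np.sqrt(N)           # inverse QFT in our sign convention
    k = int(np.argmax(np.abs(amps) ** 2))         # most probable measurement outcome
    if k > N // 2:                                # map to signed representative
        k -= N
    return k / N                                  # gradient estimate


# Example: a slope of 0.3 is recovered to ~1/256 accuracy with 8 "qubits".
f = lambda t: 0.3 * t - 1.2
print(jordan_gradient_1d(f))                      # ≈ 0.30078125
```

On an actual quantum computer the d-dimensional version runs one such register per coordinate in superposition, using only a logarithmic number of evaluations of f via its phase oracle; the paper's contributions include quadratically reducing the accuracy to which f must be evaluated for smooth functions, and the probability-to-phase oracle conversion needed to apply this to quantum optimization.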
