Accelerated Quasi-Newton Proximal Extragradient: Faster Rate for Smooth Convex Optimization

06/03/2023
by Ruichen Jiang, et al.

In this paper, we propose an accelerated quasi-Newton proximal extragradient (A-QPNE) method for solving unconstrained smooth convex optimization problems. With access only to the gradients of the objective, we prove that our method can achieve a convergence rate of O(min{1/k^2, √(d log k)/k^2.5}), where d is the problem dimension and k is the number of iterations. In particular, in the regime where k = O(d), our method matches the optimal rate of O(1/k^2) achieved by Nesterov's accelerated gradient (NAG). Moreover, in the regime where k = Ω(d log d), it outperforms NAG and converges at the faster rate of O(√(d log k)/k^2.5). To the best of our knowledge, this result is the first to demonstrate a provable gain of a quasi-Newton-type method over NAG in the convex setting. To achieve these results, we build our method on a recent variant of the Monteiro-Svaiter acceleration framework and adopt an online learning perspective to update the Hessian approximation matrices: we relate the convergence rate of our method to the dynamic regret of a specific online convex optimization problem in the space of matrices.
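As an illustrative aside (not taken from the paper), the crossover between the two terms of the bound can be checked numerically. The sketch below sets all hidden constants to 1 and uses a hypothetical dimension d = 100; it simply evaluates 1/k^2 and √(d log k)/k^2.5 to show that the second term becomes the smaller one once k exceeds roughly d log k, consistent with the k = Ω(d log d) regime described above.

```python
# Illustrative comparison of the two terms in O(min{1/k^2, sqrt(d log k)/k^2.5}).
# All constant factors are set to 1 and d = 100 is a hypothetical dimension;
# this is a sketch of the regime claim, not the paper's algorithm.
import math

def nag_term(k):
    # Rate term matched by Nesterov's accelerated gradient.
    return 1.0 / k**2

def qn_term(k, d):
    # Quasi-Newton rate term from the bound.
    return math.sqrt(d * math.log(k)) / k**2.5

d = 100
for k in [10, 100, 1000, 10000]:
    smaller = "quasi-Newton" if qn_term(k, d) < nag_term(k) else "NAG"
    print(f"k={k:>6}: 1/k^2 = {nag_term(k):.2e}, "
          f"sqrt(d log k)/k^2.5 = {qn_term(k, d):.2e} -> {smaller} term smaller")

# The quasi-Newton term wins exactly when d * log(k) < k,
# i.e. roughly when k = Omega(d log d).
```

Under these toy constants, the quasi-Newton term overtakes 1/k^2 between k = 100 and k = 1000 for d = 100, illustrating why the improvement over NAG appears only after the iteration count outgrows the problem dimension.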
