Interior Point Methods with a Gradient Oracle

04/10/2023
by Adrian Vladu, et al.

We provide an interior point method based on quasi-Newton iterations, which only requires first-order access to a strongly self-concordant barrier function. To achieve this, we extend the techniques of Dunagan-Harvey [STOC '07] to maintain a preconditioner using only first-order information. We measure the quality of this preconditioner in terms of its relative eccentricity with respect to the unknown Hessian matrix, and we generalize these techniques to convex functions with a slowly-changing Hessian. Combining this with an interior point method, we show that, given first-order access to an appropriate barrier function for a convex set K, we can solve well-conditioned linear optimization problems over K to ε precision in time O((𝒯+n^2)√(nν)log(1/ε)), where ν is the self-concordance parameter of the barrier function and 𝒯 is the time required to make a gradient query. As a consequence we show that:

∙ Linear optimization over n-dimensional convex sets can be solved in time O((𝒯n+n^3)log(1/ε)). This parallels the running time achieved by state-of-the-art algorithms for cutting plane methods, when replacing separation oracles with first-order oracles for an appropriate barrier function.

∙ We can solve semidefinite programs involving m ≥ n matrices in ℝ^{n×n} in time O(mn^4 + m^{1.25}n^{3.5}log(1/ε)), improving over state-of-the-art algorithms in the case where m = Ω(n^{3.5/(ω−1.25)}).

Along the way we develop a host of tools allowing us to control the evolution of our potential functions, using techniques from matrix analysis and Schur convexity.
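To illustrate the core idea of maintaining curvature information from gradient queries alone, the sketch below runs quasi-Newton (BFGS-style) iterations on a logarithmic barrier for the box (0,1)^n: the inverse-Hessian estimate is updated purely from successive gradients, and the true Hessian is never formed. This is a minimal illustration of first-order preconditioner maintenance in general, not the paper's algorithm; the barrier choice, step rule, and update formula here are simplifying assumptions.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Update an approximate inverse Hessian H using only first-order data:
    the step s = x_new - x_old and gradient difference y = g_new - g_old."""
    sy = float(s @ y)
    if sy <= 1e-12:  # skip the update if the curvature condition fails
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def minimize_barrier(c, t, x0, iters=200, step=0.5):
    """Minimize t*c^T x - sum(log x_i) - sum(log(1 - x_i)) over (0,1)^n
    with quasi-Newton steps; only gradient queries are used."""
    grad = lambda x: t * c - 1.0 / x + 1.0 / (1.0 - x)
    x = x0.copy()
    H = np.eye(len(x))  # initial inverse-Hessian (preconditioner) estimate
    g = grad(x)
    for _ in range(iters):
        d = -H @ g  # quasi-Newton direction: preconditioned gradient
        a = step
        # backtrack just enough to remain strictly inside the box
        while np.any(x + a * d <= 0.0) or np.any(x + a * d >= 1.0):
            a *= 0.5
        x_new = x + a * d
        g_new = grad(x_new)
        H = bfgs_update(H, x_new - x, g_new - g)  # refine preconditioner
        x, g = x_new, g_new
    return x
```

In a full interior point method one would additionally increase the path parameter t between centering phases; here a single fixed t is solved to stationarity to keep the sketch short.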
