The Average Rate of Convergence of the Exact Line Search Gradient Descent Method

05/16/2023
by Thomas Yu, et al.

It is well known that when the exact line search gradient descent method is applied to a convex quadratic objective, the worst-case rate of convergence (ROC), taken over all seed vectors, deteriorates as the condition number of the Hessian of the objective grows. Based on an elegant analysis by H. Akaike, it is generally believed – but not proved – that in the ill-conditioned regime the ROC for almost all initial vectors, and hence also the average ROC, is close to the worst-case ROC. We complete Akaike's analysis using the theorem of center and stable manifolds. Our analysis also makes apparent the effect of an intermediate eigenvalue in the Hessian by establishing the following somewhat amusing result: in the absence of an intermediate eigenvalue, the average ROC becomes arbitrarily fast – not slow – as the Hessian becomes increasingly ill-conditioned. In passing, we discuss some contemporary applications of exact line search gradient descent to polynomial optimization problems arising from imaging and data sciences.
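For readers unfamiliar with the method, the following minimal Python sketch (illustrative only; the function name exact_line_search_gd and the toy 2x2 Hessian are our own, not taken from the paper) shows exact line search gradient descent on a convex quadratic f(x) = (1/2) x'Ax - b'x, where the exact step size along the negative gradient g has the closed form alpha = (g'g)/(g'Ag); on an ill-conditioned Hessian the iterates exhibit the classical zig-zag behavior underlying the worst-case ROC discussed above.

import numpy as np

def exact_line_search_gd(A, b, x0, tol=1e-10, max_iter=10000):
    # Minimize f(x) = 0.5 * x'Ax - b'x with A symmetric positive definite.
    # Along the negative gradient g, the exactly optimal step size is
    # alpha = (g'g) / (g'Ag).
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))    # exact line search step
        x = x - alpha * g
    return x

# Toy example: an ill-conditioned diagonal Hessian (condition number 100);
# the iterates zig-zag and convergence slows as the conditioning worsens.
A = np.diag([1.0, 100.0])
b = np.zeros(2)
x0 = np.array([1.0, 1.0])
minimizer = exact_line_search_gd(A, b, x0)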

