Global linear convergence of Newton's method without strong-convexity or Lipschitz gradients

06/01/2018
by Sai Praneeth Karimireddy, et al.

We show that Newton's method converges globally at a linear rate for objective functions whose Hessians are stable. This class of problems includes many functions that are not strongly convex, such as logistic regression. Our linear convergence result is (i) affine-invariant, and holds even if (ii) an approximate Hessian is used and (iii) the subproblems are only solved approximately. Thus we theoretically demonstrate the superiority of Newton's method over first-order methods, which would only achieve a sublinear O(1/t^2) rate under similar conditions.
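
As a concrete illustration (not the paper's implementation), the following is a minimal NumPy sketch of plain Newton steps on logistic regression, the abstract's running example of a Hessian-stable objective that is not strongly convex. The function names, the small ridge-style damping term, and the fixed step count are illustrative assumptions; the inexact-Hessian and approximate-subproblem variants covered by the paper's result are not modeled here.

```python
import numpy as np

def logistic_loss_grad_hess(w, X, y):
    """Gradient and Hessian of the average (unregularized) logistic loss.

    X: (n, d) feature matrix, y: labels in {-1, +1}, w: (d,) weights.
    """
    margins = y * (X @ w)
    probs = 1.0 / (1.0 + np.exp(margins))      # sigma(-y_i x_i^T w)
    grad = -X.T @ (y * probs) / len(y)
    S = probs * (1.0 - probs)                  # per-sample Hessian weights
    hess = (X.T * S) @ X / len(y)
    return grad, hess

def newton(w0, X, y, steps=20, damping=1e-8):
    """Plain Newton iteration: w <- w - H^{-1} g.

    `damping` is a purely numerical safeguard for the linear solve, not
    part of the convergence analysis (which needs no strong convexity).
    """
    w = w0.copy()
    d = len(w)
    for _ in range(steps):
        g, H = logistic_loss_grad_hess(w, X, y)
        w -= np.linalg.solve(H + damping * np.eye(d), g)
    return w

# Hypothetical usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sign(X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(200))
w_star = newton(np.zeros(5), X, y)
```

Since the logistic Hessian degenerates as margins grow, this objective is convex but not strongly convex, which is exactly the regime where the paper's stable-Hessian assumption replaces strong convexity.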
