Super-Universal Regularized Newton Method

08/11/2022
by Nikita Doikov, et al.

We analyze the performance of a variant of the Newton method with quadratic regularization for solving composite convex minimization problems. At each step of our method, we choose the regularization parameter proportional to a certain power of the gradient norm at the current point. We introduce a family of problem classes characterized by Hölder continuity of either the second or the third derivative. We then present the method with a simple adaptive search procedure that automatically adjusts to the problem class with the best global complexity bounds, without knowing the specific parameters of the problem. In particular, for the class of functions with Lipschitz continuous third derivative, we get the global O(1/k^3) rate, which was previously attributed to third-order tensor methods. When the objective function is uniformly convex, we justify an automatic acceleration of our scheme, resulting in a faster global rate and local superlinear convergence. The switching between the different rates (sublinear, linear, and superlinear) is automatic: again, no a priori knowledge of the parameters is needed.
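The update behind this scheme is compact: a Newton step damped by a quadratic regularizer whose coefficient scales as H * ||grad f(x_k)||^alpha, with an adaptive search adjusting H by doubling and halving. Below is a minimal Python sketch of that idea under simplifying assumptions: a Euclidean norm, a plain twice-differentiable objective with no composite term, and a sufficient-decrease acceptance test chosen for illustration rather than taken from the paper; the name regularized_newton_sketch and its parameters are hypothetical, not the authors' code.

    import numpy as np

    def regularized_newton_sketch(f, grad, hess, x0, alpha=0.5, H=1.0,
                                  tol=1e-8, max_iter=100):
        """Gradient-regularized Newton iteration (illustrative sketch).

        Step: x+ = x - (hess(x) + lam * I)^{-1} grad(x),
        with lam = H * ||grad(x)||**alpha and H adjusted adaptively.
        """
        x = np.asarray(x0, dtype=float).copy()
        n = x.size
        for _ in range(max_iter):
            g = grad(x)
            gnorm = np.linalg.norm(g)
            if gnorm <= tol:
                break
            while True:
                lam = H * gnorm ** alpha
                step = np.linalg.solve(hess(x) + lam * np.eye(n), g)
                x_new = x - step
                # Hypothetical acceptance test: the actual decrease must be
                # at least a fraction of the regularization term. The paper's
                # exact condition differs; this stands in for it.
                if f(x) - f(x_new) >= 0.25 * lam * float(step @ step):
                    H = max(H / 2.0, 1e-10)  # accepted: try a smaller H next
                    break
                H *= 2.0  # rejected: strengthen regularization and retry
            x = x_new
        return x

    # Smoke test on a strictly convex quartic, minimized at the origin.
    f = lambda x: 0.25 * np.sum(x ** 4) + 0.5 * np.sum(x ** 2)
    grad = lambda x: x ** 3 + x
    hess = lambda x: np.diag(3.0 * x ** 2 + 1.0)
    print(regularized_newton_sketch(f, grad, hess, np.ones(5)))

Because the regularization vanishes as the gradient norm shrinks, the iterates approach pure Newton steps near the solution, which is consistent with the local superlinear convergence described above.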

Related research

08/28/2023
Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method
We study the composite convex optimization problems with a Quasi-Self-Co...

12/03/2021
Regularized Newton Method with Global O(1/k^2) Convergence
We present a Newton-type method that converges fast from any initializat...

03/14/2017
Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods
We study the smooth structure of convex functions by generalizing a powe...

09/05/2023
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians
In this work, we develop first-order (Hessian-free) and zero-order (deri...

05/30/2022
Optimal and Adaptive Monteiro-Svaiter Acceleration
We develop a variant of the Monteiro-Svaiter (MS) acceleration framework...

01/05/2021
On the Local convergence of two-step Newton type Method in Banach Spaces under generalized Lipschitz Conditions
The motive of this paper is to discuss the local convergence of a two-st...

11/08/2017
Learning Sparse Visual Representations with Leaky Capped Norm Regularizers
Sparsity inducing regularization is an important part for learning over-...
