Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method

08/28/2023
by Nikita Doikov, et al.

We study composite convex optimization problems with a Quasi-Self-Concordant smooth component. This problem class naturally interpolates between classic Self-Concordant functions and functions with Lipschitz continuous Hessian. Previously, the best complexity bounds for this problem class were associated with trust-region schemes and implementations of a ball-minimization oracle. In this paper, we show that for minimizing Quasi-Self-Concordant functions we can instead use the basic Newton Method with Gradient Regularization. For unconstrained minimization, it only involves a simple matrix inversion operation (solving a linear system) at each step. We prove a fast global linear rate for this algorithm, matching the complexity bound of the trust-region scheme, while our method remains especially simple to implement. Then, we introduce the Dual Newton Method and, based on it, develop the corresponding Accelerated Newton Scheme for this problem class, which further improves the complexity factor of the basic method. As a direct consequence of our results, we establish fast global linear rates of simple variants of the Newton Method applied to several practical problems, including Logistic Regression, Soft Maximum, and Matrix Scaling, without requiring additional assumptions of strong or uniform convexity on the target objective.
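To make the abstract's central idea concrete, the sketch below illustrates a gradient-regularized Newton step of the form x_{k+1} = x_k - (∇²f(x_k) + λ_k I)^{-1} ∇f(x_k), applied to Logistic Regression, one of the applications named above. This is a minimal sketch, not the paper's algorithm: the rule λ_k = M·‖∇f(x_k)‖ (with M standing in for a quasi-self-concordance-type constant) and all names (gradient_regularized_newton, logistic_oracles, A, b, M) are assumptions made for illustration; the paper specifies the exact regularization rule and constants.

```python
import numpy as np

def gradient_regularized_newton(grad, hess, x0, M=1.0, tol=1e-8, max_iter=100):
    """Sketch of a Newton step with gradient regularization:
    x_{k+1} = x_k - (H_k + lam_k * I)^{-1} g_k, with lam_k tied to ||g_k||.
    The rule lam_k = M * ||g_k|| is an assumption of this sketch."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        lam = M * gnorm                      # assumed regularization rule
        H = hess(x) + lam * np.eye(x.size)   # regularized Hessian
        x = x - np.linalg.solve(H, g)        # one linear system per step
    return x

# Logistic regression oracles (one of the applications named in the abstract).
# A is an (n, d) data matrix, b holds labels in {-1, +1}; names are illustrative.
def logistic_oracles(A, b):
    def grad(x):
        p = 1.0 / (1.0 + np.exp(b * (A @ x)))   # sigmoid(-b_i * a_i^T x)
        return A.T @ (-b * p)
    def hess(x):
        p = 1.0 / (1.0 + np.exp(b * (A @ x)))
        w = p * (1.0 - p)
        return A.T @ (A * w[:, None])
    return grad, hess

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 5))
    b = np.sign(A @ rng.standard_normal(5) + 0.1 * rng.standard_normal(200))
    grad, hess = logistic_oracles(A, b)
    x_star = gradient_regularized_newton(grad, hess, np.zeros(5))
    print("final gradient norm:", np.linalg.norm(grad(x_star)))
```

Each iteration costs a single regularized linear solve, which reflects the simplicity highlighted in the abstract; any global rate guarantee, however, depends on choosing the regularization as prescribed in the paper rather than the placeholder rule used here.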


