Newton Sketch: A Linear-time Optimization Algorithm with Linear-Quadratic Convergence

05/09/2015
by Mert Pilanci, et al.

We propose a randomized second-order method for optimization known as the Newton Sketch: it is based on performing an approximate Newton step using a randomly projected or sub-sampled Hessian. For self-concordant functions, we prove that the algorithm has super-linear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition numbers and related problem-dependent quantities. Given a suitable initialization, similar guarantees also hold for strongly convex and smooth objectives without self-concordance. When implemented using randomized projections based on a sub-sampled Hadamard basis, the algorithm typically has substantially lower complexity than Newton's method. We also describe extensions of our methods to programs involving convex constraints that are equipped with self-concordant barriers. We discuss and illustrate applications to linear programs, quadratic programs with convex constraints, logistic regression and other generalized linear models, as well as semidefinite programs.
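To make the core idea concrete, here is a minimal NumPy sketch of a Newton Sketch iteration for an unconstrained least-squares objective. It uses a plain Gaussian sketch rather than the sub-sampled Hadamard transform the abstract mentions, takes full Newton steps (the paper combines the step with a line search), and the function name and parameters below are illustrative, not from the paper.

```python
import numpy as np

def newton_sketch_lstsq(A, b, m, n_iters=50, seed=0):
    """Illustrative Newton Sketch for min_x 0.5 * ||Ax - b||^2.

    Each iteration keeps the exact gradient but replaces the exact
    Hessian A^T A with the sketched Hessian (S A)^T (S A), where S is
    an m x n Gaussian sketch with m much smaller than the row count n.
    Solving the d x d sketched system costs O(m d^2) instead of the
    O(n d^2) needed to form A^T A exactly.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                      # exact gradient
        S = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian sketch matrix
        SA = S @ A                                    # sketched Hessian square root
        H_sketch = SA.T @ SA                          # (S A)^T (S A) ~ A^T A
        x = x - np.linalg.solve(H_sketch, grad)       # approximate Newton step
    return x
```

Because the sketched Hessian is only a spectral approximation of A^T A, each step contracts the error by a factor governed by the sketch dimension m; with m a modest multiple of d, the iterates converge to the least-squares solution.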


Related research

05/15/2021 · Adaptive Newton Sketch: Linear-time Optimization with Quadratic Convergence and Effective Hessian Dimensionality
We propose a randomized algorithm with quadratic convergence rate for co...

03/21/2019 · OverSketched Newton: Fast Convex Optimization for Serverless Systems
Motivated by recent developments in serverless systems for large-scale m...

10/11/2019 · Fast and Furious Convergence: Stochastic Second Order Methods under Interpolation
We consider stochastic second order methods for minimizing strongly-conv...

07/03/2019 · Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses
In this paper, we study large-scale convex optimization algorithms based...

04/29/2014 · Randomized Sketches of Convex Programs with Sharp Guarantees
Random projection (RP) is a classical technique for reducing storage and...

07/02/2016 · Sub-sampled Newton Methods with Non-uniform Sampling
We consider the problem of finding the minimizer of a convex function F:...

02/17/2020 · A Newton Frank-Wolfe Method for Constrained Self-Concordant Minimization
We demonstrate how to scalably solve a class of constrained self-concord...
