Convergence rates of sub-sampled Newton methods

08/12/2015
by Murat A. Erdogdu, et al.

We consider the problem of minimizing a sum of n functions over a convex parameter set C ⊂ R^p, where n ≫ p ≫ 1. In this regime, algorithms that use sub-sampling techniques are known to be effective. In this paper, we combine sub-sampling with low-rank approximation to design a new randomized batch algorithm whose convergence rate is comparable to that of Newton's method, yet whose per-iteration cost is much smaller. The proposed algorithm is robust to the choice of starting point and step size, and enjoys a composite convergence rate: quadratic convergence at the start and linear convergence once the iterate is close to the minimizer. We develop a theoretical analysis that also allows us to select near-optimal algorithm parameters. Our theoretical results can be used to obtain convergence rates for previously proposed sub-sampling based algorithms as well. We demonstrate how our results apply to well-known machine learning problems. Lastly, we evaluate the performance of our algorithm on several datasets under various scenarios.
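To make the abstract concrete, the following is a minimal sketch of one sub-sampled Newton step with a rank-r Hessian approximation in the spirit described above: the Hessian is estimated from a random mini-batch of the n component functions, its top r+1 eigenpairs are extracted, and the resulting scaling matrix preconditions the full gradient. The function name newsamp_step, the least-squares objective in the usage snippet, and all parameter values are illustrative assumptions rather than the paper's reference implementation.

    import numpy as np

    def newsamp_step(theta, grad_full, hess_sub, r, step_size=1.0):
        """One sub-sampled Newton step with a rank-r Hessian approximation.

        theta     : current iterate, shape (p,)
        grad_full : full gradient at theta, shape (p,)
        hess_sub  : sub-sampled Hessian, a (p, p) average over a random
                    mini-batch of the n component functions (assumed PSD)
        r         : target rank of the low-rank approximation
        """
        p = theta.shape[0]
        # Top r+1 eigenpairs of the sub-sampled Hessian (eigh returns
        # eigenvalues in ascending order, so reverse and keep r+1 of them).
        eigvals, eigvecs = np.linalg.eigh(hess_sub)
        lam = eigvals[::-1][: r + 1]
        U = eigvecs[:, ::-1][:, : r + 1]
        lam_top, U_top = lam[:r], U[:, :r]
        lam_r1 = lam[r]  # (r+1)-th largest eigenvalue

        # Scaling matrix: invert the Hessian exactly on the top-r eigenspace,
        # and scale the orthogonal complement by 1 / lambda_{r+1}.
        Q = ((np.eye(p) - U_top @ U_top.T) / lam_r1
             + U_top @ np.diag(1.0 / lam_top) @ U_top.T)

        return theta - step_size * Q @ grad_full

    # Toy usage on a least-squares problem (illustrative only).
    rng = np.random.default_rng(0)
    n, p = 10_000, 50
    A = rng.normal(size=(n, p))
    b = rng.normal(size=n)
    theta = np.zeros(p)
    for _ in range(10):
        grad = A.T @ (A @ theta - b) / n              # full gradient
        S = rng.choice(n, size=1_000, replace=False)  # sub-sampled indices
        H_S = A[S].T @ A[S] / len(S)                  # sub-sampled Hessian
        theta = newsamp_step(theta, grad, H_S, r=20, step_size=1.0)

The per-iteration cost is dominated by forming the sub-sampled Hessian and its truncated eigendecomposition, which is much cheaper than a full Newton step when the sub-sample size and rank r are small relative to n and p.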


Related research

- Sub-Sampled Newton Methods II: Local Convergence Rates (01/18/2016). Many data-fitting applications require the solution of an optimization p...
- Newton-Stein Method: An optimization method for GLMs via Stein's Lemma (11/28/2015). We consider the problem of efficiently computing the maximum likelihood ...
- Sub-sampled Newton Methods with Non-uniform Sampling (07/02/2016). We consider the problem of finding the minimizer of a convex function F:...
- Convergence Analysis of the Randomized Newton Method with Determinantal Sampling (10/25/2019). We analyze the convergence rate of the Randomized Newton Method (RNM) in...
- Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions (05/28/2021). Generalized self-concordance is a key property present in the objective ...
- Convergence Rate Improvement of Richardson and Newton-Schulz Iterations (08/26/2020). Fast convergent, accurate, computationally efficient, parallelizable, an...
- Adaptive Quasi-Newton and Anderson Acceleration Framework with Explicit Global (Accelerated) Convergence Rates (05/30/2023). Despite the impressive numerical performance of quasi-Newton and Anderso...
