Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches

06/06/2022
by Michał Dereziński, et al.

Stochastic variance reduction has proven effective at accelerating first-order algorithms for convex finite-sum optimization tasks such as empirical risk minimization, and incorporating second-order information has further improved the performance of these first-order methods. However, comparatively little is known about the benefits of using variance reduction to accelerate popular stochastic second-order methods such as Subsampled Newton. To address this, we propose Stochastic Variance-Reduced Newton (SVRN), a finite-sum minimization algorithm that enjoys all the benefits of second-order methods (a simple unit step size, easily parallelizable large-batch operations, and fast local convergence) while taking advantage of variance reduction to achieve improved convergence rates, per data pass, for smooth and strongly convex problems. We show that SVRN can accelerate many stochastic second-order methods (such as Subsampled Newton) as well as iterative least-squares solvers (such as Iterative Hessian Sketch), and that it compares favorably to popular first-order methods with variance reduction.
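To make the combination of ideas concrete, the following is a minimal illustrative sketch of how a variance-reduced subsampled Newton iteration could look for a least-squares objective: an outer loop maintains a full-gradient snapshot (SVRG-style), and each inner step solves a Newton system built from a subsampled Hessian against the variance-reduced large-batch gradient, with a unit step size. The function name, batch sizes, and regularizer are all assumptions for the example; this is not the paper's exact algorithm.

```python
import numpy as np

def svrn_sketch(A, b, w0, outer_iters=5, inner_iters=4,
                grad_batch=2048, hess_batch=512, seed=0):
    """Illustrative variance-reduced subsampled Newton for the
    least-squares objective f(w) = (1/2n) ||A w - b||^2.
    A hypothetical sketch of the idea, not the published SVRN method."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = w0.copy()
    for _ in range(outer_iters):
        # Gradient snapshot at the current anchor point (one full data pass).
        w_snap = w.copy()
        full_grad = A.T @ (A @ w_snap - b) / n
        for _ in range(inner_iters):
            # Large-batch stochastic gradient with an SVRG-style control variate:
            # g(w) - g(w_snap) + full_grad is an unbiased, low-variance estimate.
            idx = rng.choice(n, size=min(grad_batch, n), replace=False)
            Ai, bi = A[idx], b[idx]
            g = Ai.T @ (Ai @ w - bi) / len(idx)
            g_snap = Ai.T @ (Ai @ w_snap - bi) / len(idx)
            vr_grad = g - g_snap + full_grad
            # Subsampled Hessian; for least squares this is (1/m) Ah^T Ah,
            # with a small ridge term for numerical stability (an assumption).
            hidx = rng.choice(n, size=min(hess_batch, n), replace=False)
            Ah = A[hidx]
            H = Ah.T @ Ah / len(hidx) + 1e-8 * np.eye(d)
            # Newton update with unit step size on the variance-reduced gradient.
            w = w - np.linalg.solve(H, vr_grad)
    return w
```

Because the control variate's noise shrinks with the distance to the snapshot, the iterates can converge linearly to the exact empirical minimizer even though both the gradient batch and the Hessian subsample are strict subsets of the data; all the heavy operations (batch matrix products, one linear solve per step) parallelize naturally.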


Related research

- Extra-Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods (11/03/2022)
  This work proposes a universal and adaptive second-order method for mini...
- Variance-Reduced Stochastic Quasi-Newton Methods for Decentralized Learning: Part I (01/19/2022)
  In this work, we investigate stochastic quasi-Newton methods for minimiz...
- On the Acceleration of L-BFGS with Second-Order Information and Stochastic Batches (07/14/2018)
  This paper proposes a framework of L-BFGS based on the (approximate) sec...
- Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations (06/21/2017)
  We present novel minibatch stochastic optimization methods for empirical...
- Approximate Newton-based statistical inference using only stochastic gradients (05/23/2018)
  We present a novel inference framework for convex empirical risk minimiz...
- Modified online Newton step based on element wise multiplication (04/11/2019)
  The second order method as Newton Step is a suitable technique in Online...
- Large Scale Empirical Risk Minimization via Truncated Adaptive Newton Method (05/22/2017)
  We consider large scale empirical risk minimization (ERM) problems, wher...
