A Simple Practical Accelerated Method for Finite Sums

02/08/2016
by Aaron Defazio et al.

We describe a novel optimization method for finite sums (such as empirical risk minimization problems), building on the recently introduced SAGA method. Our method achieves an accelerated convergence rate on strongly convex smooth problems. It has only one parameter (a step size) and is radically simpler than other accelerated methods for finite sums. Additionally, it can be applied when the terms are non-smooth, yielding a method applicable in many areas where operator splitting methods would traditionally be used.
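To make the description concrete, below is a minimal NumPy sketch of a SAGA-style stochastic proximal-point iteration in the spirit of what the abstract describes: a single step-size parameter gamma, a table of stored per-term gradients, and one proximal step on a randomly chosen term per iteration. The least-squares terms, the closed-form prox, and the name point_saga_sketch are illustrative assumptions, not the paper's verbatim algorithm.

import numpy as np

def point_saga_sketch(A, b, gamma, epochs=50, seed=0):
    # Illustrative SAGA-style stochastic proximal-point loop on
    # least-squares terms f_i(x) = 0.5 * (A[i] @ x - b[i])**2.
    # (Assumed problem instance; gamma is the single step-size parameter.)
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    g = np.zeros((n, d))   # table of stored per-term gradients
    g_bar = np.zeros(d)    # running average of the table
    for _ in range(epochs * n):
        j = rng.integers(n)
        # correct the iterate using the stale stored gradient for term j
        z = x + gamma * (g[j] - g_bar)
        # proximal step on term j alone; for a squared-loss term the
        # prox of gamma * f_j has this closed form
        a = A[j]
        x = z - gamma * a * (a @ z - b[j]) / (1.0 + gamma * (a @ a))
        # the prox implicitly evaluates a gradient at the new point;
        # recover it and refresh the table entry and its average
        g_new = (z - x) / gamma
        g_bar += (g_new - g[j]) / n
        g[j] = g_new
    return x

Because each iteration touches only the prox of a single term, the same loop structure extends to non-smooth terms whenever their prox is computable, which is where the abstract's connection to operator splitting methods comes in.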


research · 06/09/2015
Accelerated Stochastic Gradient Descent for Minimizing Finite Sums
We propose an optimization method for minimizing the finite sums of smoo...

research · 06/29/2019
Conjugate Gradients and Accelerated Methods Unified: The Approximate Duality Gap View
This note provides a novel, simple analysis of the method of conjugate g...

research · 12/14/2019
A subspace-accelerated split Bregman method for sparse data recovery with joint l1-type regularizers
We propose a subspace-accelerated Bregman method for the linearly constr...

research · 10/25/2016
On the convergence rate of the three operator splitting scheme
The three operator splitting scheme was recently proposed by [Davis and ...

research · 08/01/2023
Anderson Accelerated PMHSS for Complex-Symmetric Linear Systems
This paper presents the design and development of an Anderson Accelerate...

research · 10/19/2021
Accelerated Graph Learning from Smooth Signals
We consider network topology identification subject to a signal smoothne...

research · 07/10/2014
Finito: A Faster, Permutable Incremental Gradient Method for Big Data Problems
Recent advances in optimization theory have shown that smooth strongly c...
