On Curvature-aided Incremental Aggregated Gradient Methods

05/31/2018
by   Hoi-To Wai, et al.

This paper studies an acceleration technique for incremental aggregated gradient methods that exploits curvature information to solve strongly convex finite-sum optimization problems. Such problems arise in large-scale learning applications relevant to machine learning systems. The proposed methods use a novel curvature-aided gradient tracking technique that produces gradient estimates with the aid of Hessian information. We propose and analyze two curvature-aided methods: the first, called the curvature-aided incremental aggregated gradient (CIAG) method, is developed from the standard gradient method and computes an ϵ-optimal solution using O( κ log( 1 / ϵ ) ) iterations for small ϵ; the second, called the accelerated CIAG (A-CIAG) method, incorporates Nesterov's acceleration into CIAG and requires O( √(κ) log( 1 / ϵ ) ) iterations for small ϵ, where κ is the problem's condition number. Importantly, these asymptotic convergence rates match those of the full gradient and accelerated full gradient methods, respectively, and are independent of the number of component functions. The proposed methods are significantly faster than state-of-the-art methods, especially for large-scale problems with massive amounts of data. The source code is available at https://github.com/hoitowai/ciag/
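To make the curvature-aided gradient tracking idea concrete, here is a minimal sketch of a CIAG-style update on a toy least-squares finite sum. All names, the step size, and the cyclic component schedule are illustrative assumptions, not taken from the paper: each component keeps the gradient and Hessian from the last iterate at which it was evaluated, and stale gradients are Taylor-corrected with the cached Hessian before aggregation.

```python
import numpy as np

# Toy finite sum: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
# This is an illustrative sketch of the curvature-aided idea, not the
# authors' reference implementation.
rng = np.random.default_rng(0)
n, d = 20, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(i, x):
    """Gradient of component i at x."""
    return A[i] * (A[i] @ x - b[i])

def hess_i(i):
    """Hessian of component i (constant for least squares)."""
    return np.outer(A[i], A[i])

x = np.zeros(d)
x_last = np.tile(x, (n, 1))   # last iterate at which each component was seen
g_store = np.array([grad_i(i, x) for i in range(n)])
H_store = np.array([hess_i(i) for i in range(n)])  # cached curvature

gamma = 0.05                  # step size (illustrative, not tuned)
for k in range(2000):
    i = k % n                 # cyclic component selection (an assumption)
    x_last[i] = x.copy()
    g_store[i] = grad_i(i, x)
    # Curvature-aided gradient estimate: Taylor-correct each stale
    # gradient with its cached Hessian, then aggregate over components.
    g_est = np.mean(
        g_store + np.einsum('ijk,ik->ij', H_store, x - x_last), axis=0
    )
    x = x - gamma * g_est

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
err = np.linalg.norm(x - x_star)
```

Because each component here is quadratic, the Hessian correction makes the gradient estimate exact, so the sketch behaves like full gradient descent while only refreshing one component's gradient per iteration; for general strongly convex components the correction is approximate, which is the regime the paper analyzes.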


Related research

10/24/2017 — Curvature-aided Incremental Aggregated Gradient Method
We propose a new algorithm for finite sum optimization which we call the...

03/22/2018 — SUCAG: Stochastic Unbiased Curvature-aided Gradient Method for Distributed Optimization
We propose and analyze a new stochastic gradient method, which we call S...

10/11/2019 — General Proximal Incremental Aggregated Gradient Algorithms: Better and Novel Results under General Scheme
The incremental aggregated gradient algorithm is popular in network opti...

05/26/2016 — FLAG n' FLARE: Fast Linearly-Coupled Adaptive Gradient Methods
We consider first order gradient methods for effectively optimizing a co...

06/07/2018 — Towards Riemannian Accelerated Gradient Methods
We propose a Riemannian version of Nesterov's Accelerated Gradient algor...

05/15/2018 — Local Saddle Point Optimization: A Curvature Exploitation Approach
Gradient-based optimization methods are the most popular choice for find...

04/13/2011 — Hybrid Deterministic-Stochastic Methods for Data Fitting
Many structured data-fitting applications require the solution of an opt...
